# tests/permutils/test_insertion_encodable.py (quintant/Permuta, BSD-3-Clause)
from permuta import Perm
from permuta.permutils import InsertionEncodablePerms


def test_is_insertion_encodable_rightmost():
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((0, 1)),
Perm((0, 1, 2)),
Perm((4, 0, 2, 3, 5, 1)),
Perm((0, 6, 3, 4, 2, 5, 1)),
Perm((6, 0, 3, 7, 5, 1, 2, 4)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm(()),
Perm((0, 1)),
Perm((1, 0)),
Perm((3, 0, 1, 2)),
Perm((2, 1, 3, 0, 4)),
Perm((4, 1, 3, 2, 5, 0)),
Perm((5, 1, 2, 3, 4, 0)),
Perm((2, 0, 4, 1, 5, 6, 3)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((0,)),
Perm((1, 0, 2, 3)),
Perm((0, 4, 1, 2, 3)),
Perm((6, 0, 1, 7, 4, 5, 2, 3)),
Perm((7, 3, 6, 4, 2, 5, 1, 0)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm(()),
Perm((1, 0)),
Perm((2, 1, 0)),
Perm((0, 3, 1, 2)),
Perm((3, 2, 0, 5, 7, 4, 6, 1)),
Perm((4, 1, 5, 7, 0, 3, 2, 6)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm(()),
Perm((1, 0)),
Perm((0, 2, 1)),
Perm((1, 2, 0)),
Perm((0, 1, 4, 2, 3)),
Perm((2, 3, 0, 1, 4)),
Perm((4, 7, 3, 0, 1, 5, 6, 2)),
Perm((4, 7, 3, 2, 5, 6, 0, 1)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((3, 2, 1, 4, 0)),
Perm((4, 2, 3, 0, 1)),
Perm((2, 3, 4, 0, 1, 5)),
Perm((3, 2, 5, 4, 0, 1)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((4, 2, 5, 0, 3, 1)), Perm((6, 5, 3, 4, 1, 0, 2))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((2, 1, 0)), Perm((3, 1, 2, 0))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((6, 1, 3, 0, 2, 4, 5)), Perm((4, 3, 5, 6, 0, 2, 1, 7))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((1, 0, 2)), Perm((0, 2, 3, 1, 4))]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((0, 1, 2)),
Perm((0, 2, 1)),
Perm((1, 0, 2, 3)),
Perm((3, 1, 0, 2)),
Perm((3, 4, 1, 2, 0)),
Perm((1, 5, 2, 3, 0, 4)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((2, 0, 1)),
Perm((0, 2, 3, 1)),
Perm((3, 1, 0, 2)),
Perm((3, 2, 1, 4, 0)),
Perm((4, 0, 2, 3, 1)),
Perm((4, 2, 0, 5, 1, 3)),
Perm((6, 5, 0, 2, 7, 4, 3, 1)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((1, 0, 2)),
Perm((1, 2, 0)),
Perm((2, 1, 0)),
Perm((3, 6, 1, 5, 0, 2, 4)),
Perm((6, 1, 2, 5, 4, 3, 0)),
Perm((5, 4, 6, 0, 3, 2, 7, 1)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((1, 0, 2)), Perm((1, 3, 2, 0))]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((1, 2, 0)),
Perm((2, 3, 0, 1)),
Perm((3, 2, 1, 0)),
Perm((1, 3, 2, 0, 4)),
Perm((2, 1, 6, 0, 5, 4, 3)),
Perm((4, 5, 0, 2, 3, 6, 7, 1)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((2, 1, 0)),
Perm((1, 2, 0, 3)),
Perm((2, 0, 1, 3, 4)),
Perm((4, 0, 1, 2, 3)),
Perm((1, 3, 0, 5, 2, 4)),
Perm((1, 5, 3, 0, 2, 4)),
Perm((4, 3, 5, 0, 2, 1, 6)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((1, 0, 2)),
Perm((2, 3, 1, 0)),
Perm((0, 4, 1, 3, 2)),
Perm((6, 3, 5, 1, 2, 4, 0)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((0, 1, 2)), Perm((4, 2, 5, 3, 1, 0)), Perm((1, 2, 3, 0, 5, 6, 4))]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((1, 2, 0)), Perm((2, 4, 0, 1, 3)), Perm((7, 5, 4, 3, 1, 0, 2, 6))]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((0, 1, 2)),
Perm((1, 2, 0)),
Perm((1, 0, 3, 2)),
Perm((2, 0, 3, 1)),
Perm((5, 3, 2, 4, 0, 1)),
Perm((2, 0, 3, 4, 6, 5, 1)),
Perm((5, 1, 6, 2, 7, 0, 4, 3)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((0, 2, 1)),
Perm((2, 0, 1)),
Perm((2, 0, 3, 1, 4)),
Perm((4, 0, 3, 1, 2)),
Perm((4, 3, 1, 2, 0)),
Perm((4, 2, 5, 1, 0, 3)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((0, 2, 1)),
Perm((1, 2, 0)),
Perm((0, 3, 1, 2)),
Perm((1, 0, 3, 2)),
Perm((2, 4, 1, 0, 3)),
Perm((4, 3, 0, 1, 2)),
Perm((2, 5, 0, 3, 6, 4, 1)),
Perm((5, 7, 1, 6, 3, 4, 0, 2)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((0, 2, 1)),
Perm((2, 0, 1, 3)),
Perm((1, 3, 0, 2, 4)),
Perm((5, 0, 4, 3, 2, 7, 6, 1)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((0, 2, 1)),
Perm((2, 1, 0)),
Perm((3, 0, 2, 1)),
Perm((0, 1, 2, 3, 4)),
Perm((0, 1, 3, 2, 4)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((0, 1, 2)),
Perm((2, 0, 1, 3)),
Perm((2, 1, 3, 0)),
Perm((1, 0, 4, 2, 3)),
Perm((1, 6, 2, 3, 4, 0, 5)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((2, 0, 1)),
Perm((3, 1, 0, 4, 2)),
Perm((1, 6, 4, 2, 0, 3, 7, 5)),
Perm((2, 6, 1, 5, 4, 7, 0, 3)),
Perm((3, 0, 7, 2, 1, 4, 5, 6)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((1, 0, 2))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((2, 0, 1)),
Perm((3, 1, 0, 2)),
Perm((1, 0, 5, 2, 4, 3)),
Perm((4, 3, 0, 5, 6, 1, 2)),
Perm((1, 2, 5, 6, 7, 3, 0, 4)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((3, 4, 2, 1, 0)),
Perm((3, 6, 2, 0, 5, 4, 1)),
Perm((6, 3, 0, 2, 4, 5, 1)),
Perm((5, 7, 0, 4, 2, 1, 3, 6)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((3, 1, 0, 2)),
Perm((4, 0, 2, 3, 1)),
Perm((2, 5, 4, 1, 0, 3)),
Perm((0, 6, 1, 2, 4, 5, 3)),
Perm((6, 2, 4, 1, 5, 7, 3, 0)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((1, 2, 0)),
Perm((1, 2, 0, 3)),
Perm((1, 5, 7, 2, 3, 4, 6, 0)),
Perm((3, 6, 7, 4, 5, 2, 0, 1)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((2, 5, 1, 3, 0, 4))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((4, 0, 3, 1, 2)), Perm((1, 3, 5, 2, 0, 4)), Perm((1, 5, 4, 0, 3, 2))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((1, 2, 0)), Perm((4, 1, 2, 5, 0, 3)), Perm((1, 4, 0, 6, 2, 3, 7, 5))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((1, 2, 0)),
Perm((3, 4, 1, 5, 2, 0)),
Perm((1, 4, 3, 5, 6, 2, 0)),
Perm((3, 2, 4, 1, 6, 5, 0)),
Perm((3, 0, 4, 2, 6, 7, 5, 1)),
Perm((4, 1, 3, 6, 5, 2, 7, 0)),
Perm((4, 2, 5, 0, 6, 1, 3, 7)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((2, 1, 3, 0)),
Perm((2, 1, 3, 0, 4)),
Perm((4, 0, 3, 1, 2)),
Perm((1, 3, 4, 2, 5, 0)),
Perm((2, 0, 6, 4, 1, 3, 5)),
Perm((3, 4, 1, 6, 2, 5, 7, 0)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((1, 3, 2, 0)), Perm((0, 4, 2, 1, 3)), Perm((4, 0, 3, 2, 1))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[
Perm((2, 1, 3, 0)),
Perm((3, 1, 2, 0)),
Perm((1, 2, 0, 4, 3)),
Perm((4, 2, 1, 0, 3)),
Perm((0, 1, 2, 5, 3, 4)),
Perm((3, 2, 5, 4, 0, 1)),
Perm((2, 5, 7, 3, 4, 0, 6, 1)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((3, 5, 0, 1, 4, 2))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_rightmost(
[Perm((0, 1, 2)), Perm((6, 1, 3, 0, 4, 2, 5, 7))]
)


def test_is_insertion_encodable_maximum():
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((0,)), Perm((0, 1)), Perm((2, 0, 1, 3)), Perm((2, 1, 3, 0))]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 1, 2)),
Perm((1, 2, 0)),
Perm((0, 1, 2, 4, 3)),
Perm((2, 0, 4, 1, 3)),
Perm((3, 0, 6, 1, 4, 2, 5)),
Perm((3, 5, 1, 4, 6, 0, 2)),
Perm((6, 1, 2, 3, 0, 5, 7, 4)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((0, 2, 1)), Perm((2, 1, 0)), Perm((3, 1, 2, 0, 4, 5))]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((1, 2, 0)),
Perm((2, 0, 1)),
Perm((2, 3, 1, 0)),
Perm((3, 1, 4, 0, 2)),
Perm((0, 5, 4, 1, 3, 2)),
Perm((3, 2, 5, 4, 0, 1)),
Perm((0, 5, 3, 6, 4, 1, 7, 2)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((1, 2, 0)),
Perm((2, 1, 0)),
Perm((1, 2, 0, 3)),
Perm((2, 3, 1, 0, 4)),
Perm((1, 2, 5, 6, 3, 4, 0)),
Perm((5, 4, 6, 1, 0, 3, 2)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((2, 1, 0, 3)), Perm((3, 1, 0, 2)), Perm((0, 1, 4, 2, 3))]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 1, 2)),
Perm((0, 2, 1)),
Perm((2, 0, 1)),
Perm((2, 1, 0)),
Perm((1, 3, 2, 0)),
Perm((1, 4, 3, 2, 0)),
Perm((1, 2, 4, 0, 3, 6, 5)),
Perm((1, 7, 0, 3, 4, 5, 6, 2)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((2, 0, 1)),
Perm((1, 2, 3, 0)),
Perm((3, 0, 2, 1, 4)),
Perm((4, 1, 6, 3, 0, 5, 2)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((1, 0)), Perm((0, 2, 1)), Perm((1, 2, 0)), Perm((2, 1, 0))]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm(()), Perm((0, 1, 3, 2))]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 2, 1)),
Perm((0, 1, 2, 3)),
Perm((0, 3, 5, 1, 4, 2)),
Perm((3, 5, 2, 6, 0, 1, 4, 7)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((2, 0, 1, 3)),
Perm((3, 1, 2, 0)),
Perm((3, 2, 1, 0)),
Perm((2, 3, 4, 1, 0)),
Perm((4, 2, 0, 3, 5, 1)),
Perm((6, 5, 4, 3, 7, 1, 2, 0)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((0, 1, 2, 3)), Perm((3, 2, 5, 1, 4, 0))]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((0, 1)), Perm((0, 1, 2)), Perm((2, 1, 4, 0, 3)), Perm((3, 2, 0, 1, 4))]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 1, 2)),
Perm((0, 2, 1)),
Perm((0, 2, 1, 4, 3)),
Perm((2, 1, 4, 3, 0)),
Perm((2, 4, 3, 5, 1, 0)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 1, 2)),
Perm((2, 0, 1)),
Perm((0, 4, 2, 1, 3)),
Perm((1, 0, 3, 4, 2)),
Perm((3, 2, 4, 0, 1)),
Perm((1, 4, 5, 3, 0, 2)),
Perm((0, 4, 2, 5, 1, 3, 6)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 2, 1)),
Perm((0, 3, 2, 1)),
Perm((1, 2, 3, 0)),
Perm((1, 2, 3, 4, 0)),
Perm((1, 4, 0, 3, 2)),
Perm((4, 0, 1, 5, 2, 3)),
Perm((3, 6, 1, 2, 0, 5, 4)),
Perm((4, 0, 5, 1, 6, 2, 3)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 2, 1)),
Perm((1, 2, 0, 3)),
Perm((1, 3, 4, 0, 2)),
Perm((1, 2, 5, 4, 3, 0)),
Perm((5, 0, 2, 1, 4, 3)),
Perm((4, 5, 2, 6, 3, 1, 0)),
Perm((5, 0, 1, 4, 6, 3, 2)),
Perm((5, 0, 3, 1, 6, 2, 4)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 2, 1)),
Perm((1, 2, 0)),
Perm((3, 1, 2, 0)),
Perm((3, 2, 4, 0, 1)),
Perm((2, 0, 5, 1, 3, 4)),
Perm((3, 5, 0, 1, 4, 2, 7, 6)),
Perm((4, 5, 2, 0, 3, 6, 1, 7)),
Perm((6, 3, 2, 0, 1, 7, 4, 5)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 3, 2, 1)),
Perm((1, 2, 3, 0)),
Perm((0, 2, 1, 4, 3)),
Perm((6, 1, 0, 5, 3, 4, 2)),
Perm((6, 0, 2, 5, 1, 4, 7, 3)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((3, 0, 5, 2, 4, 1))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((1, 0, 2)),
Perm((1, 3, 2, 0)),
Perm((1, 5, 4, 2, 0, 3)),
Perm((4, 2, 0, 3, 5, 1)),
Perm((4, 3, 2, 0, 5, 1)),
Perm((5, 1, 2, 4, 3, 0, 6)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((2, 0, 1)),
Perm((1, 2, 0, 4, 3)),
Perm((3, 0, 2, 1, 4, 5)),
Perm((5, 3, 2, 0, 1, 6, 4)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((0, 4, 3, 1, 2))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((0, 2, 1)), Perm((0, 4, 2, 3, 1))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((2, 3, 1, 5, 0, 4)), Perm((6, 2, 4, 1, 3, 7, 5, 0))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 4, 2, 1, 3)),
Perm((1, 4, 0, 2, 3)),
Perm((3, 1, 2, 5, 6, 0, 4)),
Perm((0, 4, 6, 5, 7, 1, 2, 3)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((1, 0, 3, 4, 2))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((2, 0, 1, 3))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((1, 3, 2, 4, 0))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 2, 1, 3)),
Perm((0, 2, 3, 1)),
Perm((0, 5, 3, 1, 2, 6, 4)),
Perm((2, 4, 3, 0, 1, 6, 7, 5)),
Perm((2, 7, 1, 4, 5, 6, 0, 3)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((3, 0, 2, 1)),
Perm((3, 1, 2, 0)),
Perm((3, 1, 0, 4, 2)),
Perm((4, 0, 2, 3, 1)),
Perm((4, 0, 3, 1, 2)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((1, 5, 6, 4, 2, 0, 3))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((3, 2, 1, 0, 4)),
Perm((4, 3, 2, 0, 5, 1)),
Perm((5, 4, 2, 3, 0, 1)),
Perm((0, 3, 2, 1, 4, 6, 5)),
Perm((2, 3, 0, 5, 1, 6, 4)),
Perm((3, 6, 2, 4, 1, 0, 5)),
Perm((0, 5, 3, 4, 2, 6, 7, 1)),
Perm((0, 6, 4, 3, 7, 1, 5, 2)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((1, 2, 0)), Perm((4, 2, 5, 0, 1, 3)), Perm((6, 3, 5, 1, 7, 4, 0, 2))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((3, 0, 2, 1)), Perm((4, 2, 1, 3, 0))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((3, 2, 0, 1)), Perm((6, 1, 3, 2, 4, 0, 5))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 2, 3, 1)),
Perm((0, 3, 2, 1)),
Perm((3, 0, 1, 2)),
Perm((4, 3, 2, 0, 5, 1)),
Perm((1, 0, 6, 4, 3, 5, 2)),
Perm((3, 1, 6, 2, 5, 0, 4)),
Perm((5, 2, 6, 7, 0, 3, 1, 4)),
Perm((7, 1, 6, 2, 5, 0, 4, 3)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[Perm((0, 2, 1)), Perm((3, 1, 2, 0))]
)
assert not InsertionEncodablePerms.is_insertion_encodable_maximum(
[
Perm((0, 2, 1)),
Perm((4, 0, 2, 1, 3)),
Perm((2, 5, 4, 1, 3, 0)),
Perm((3, 5, 4, 0, 2, 1)),
Perm((1, 0, 4, 2, 3, 6, 5)),
Perm((6, 3, 2, 0, 5, 1, 4)),
Perm((4, 0, 1, 2, 3, 5, 6, 7)),
]
)


def test_is_insertion_encodable():
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm(()), Perm((0, 1)), Perm((0, 1, 2))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 3, 2, 1)),
Perm((0, 2, 3, 4, 1)),
Perm((3, 0, 1, 2, 4)),
Perm((4, 2, 1, 0, 3, 5)),
Perm((0, 2, 1, 4, 6, 3, 5)),
Perm((6, 2, 1, 0, 5, 4, 3)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm((0, 1)), Perm((1, 0)), Perm((1, 0, 3, 2))]
)
assert InsertionEncodablePerms.is_insertion_encodable([Perm(()), Perm((1, 0))])
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 1, 2)),
Perm((2, 0, 1)),
Perm((0, 2, 4, 3, 1)),
Perm((1, 2, 4, 0, 3)),
Perm((3, 2, 4, 1, 5, 0)),
Perm((2, 0, 3, 6, 4, 1, 5)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm((1, 0)), Perm((0, 1, 2)), Perm((0, 3, 1, 2)), Perm((1, 3, 2, 0))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((2, 1, 0)),
Perm((0, 3, 1, 2)),
Perm((0, 3, 2, 1)),
Perm((3, 2, 1, 0)),
Perm((2, 1, 3, 5, 0, 4)),
Perm((3, 1, 5, 2, 4, 0)),
Perm((5, 1, 4, 3, 0, 2, 6)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm(()), Perm((1, 0)), Perm((2, 0, 1)), Perm((0, 1, 2, 4, 3))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm(()), Perm((0,)), Perm((0, 1)), Perm((1, 0)), Perm((0, 3, 1, 2, 4))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 2, 1)),
Perm((3, 2, 0, 1)),
Perm((2, 1, 0, 3, 4)),
Perm((3, 4, 2, 0, 5, 1)),
Perm((4, 2, 7, 5, 0, 6, 3, 1)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable([Perm(()), Perm((2, 1, 0))])
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((2, 0, 1)),
Perm((3, 0, 1, 2)),
Perm((1, 0, 2, 3, 4)),
Perm((2, 3, 0, 1, 4)),
Perm((3, 4, 1, 0, 2)),
Perm((4, 3, 1, 6, 0, 7, 2, 5)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 1, 2)),
Perm((1, 2, 0)),
Perm((0, 2, 1, 3)),
Perm((0, 3, 2, 1)),
Perm((3, 0, 1, 2)),
Perm((1, 0, 2, 4, 5, 6, 3)),
Perm((1, 3, 6, 7, 2, 5, 4, 0)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm(()), Perm((0,)), Perm((1, 0)), Perm((4, 0, 1, 2, 3))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 1, 2)),
Perm((2, 1, 0)),
Perm((1, 2, 0, 3, 4)),
Perm((3, 1, 0, 4, 2)),
Perm((0, 5, 1, 4, 3, 2, 6)),
Perm((2, 5, 4, 3, 6, 0, 1)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((1, 0, 2)),
Perm((2, 3, 1, 0)),
Perm((0, 1, 4, 3, 2)),
Perm((0, 4, 3, 1, 2)),
Perm((4, 0, 1, 3, 2)),
Perm((0, 3, 2, 1, 5, 4)),
Perm((2, 4, 0, 3, 5, 1)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((2, 0, 1)),
Perm((3, 0, 1, 2)),
Perm((0, 1, 4, 3, 2)),
Perm((0, 5, 4, 7, 6, 3, 1, 2)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm(()), Perm((1, 0, 2)), Perm((2, 0, 1, 3)), Perm((0, 3, 1, 2, 4))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((1, 2, 0)),
Perm((2, 0, 1)),
Perm((2, 1, 0)),
Perm((3, 0, 2, 1)),
Perm((1, 4, 2, 3, 5, 0)),
Perm((1, 5, 0, 2, 3, 7, 6, 4)),
Perm((4, 6, 5, 0, 2, 1, 3, 7)),
Perm((7, 0, 1, 6, 3, 5, 4, 2)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 2, 1)),
Perm((2, 0, 1, 3)),
Perm((0, 3, 2, 4, 1)),
Perm((3, 4, 2, 0, 1)),
Perm((1, 4, 5, 2, 0, 3)),
Perm((4, 5, 3, 2, 0, 1)),
Perm((6, 3, 0, 1, 2, 5, 4)),
Perm((4, 3, 7, 5, 2, 1, 6, 0)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable([Perm((0,))])
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 1, 2)),
Perm((0, 2, 1)),
Perm((2, 0, 1)),
Perm((2, 3, 1, 0, 4)),
Perm((4, 1, 3, 0, 2)),
Perm((1, 3, 2, 0, 6, 4, 5)),
Perm((1, 4, 5, 3, 2, 6, 0)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((3, 2, 1, 0)),
Perm((1, 0, 2, 3, 4)),
Perm((3, 1, 4, 0, 2, 7, 5, 6)),
Perm((4, 5, 1, 7, 0, 6, 3, 2)),
Perm((6, 5, 2, 4, 3, 0, 1, 7)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm((0,)), Perm((0, 1)), Perm((1, 2, 0)), Perm((2, 1, 0))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm((2, 0, 1)), Perm((2, 1, 3, 0, 4))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((1, 0, 2)),
Perm((0, 2, 3, 1)),
Perm((2, 1, 3, 0)),
Perm((3, 0, 4, 1, 2)),
Perm((6, 5, 4, 2, 0, 3, 1)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 2, 1)),
Perm((2, 0, 1)),
Perm((2, 5, 4, 1, 0, 3)),
Perm((3, 2, 1, 5, 0, 4)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((1, 0, 2, 3)),
Perm((1, 3, 2, 0)),
Perm((2, 1, 3, 0, 4)),
Perm((5, 2, 1, 4, 3, 0)),
Perm((0, 5, 1, 3, 6, 4, 2)),
Perm((1, 4, 0, 2, 3, 5, 6)),
Perm((6, 1, 2, 0, 3, 5, 4)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm(()), Perm((1, 0)), Perm((1, 2, 0, 3)), Perm((3, 1, 0, 2))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 2, 1)),
Perm((1, 0, 2)),
Perm((1, 2, 0)),
Perm((3, 1, 2, 5, 4, 0)),
Perm((0, 2, 3, 5, 6, 4, 1)),
Perm((4, 5, 6, 3, 1, 0, 2)),
Perm((0, 4, 6, 7, 2, 3, 5, 1)),
Perm((7, 5, 3, 6, 2, 4, 0, 1)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm((0, 1)), Perm((0, 1, 2)), Perm((1, 3, 0, 2)), Perm((0, 3, 4, 2, 1))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((1, 0, 2)),
Perm((2, 0, 1)),
Perm((1, 3, 4, 0, 2)),
Perm((2, 6, 4, 5, 0, 1, 3)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((1, 2, 0)),
Perm((2, 1, 0)),
Perm((0, 2, 3, 1)),
Perm((0, 2, 3, 1, 4)),
Perm((0, 5, 3, 2, 4, 1)),
Perm((0, 4, 6, 1, 5, 3, 2)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((1, 2, 0)),
Perm((0, 2, 1, 3)),
Perm((0, 3, 2, 1)),
Perm((1, 0, 2, 3)),
Perm((3, 0, 1, 2, 4)),
Perm((4, 0, 1, 3, 2)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm(()), Perm((3, 1, 2, 0)), Perm((1, 0, 3, 2, 4)), Perm((4, 2, 1, 0, 3))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm((1, 2, 3, 0)), Perm((2, 1, 0, 3)), Perm((2, 1, 3, 0))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((1, 2, 0, 3)),
Perm((1, 2, 3, 0)),
Perm((4, 3, 0, 2, 1)),
Perm((2, 1, 4, 7, 6, 3, 0, 5)),
Perm((7, 4, 1, 3, 6, 5, 2, 0)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 2, 1)),
Perm((2, 3, 1, 0)),
Perm((3, 1, 0, 2)),
Perm((3, 1, 2, 0, 4)),
Perm((2, 3, 1, 0, 4, 5)),
Perm((1, 4, 7, 2, 6, 0, 5, 3)),
Perm((5, 0, 3, 1, 7, 2, 4, 6)),
]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[Perm(()), Perm((0, 1, 3, 2)), Perm((2, 1, 4, 3, 0))]
)
assert InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 1, 2)),
Perm((1, 0, 3, 2)),
Perm((0, 1, 4, 3, 2)),
Perm((1, 4, 3, 0, 2)),
Perm((4, 2, 1, 3, 0, 5)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((3, 1, 6, 2, 7, 0, 4, 5))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 3, 1, 2)),
Perm((1, 0, 3, 2)),
Perm((1, 3, 2, 0, 4)),
Perm((3, 2, 4, 0, 1)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((1, 2, 0)),
Perm((0, 2, 3, 1)),
Perm((2, 3, 1, 4, 0)),
Perm((2, 3, 4, 0, 1)),
Perm((4, 3, 1, 5, 0, 2)),
Perm((0, 3, 2, 6, 4, 1, 5)),
Perm((1, 5, 3, 2, 7, 6, 0, 4)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((2, 0, 1, 3)),
Perm((2, 1, 0, 3)),
Perm((3, 1, 0, 2)),
Perm((3, 1, 2, 0)),
Perm((6, 5, 1, 4, 0, 3, 2)),
Perm((4, 2, 1, 0, 6, 3, 7, 5)),
Perm((6, 2, 5, 3, 4, 1, 0, 7)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable([Perm((0, 3, 1, 2))])
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 2, 1, 3)),
Perm((1, 2, 3, 0, 4)),
Perm((3, 1, 0, 2, 4)),
Perm((3, 1, 4, 2, 0)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((1, 0, 2)), Perm((3, 1, 2, 6, 4, 0, 5)), Perm((3, 4, 2, 1, 0, 5, 6))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 2, 1, 3)),
Perm((2, 0, 3, 4, 1)),
Perm((3, 4, 0, 1, 2)),
Perm((4, 0, 3, 1, 2)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 4, 3, 2, 1)),
Perm((1, 0, 2, 4, 3)),
Perm((2, 3, 0, 1, 4)),
Perm((3, 1, 4, 2, 0)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((4, 5, 0, 1, 2, 6, 3))]
)
assert not InsertionEncodablePerms.is_insertion_encodable([Perm((0, 3, 4, 2, 1))])
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((3, 5, 6, 0, 7, 4, 2, 1))]
)
assert not InsertionEncodablePerms.is_insertion_encodable([Perm((3, 1, 2, 4, 0))])
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((1, 0, 3, 2)), Perm((2, 3, 0, 1, 4, 6, 5)), Perm((5, 4, 2, 1, 3, 0, 6))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((0, 2, 1)), Perm((2, 7, 6, 5, 0, 4, 3, 1))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((2, 0, 1, 3)),
Perm((3, 0, 2, 1)),
Perm((1, 2, 4, 0, 3)),
Perm((2, 3, 4, 0, 1)),
Perm((3, 0, 2, 4, 1)),
Perm((4, 2, 1, 0, 3)),
Perm((4, 3, 7, 2, 0, 6, 1, 5)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((2, 1, 0)), Perm((1, 3, 4, 2, 0)), Perm((0, 2, 6, 5, 1, 4, 3))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 4, 5, 1, 3, 2)),
Perm((5, 0, 2, 4, 1, 3)),
Perm((0, 2, 4, 3, 6, 5, 1)),
Perm((3, 0, 4, 1, 2, 5, 7, 6)),
Perm((7, 5, 1, 6, 3, 2, 4, 0)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable([Perm((0, 2, 1, 3))])
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((0, 1, 2)), Perm((0, 4, 5, 1, 3, 2)), Perm((0, 6, 3, 5, 4, 1, 2))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((0, 2, 1)), Perm((0, 2, 1, 3)), Perm((0, 2, 3, 1))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((1, 2, 0)), Perm((2, 0, 3, 1)), Perm((2, 1, 3, 0))]
)
assert not InsertionEncodablePerms.is_insertion_encodable([Perm((3, 2, 1, 0))])
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((0, 1, 2)), Perm((2, 3, 0, 4, 1))]
)
assert not InsertionEncodablePerms.is_insertion_encodable([Perm((0, 1, 2, 3))])
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((0, 3, 1, 2)), Perm((3, 1, 0, 2, 4))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((2, 1, 0)), Perm((0, 4, 3, 2, 1)), Perm((4, 3, 0, 2, 1))]
)
assert not InsertionEncodablePerms.is_insertion_encodable([Perm((2, 1, 3, 0))])
assert not InsertionEncodablePerms.is_insertion_encodable([Perm((1, 0, 2))])
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((1, 3, 4, 0, 2, 6, 5)), Perm((4, 0, 3, 1, 6, 5, 2))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((1, 2, 0)), Perm((3, 1, 2, 0))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((1, 2, 0, 3)),
Perm((4, 2, 0, 5, 1, 3)),
Perm((5, 4, 3, 1, 0, 2)),
Perm((6, 1, 5, 4, 0, 2, 3)),
Perm((7, 5, 4, 2, 0, 1, 6, 3)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((1, 3, 0, 2)), Perm((4, 0, 3, 2, 1))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((2, 3, 4, 5, 1, 0)),
Perm((1, 0, 5, 3, 4, 2, 6)),
Perm((0, 3, 6, 4, 1, 7, 2, 5)),
Perm((0, 6, 5, 3, 2, 1, 4, 7)),
Perm((4, 5, 1, 6, 0, 7, 3, 2)),
Perm((6, 3, 2, 5, 1, 4, 0, 7)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable([Perm((0, 1, 2))])
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((0, 1, 2)), Perm((0, 1, 2, 3, 4)), Perm((2, 5, 1, 0, 4, 6, 3))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[Perm((0, 3, 2, 1, 4)), Perm((4, 3, 1, 0, 2))]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((2, 0, 1)),
Perm((4, 2, 0, 1, 5, 6, 3)),
Perm((6, 7, 2, 4, 5, 1, 3, 0)),
Perm((7, 4, 2, 1, 3, 0, 5, 6)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((0, 1, 2)),
Perm((1, 3, 0, 2, 4)),
Perm((0, 1, 5, 3, 4, 6, 2)),
Perm((0, 2, 6, 1, 3, 4, 5)),
]
)
assert not InsertionEncodablePerms.is_insertion_encodable(
[
Perm((4, 2, 3, 1, 0)),
Perm((0, 4, 6, 3, 5, 2, 1)),
Perm((2, 3, 0, 6, 1, 5, 4)),
]
)

# Source/module_presentations.py (qt911025/qt-homemade-mod-osp, MIT)
from header_common import *
from header_presentations import *
from header_mission_templates import *
from ID_meshes import *
from header_operations import *
from header_triggers import *
from module_constants import *
import string
####################################################################################################################
# Each presentation record contains the following fields:
# 1) Presentation id: used for referencing presentations in other files. The prefix prsnt_ is automatically added before each presentation id.
# 2) Presentation flags. See header_presentations.py for a list of available flags.
# 3) Presentation background mesh. See module_meshes.py for a list of available background meshes.
# 4) Triggers: simple triggers that are associated with the presentation.
####################################################################################################################
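# A minimal record illustrating the four fields above (illustration only; the
# id "my_note" and the string "@Hello" are hypothetical, but the operations
# shown are used by the real records below):
#
#   ("my_note", prsntf_read_only, mesh_load_window, [
#     (ti_on_presentation_load, [
#        (create_text_overlay, reg0, "@Hello", tf_center_justify),
#        (presentation_set_duration, 200),
#     ]),
#   ]),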
presentations = [
("game_credits",prsntf_read_only,mesh_load_window,[
(ti_on_presentation_load,
[(assign, "$g_presentation_credits_obj_1", -1),
(assign, "$g_presentation_credits_obj_2", -1),
(assign, "$g_presentation_credits_obj_3", -1),
(assign, "$g_presentation_credits_obj_4", -1),
(assign, "$g_presentation_credits_obj_5", -1),
(assign, "$g_presentation_credits_obj_6", -1),
(assign, "$g_presentation_credits_obj_7", -1),
(assign, "$g_presentation_credits_obj_8", -1),
(assign, "$g_presentation_credits_obj_9", -1),
(assign, "$g_presentation_credits_obj_10", -1),
(assign, "$g_presentation_credits_obj_11", -1),
(assign, "$g_presentation_credits_obj_12", -1),
(assign, "$g_presentation_credits_obj_1_alpha", 0),
(assign, "$g_presentation_credits_obj_2_alpha", 0),
(assign, "$g_presentation_credits_obj_3_alpha", 0),
(assign, "$g_presentation_credits_obj_4_alpha", 0),
(assign, "$g_presentation_credits_obj_5_alpha", 0),
(assign, "$g_presentation_credits_obj_6_alpha", 0),
(assign, "$g_presentation_credits_obj_7_alpha", 0),
(assign, "$g_presentation_credits_obj_8_alpha", 0),
(assign, "$g_presentation_credits_obj_9_alpha", 0),
]),
(ti_on_presentation_run,
[
(store_trigger_param_1, ":cur_time"),
(set_fixed_point_multiplier, 1000),
(presentation_set_duration, 1000000),
(try_begin),
(this_or_next|key_clicked, key_space),
(this_or_next|key_clicked, key_enter),
(this_or_next|key_clicked, key_escape),
(this_or_next|key_clicked, key_back_space),
(this_or_next|key_clicked, key_left_mouse_button),
(this_or_next|key_clicked, key_right_mouse_button),
(this_or_next|key_clicked, key_xbox_ltrigger),
(key_clicked, key_xbox_rtrigger),
(presentation_set_duration, 0),
(try_end),
(try_begin),
(lt, "$g_presentation_credits_obj_1", 0),
(str_store_string, s1, "str_credits_1"),
(create_text_overlay, "$g_presentation_credits_obj_1", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_1", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_1", 0),
(position_set_x, pos1, 1500),
(position_set_y, pos1, 1500),
(overlay_set_size, "$g_presentation_credits_obj_1", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, 375),
(overlay_set_position, "$g_presentation_credits_obj_1", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_1", 1000, 0xFF),
(else_try),
(gt, ":cur_time", 2000),
(eq, "$g_presentation_credits_obj_1_alpha", 0),
(assign, "$g_presentation_credits_obj_1_alpha", 1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_1", 1000, 0x00),
(else_try),
(gt, ":cur_time", 3500),
(lt, "$g_presentation_credits_obj_2", 0),
(str_store_string, s1, "str_credits_2"),
(create_text_overlay, "$g_presentation_credits_obj_2", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_2", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_2", 0),
(position_set_x, pos1, 1750),
(position_set_y, pos1, 1750),
(overlay_set_size, "$g_presentation_credits_obj_2", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, 375),
(overlay_set_position, "$g_presentation_credits_obj_2", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_2", 1000, 0xFF),
(else_try),
(gt, ":cur_time", 5500),
(eq, "$g_presentation_credits_obj_2_alpha", 0),
(assign, "$g_presentation_credits_obj_2_alpha", 1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_2", 1000, 0x00),
(else_try),
(gt, ":cur_time", 7000),
(lt, "$g_presentation_credits_obj_3", 0),
(str_store_string, s1, "str_credits_3"),
(create_text_overlay, "$g_presentation_credits_obj_3", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_3", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_3", 0),
(position_set_x, pos1, 1750),
(position_set_y, pos1, 1750),
(overlay_set_size, "$g_presentation_credits_obj_3", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, 375),
(overlay_set_position, "$g_presentation_credits_obj_3", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_3", 1000, 0xFF),
(else_try),
(gt, ":cur_time", 9000),
(eq, "$g_presentation_credits_obj_3_alpha", 0),
(assign, "$g_presentation_credits_obj_3_alpha", 1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_3", 1000, 0),
(else_try),
(gt, ":cur_time", 10500),
(lt, "$g_presentation_credits_obj_4", 0),
(str_store_string, s1, "str_credits_4"),
(create_text_overlay, "$g_presentation_credits_obj_4", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_4", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_4", 0),
(position_set_x, pos1, 1750),
(position_set_y, pos1, 1750),
(overlay_set_size, "$g_presentation_credits_obj_4", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, 375),
(overlay_set_position, "$g_presentation_credits_obj_4", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_4", 1000, 0xFF),
(else_try),
(gt, ":cur_time", 12500),
(eq, "$g_presentation_credits_obj_4_alpha", 0),
(assign, "$g_presentation_credits_obj_4_alpha", 1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_4", 1000, 0),
(else_try),
(gt, ":cur_time", 14000),
(lt, "$g_presentation_credits_obj_5", 0),
(str_store_string, s1, "str_credits_5"),
(create_text_overlay, "$g_presentation_credits_obj_5", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_5", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_5", 0),
(position_set_x, pos1, 1750),
(position_set_y, pos1, 1750),
(overlay_set_size, "$g_presentation_credits_obj_5", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, 375),
(overlay_set_position, "$g_presentation_credits_obj_5", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_5", 1000, 0xFF),
(else_try),
(gt, ":cur_time", 16000),
(eq, "$g_presentation_credits_obj_5_alpha", 0),
(assign, "$g_presentation_credits_obj_5_alpha", 1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_5", 1000, 0),
(else_try),
(gt, ":cur_time", 17500),
(lt, "$g_presentation_credits_obj_6", 0),
(str_store_string, s1, "str_credits_6"),
(create_text_overlay, "$g_presentation_credits_obj_6", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_6", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_6", 0),
(position_set_x, pos1, 1750),
(position_set_y, pos1, 1750),
(overlay_set_size, "$g_presentation_credits_obj_6", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, 375),
(overlay_set_position, "$g_presentation_credits_obj_6", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_6", 1000, 0xFF),
(else_try),
(gt, ":cur_time", 19500),
(eq, "$g_presentation_credits_obj_6_alpha", 0),
(assign, "$g_presentation_credits_obj_6_alpha", 1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_6", 1000, 0),
(else_try),
(gt, ":cur_time", 21000),
(lt, "$g_presentation_credits_obj_7", 0),
(str_store_string, s1, "str_credits_7"),
(create_text_overlay, "$g_presentation_credits_obj_7", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_7", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_7", 0),
(position_set_x, pos1, 1750),
(position_set_y, pos1, 1750),
(overlay_set_size, "$g_presentation_credits_obj_7", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, 375),
(overlay_set_position, "$g_presentation_credits_obj_7", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_7", 1000, 0xFF),
(else_try),
(gt, ":cur_time", 23000),
(eq, "$g_presentation_credits_obj_7_alpha", 0),
(assign, "$g_presentation_credits_obj_7_alpha", 1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_7", 1000, 0),
(else_try),
(gt, ":cur_time", 24500),
(lt, "$g_presentation_credits_obj_8", 0),
(str_store_string, s1, "str_credits_8"),
(create_text_overlay, "$g_presentation_credits_obj_8", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_8", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_8", 0),
(position_set_x, pos1, 1750),
(position_set_y, pos1, 1750),
(overlay_set_size, "$g_presentation_credits_obj_8", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, 375),
(overlay_set_position, "$g_presentation_credits_obj_8", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_8", 1000, 0xFF),
(else_try),
(gt, ":cur_time", 26500),
(eq, "$g_presentation_credits_obj_8_alpha", 0),
(assign, "$g_presentation_credits_obj_8_alpha", 1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_8", 1000, 0),
(else_try),
(gt, ":cur_time", 28000),
(lt, "$g_presentation_credits_obj_9", 0),
(str_store_string, s1, "str_credits_10"),
(create_text_overlay, "$g_presentation_credits_obj_9", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_9", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_9", 0),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, "$g_presentation_credits_obj_9", pos1),
(position_set_x, pos1, 250),
(position_set_y, pos1, 485),
(overlay_set_position, "$g_presentation_credits_obj_9", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_9", 1000, 0xFF),
(str_store_string, s1, "str_credits_11"),
(create_text_overlay, "$g_presentation_credits_obj_10", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_10", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_10", 0),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, "$g_presentation_credits_obj_10", pos1),
(position_set_x, pos1, 750),
(position_set_y, pos1, 470),
(overlay_set_position, "$g_presentation_credits_obj_10", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_10", 1000, 0xFF),
(str_store_string, s1, "str_credits_12"),
(create_text_overlay, "$g_presentation_credits_obj_11", s1, tf_center_justify|tf_double_space|tf_vertical_align_center),
(overlay_set_color, "$g_presentation_credits_obj_11", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_11", 0),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, "$g_presentation_credits_obj_11", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, 105),
(overlay_set_position, "$g_presentation_credits_obj_11", pos1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_11", 1000, 0xFF),
(else_try),
(gt, ":cur_time", 34000),
(eq, "$g_presentation_credits_obj_9_alpha", 0),
(assign, "$g_presentation_credits_obj_9_alpha", 1),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_9", 1000, 0),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_10", 1000, 0),
(overlay_animate_to_alpha, "$g_presentation_credits_obj_11", 1000, 0),
(else_try),
(gt, ":cur_time", 35500),
(lt, "$g_presentation_credits_obj_12", 0),
(str_store_string, s1, "str_credits_9"),
(create_text_overlay, "$g_presentation_credits_obj_12", s1, tf_center_justify|tf_double_space),
(overlay_set_color, "$g_presentation_credits_obj_12", 0),
(overlay_set_alpha, "$g_presentation_credits_obj_12", 0xFF),
(position_set_x, pos1, 1000),
(position_set_y, pos1, 1000),
(overlay_set_size, "$g_presentation_credits_obj_12", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, -4800),
(overlay_set_position, "$g_presentation_credits_obj_12", pos1),
(position_set_x, pos1, 500),
(position_set_y, pos1, 760),
(overlay_animate_to_position, "$g_presentation_credits_obj_12", 70000, pos1),
(else_try),
(gt, ":cur_time", 105500),
(presentation_set_duration, 0),
(try_end),
]),
]),
("game_profile_banner_selection", 0, mesh_load_window, [
(ti_on_presentation_load, [
(set_fixed_point_multiplier, 1000),
(str_store_string, s1, "str_profile_banner_selection_text"),
(create_text_overlay, reg1, s1, tf_center_justify),
(position_set_x, pos1, 500),
(position_set_y, pos1, 600),
(overlay_set_position, reg1, pos1),
(overlay_set_text, reg1, s1),
(create_button_overlay, "$g_presentation_obj_profile_banner_selection_1", "@Next Page", tf_center_justify),
(position_set_x, pos1, 700),
(position_set_y, pos1, 50),
(overlay_set_position, "$g_presentation_obj_profile_banner_selection_1", pos1),
(create_button_overlay, "$g_presentation_obj_profile_banner_selection_2", "str_use_default_banner", tf_center_justify),
(position_set_x, pos1, 300),
(position_set_y, pos1, 50),
(overlay_set_position, "$g_presentation_obj_profile_banner_selection_2", pos1),
(assign, ":x_pos", 150),
(assign, ":y_pos", 575),
(store_mul, ":starting_banner", 16, "$g_presentation_page_no"),
(store_add, ":ending_banner", ":starting_banner", 16),
(store_add, "$g_presentation_banner_start", "$g_presentation_obj_profile_banner_selection_2", 1),
(assign, ":num_valid_banners", 0),
(try_for_range, ":cur_banner_mesh", banner_meshes_begin, banner_meshes_end_minus_one),
(assign, ":already_used", 0),
(try_for_range, ":cur_faction", multiplayer_factions_begin, multiplayer_factions_end),
(faction_slot_eq, ":cur_faction", slot_faction_banner, ":cur_banner_mesh"),
(assign, ":already_used", 1),
(try_end),
(eq, ":already_used", 0),
(val_add, ":num_valid_banners", 1),
(gt, ":num_valid_banners", ":starting_banner"),
(le, ":num_valid_banners", ":ending_banner"),
(create_image_button_overlay, reg1, ":cur_banner_mesh", ":cur_banner_mesh"),
(position_set_x, pos1, ":x_pos"),
(position_set_y, pos1, ":y_pos"),
(overlay_set_position, reg1, pos1),
(position_set_x, pos1, 100),
(position_set_y, pos1, 100),
(overlay_set_size, reg1, pos1),
(val_add, ":x_pos", 100),
(ge, ":x_pos", 900),
(assign, ":x_pos", 150),
(val_sub, ":y_pos", 250),
(try_end),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_event_state_change, [
(store_trigger_param_1, ":object"),
(try_begin),
(eq, ":object", "$g_presentation_obj_profile_banner_selection_1"),
(val_add, "$g_presentation_page_no", 1),
(val_mod, "$g_presentation_page_no", 8),
(presentation_set_duration, 0),
(start_presentation, "prsnt_game_profile_banner_selection"),
(else_try),
(eq, ":object", "$g_presentation_obj_profile_banner_selection_2"),
(profile_set_banner_id, -1),
(presentation_set_duration, 0),
(else_try),
(store_sub, ":selected_banner", ":object", "$g_presentation_banner_start"),
(store_mul, ":page_adder", 16, "$g_presentation_page_no"),
(val_add, ":selected_banner", ":page_adder"),
(assign, ":num_valid_banners", 0),
(assign, ":end_cond", banner_meshes_end_minus_one),
(try_for_range, ":cur_banner_mesh", banner_meshes_begin, ":end_cond"),
(assign, ":already_used", 0),
(try_for_range, ":cur_faction", multiplayer_factions_begin, multiplayer_factions_end),
(faction_slot_eq, ":cur_faction", slot_faction_banner, ":cur_banner_mesh"),
(assign, ":already_used", 1),
(try_end),
(eq, ":already_used", 0),
(try_begin),
(eq, ":selected_banner", ":num_valid_banners"),
(store_sub, ":selected_banner_index", ":cur_banner_mesh", banner_meshes_begin),
(profile_set_banner_id, ":selected_banner_index"),
(assign, ":end_cond", 0), #break
(try_end),
(val_add, ":num_valid_banners", 1),
(try_end),
(presentation_set_duration, 0),
(try_end),
]),
]),
# ("game_multiplayer_admin_panel", prsntf_manual_end_only, 0, [
# (ti_on_presentation_load,
# [(set_fixed_point_multiplier, 1000),
#
# (try_begin),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_1"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_1"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_2"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_2"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_3"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_3"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_4"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_4"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_5"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_5"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_6"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_6"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_7"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_7"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_8"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_8"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_9"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_9"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_10"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_10"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_11"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_11"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_12"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_12"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_13"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_13"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_14"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_14"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_15"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_15"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_16"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_16"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_17"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_17"),
# (else_try),
# (eq, "$g_multiplayer_selected_map", "scn_multi_scene_18"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_18"),
# (else_try),
# (this_or_next|eq, "$g_multiplayer_selected_map", "scn_random_multi_plain_medium"),
# (eq, "$g_multiplayer_selected_map", "scn_random_multi_plain_large"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_randomp"),
# (else_try),
# (this_or_next|eq, "$g_multiplayer_selected_map", "scn_random_multi_steppe_medium"),
# (eq, "$g_multiplayer_selected_map", "scn_random_multi_steppe_large"),
# (assign, ":map_image", "mesh_mp_ui_host_maps_randoms"),
# (else_try),
# (assign, ":map_image", "mesh_mp_ui_host_maps_randomp"),
# (try_end),
#
# (create_mesh_overlay, reg0, ":map_image"),
# (position_set_x, pos1, -1),
# (position_set_y, pos1, 550),
# (overlay_set_position, reg0, pos1),
# (position_set_x, pos1, 1002),
# (position_set_y, pos1, 1002),
# (overlay_set_size, reg0, pos1),
#
# (create_mesh_overlay, reg0, "mesh_mp_ui_host_main"),
# (position_set_x, pos1, -1),
# (position_set_y, pos1, -1),
# (overlay_set_position, reg0, pos1),
# (position_set_x, pos1, 1002),
# (position_set_y, pos1, 1002),
# (overlay_set_size, reg0, pos1),
#
# (assign, ":cur_y", 1240),
# (assign, ":cur_y_adder", 40),
#
# (try_begin),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_team_deathmatch),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_battle),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_destroy),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_capture_the_flag),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_headquarters),
# (eq, "$g_multiplayer_game_type", multiplayer_game_type_siege),
# (val_add, ":cur_y", ":cur_y_adder"), #two more options for these mods (friendly fire options)
# (val_add, ":cur_y", ":cur_y_adder"),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_capture_the_flag),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_battle),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_destroy),
# (eq, "$g_multiplayer_game_type", multiplayer_game_type_siege),
# (val_add, ":cur_y", ":cur_y_adder"), #one more option for these mods
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_battle),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_destroy),
# (eq, "$g_multiplayer_game_type", multiplayer_game_type_siege),
# (val_add, ":cur_y", ":cur_y_adder"), #one more option for these mods
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_battle),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_destroy),
# (eq, "$g_multiplayer_game_type", multiplayer_game_type_siege),
# (val_add, ":cur_y", ":cur_y_adder"), #one more option for these mods
# (try_end),
#
# (str_clear, s0),
# (create_text_overlay, "$g_presentation_obj_admin_panel_container", s0, tf_scrollable),
# (position_set_x, pos1, 59),
# (position_set_y, pos1, 50),
# (overlay_set_position, "$g_presentation_obj_admin_panel_container", pos1),
# (position_set_x, pos1, 640),
# (position_set_y, pos1, 520),
# (overlay_set_area_size, "$g_presentation_obj_admin_panel_container", pos1),
# (set_container_overlay, "$g_presentation_obj_admin_panel_container"),
#
#
# (create_text_overlay, reg0, "str_add_to_official_game_servers_list", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_14", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_14", pos1),
# (server_get_add_to_game_servers_list, ":add_to_servers_list"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_14", ":add_to_servers_list"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_enable_valve_anti_cheat", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_41", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_41", pos1),
# (server_get_anti_cheat, ":server_anti_cheat"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_41", ":server_anti_cheat"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_server_name", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (str_store_server_name, s0),
# (try_begin),
# (eq, "$g_multiplayer_renaming_server_allowed", 1),
# (create_simple_text_box_overlay, "$g_presentation_obj_admin_panel_20"),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_20", pos1),
# (overlay_set_text, "$g_presentation_obj_admin_panel_20", s0),
# (else_try),
# (assign, "$g_presentation_obj_admin_panel_20", -1),
# (create_text_overlay, reg0, s0, 0),
# (position_set_x, pos1, 385),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
# (try_end),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_game_password", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_simple_text_box_overlay, "$g_presentation_obj_admin_panel_9"),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_9", pos1),
# (str_store_server_password, s0),
# (overlay_set_text, "$g_presentation_obj_admin_panel_9", s0),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_welcome_message", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_simple_text_box_overlay, "$g_presentation_obj_admin_panel_32"),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_32", pos1),
# (str_store_welcome_message, s0),
# (overlay_set_text, "$g_presentation_obj_admin_panel_32", s0),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_map", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (call_script, "script_multiplayer_fill_map_game_types", "$g_multiplayer_game_type"),
# (assign, ":num_maps", reg0),
# (assign, ":selected_index", 0),
#
# (try_begin),
# (gt, ":num_maps", 12),
# (create_combo_label_overlay, "$g_presentation_obj_admin_panel_1"),
# (else_try),
# (create_combo_button_overlay, "$g_presentation_obj_admin_panel_1"),
# (try_end),
# (position_set_x, pos1, 800),
# (position_set_y, pos1, 800),
# (overlay_set_size, "$g_presentation_obj_admin_panel_1", pos1),
# (try_begin),
# (gt, ":num_maps", 14),
# (position_set_x, pos1, 465),
# (else_try),
# (position_set_x, pos1, 490),
# (try_end),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_1", pos1),
# (troop_get_slot, ":first_map", "trp_multiplayer_data", multi_data_maps_for_game_type_begin),
# (assign, ":selected_map_available", 0),
# (try_for_range, ":i_map", 0, ":num_maps"),
# (store_add, ":map_slot", ":i_map", multi_data_maps_for_game_type_begin),
# (troop_get_slot, ":map_no", "trp_multiplayer_data", ":map_slot"),
# (store_sub, ":string_index", ":map_no", multiplayer_scenes_begin),
# (val_add, ":string_index", multiplayer_scene_names_begin),
# (str_store_string, s0, ":string_index"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_1", s0),
# (try_begin),
# (eq, ":map_no", "$g_multiplayer_selected_map"),
# (assign, ":selected_index", ":i_map"),
# (assign, ":selected_map_available", 1),
# (try_end),
# (try_end),
# (overlay_set_val, "$g_presentation_obj_admin_panel_1", ":selected_index"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_game_type", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (try_begin),
# (eq, "$g_multiplayer_changing_game_type_allowed", 1),
# (create_combo_button_overlay, "$g_presentation_obj_admin_panel_10"),
# (position_set_x, pos1, 800),
# (position_set_y, pos1, 800),
# (overlay_set_size, "$g_presentation_obj_admin_panel_10", pos1),
# (position_set_x, pos1, 490),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_10", pos1),
# (try_for_range, ":i_game_type", 0, multiplayer_num_game_types),
# (store_add, ":string_index", ":i_game_type", multiplayer_game_type_names_begin),
# (str_store_string, s0, ":string_index"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_10", s0),
# (try_end),
# (overlay_set_val, "$g_presentation_obj_admin_panel_10", "$g_multiplayer_game_type"),
# (else_try),
# (assign, "$g_presentation_obj_admin_panel_10", -1),
# (store_add, ":string_index", "$g_multiplayer_game_type", multiplayer_game_type_names_begin),
# (str_store_string, s0, ":string_index"),
# (create_text_overlay, reg0, s0, 0),
# (position_set_x, pos1, 385),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
# (try_end),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (assign, reg1, 1),
# (create_text_overlay, reg0, "str_team_reg1_faction", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_combo_button_overlay, "$g_presentation_obj_admin_panel_11"),
# (position_set_x, pos1, 800),
# (position_set_y, pos1, 800),
# (overlay_set_size, "$g_presentation_obj_admin_panel_11", pos1),
# (position_set_x, pos1, 490),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_11", pos1),
# (call_script, "script_multiplayer_fill_available_factions_combo_button", "$g_presentation_obj_admin_panel_11", "$g_multiplayer_next_team_1_faction", "$g_multiplayer_next_team_2_faction"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (assign, reg1, 2),
# (create_text_overlay, reg0, "str_team_reg1_faction", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_combo_button_overlay, "$g_presentation_obj_admin_panel_12"),
# (position_set_x, pos1, 800),
# (position_set_y, pos1, 800),
# (overlay_set_size, "$g_presentation_obj_admin_panel_12", pos1),
# (position_set_x, pos1, 490),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_12", pos1),
# (call_script, "script_multiplayer_fill_available_factions_combo_button", "$g_presentation_obj_admin_panel_12", "$g_multiplayer_next_team_2_faction", "$g_multiplayer_next_team_1_faction"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (assign, reg1, 1),
# (create_text_overlay, reg0, "str_max_number_of_players", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_21", 2, 65),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_21", pos1),
# (server_get_max_num_players, ":max_players"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_21", ":max_players"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (assign, reg1, 1),
# (create_text_overlay, reg0, "str_number_of_bots_in_team_reg1", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_3", 0, "$g_multiplayer_max_num_bots"),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_3", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_3", "$g_multiplayer_num_bots_team_1"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (assign, reg1, 2),
# (create_text_overlay, reg0, "str_number_of_bots_in_team_reg1", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_4", 0, "$g_multiplayer_max_num_bots"),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_4", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_4", "$g_multiplayer_num_bots_team_2"),
#
# (try_begin),
# (neq, "$g_multiplayer_game_type", multiplayer_game_type_deathmatch),
# (neq, "$g_multiplayer_game_type", multiplayer_game_type_duel),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_allow_friendly_fire", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_5", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_5", pos1),
# (server_get_friendly_fire, ":server_friendly_fire"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_5", ":server_friendly_fire"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_allow_melee_friendly_fire", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_36", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_36", pos1),
# (server_get_melee_friendly_fire, ":melee_friendly_fire"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_36", ":melee_friendly_fire"),
# (try_end),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_friendly_fire_damage_self_ratio", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_37", 0, 101),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_37", pos1),
# (server_get_friendly_fire_damage_self_ratio, ":friendly_fire_damage_self_ratio"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_37", ":friendly_fire_damage_self_ratio"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_friendly_fire_damage_friend_ratio", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_38", 0, 101),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_38", pos1),
# (server_get_friendly_fire_damage_friend_ratio, ":friendly_fire_damage_friend_ratio"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_38", ":friendly_fire_damage_friend_ratio"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_spectator_camera", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_combo_button_overlay, "$g_presentation_obj_admin_panel_19"),
# (position_set_x, pos1, 800),
# (position_set_y, pos1, 800),
# (overlay_set_size, "$g_presentation_obj_admin_panel_19", pos1),
# (position_set_x, pos1, 490),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_19", pos1),
# (overlay_add_item, "$g_presentation_obj_admin_panel_19", "str_free"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_19", "str_stick_to_any_player"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_19", "str_stick_to_team_members"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_19", "str_stick_to_team_members_view"),
# (server_get_ghost_mode, ":server_ghost_mode"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_19", ":server_ghost_mode"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_control_block_direction", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_combo_button_overlay, "$g_presentation_obj_admin_panel_15"),
# (position_set_x, pos1, 800),
# (position_set_y, pos1, 800),
# (overlay_set_size, "$g_presentation_obj_admin_panel_15", pos1),
# (position_set_x, pos1, 490),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_15", pos1),
# (overlay_add_item, "$g_presentation_obj_admin_panel_15", "str_automatic"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_15", "str_by_mouse_movement"),
# (server_get_control_block_dir, ":server_control_block_dir"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_15", ":server_control_block_dir"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_combat_speed", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_combo_button_overlay, "$g_presentation_obj_admin_panel_26"),
# (position_set_x, pos1, 800),
# (position_set_y, pos1, 800),
# (overlay_set_size, "$g_presentation_obj_admin_panel_26", pos1),
# (position_set_x, pos1, 490),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_26", pos1),
# (overlay_add_item, "$g_presentation_obj_admin_panel_26", "str_combat_speed_0"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_26", "str_combat_speed_1"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_26", "str_combat_speed_2"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_26", "str_combat_speed_3"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_26", "str_combat_speed_4"),
# (server_get_combat_speed, ":server_combat_speed"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_26", ":server_combat_speed"),
#
# (try_begin),
# (neq, "$g_multiplayer_game_type", multiplayer_game_type_headquarters),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_map_time_limit", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_7", 5, 121),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_7", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_7", "$g_multiplayer_game_max_minutes"),
# (else_try),
# (assign, "$g_presentation_obj_admin_panel_7", -1),
# (try_end),
#
# (try_begin),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_battle),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_destroy),
# (eq, "$g_multiplayer_game_type", multiplayer_game_type_siege),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_round_time_limit", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_16", multiplayer_round_max_seconds_min, multiplayer_round_max_seconds_max),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_16", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_16", "$g_multiplayer_round_max_seconds"),
# (else_try),
# (assign, "$g_presentation_obj_admin_panel_16", -1),
# (try_end),
#
# (try_begin),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_battle),
# (eq, "$g_multiplayer_game_type", multiplayer_game_type_destroy),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_players_take_control_of_a_bot_after_death", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_25", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_25", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_25", "$g_multiplayer_player_respawn_as_bot"),
# (else_try),
# (assign, "$g_presentation_obj_admin_panel_25", -1),
# (try_end),
#
# (try_begin),
# (eq, "$g_multiplayer_game_type", multiplayer_game_type_siege),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_defender_spawn_count_limit", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_combo_button_overlay, "$g_presentation_obj_admin_panel_27"),
# (position_set_x, pos1, 800),
# (position_set_y, pos1, 800),
# (overlay_set_size, "$g_presentation_obj_admin_panel_27", pos1),
# (position_set_x, pos1, 490),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_27", pos1),
# (assign, reg0, 5),
# (overlay_add_item, "$g_presentation_obj_admin_panel_27", "str_reg0"),
# (assign, reg0, 4),
# (overlay_add_item, "$g_presentation_obj_admin_panel_27", "str_reg0"),
# (assign, reg0, 3),
# (overlay_add_item, "$g_presentation_obj_admin_panel_27", "str_reg0"),
# (assign, reg0, 2),
# (overlay_add_item, "$g_presentation_obj_admin_panel_27", "str_reg0"),
# (assign, reg0, 1),
# (overlay_add_item, "$g_presentation_obj_admin_panel_27", "str_reg0"),
# (overlay_add_item, "$g_presentation_obj_admin_panel_27", "str_unlimited"),
#
# (store_sub, ":value_to_set", 5, "$g_multiplayer_number_of_respawn_count"), # invert: the combo items run 5..1 with "unlimited" last
# (overlay_set_val, "$g_presentation_obj_admin_panel_27", ":value_to_set"),
# #(val_sub, ":cur_y", ":cur_y_adder"),
# (else_try),
# (assign, "$g_presentation_obj_admin_panel_27", -1),
# (try_end),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_team_points_limit", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_8", 3, 1001),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_8", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_8", "$g_multiplayer_game_max_points"),
#
# (try_begin),
# (eq, "$g_multiplayer_game_type", multiplayer_game_type_headquarters),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_point_gained_from_flags", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_17", 25, 401),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_17", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_17", "$g_multiplayer_point_gained_from_flags"),
# (else_try),
# (assign, "$g_presentation_obj_admin_panel_17", -1),
# (try_end),
#
# (try_begin),
# (eq, "$g_multiplayer_game_type", multiplayer_game_type_capture_the_flag),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_point_gained_from_capturing_flag", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_18", 0, 11),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_18", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_18", "$g_multiplayer_point_gained_from_capturing_flag"),
# (else_try),
# (assign, "$g_presentation_obj_admin_panel_18", -1),
# (try_end),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_respawn_period", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_6", multiplayer_respawn_period_min, multiplayer_respawn_period_max),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_6", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_6", "$g_multiplayer_respawn_period"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_initial_gold_multiplier", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_33", 0, 1001),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_33", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_33", "$g_multiplayer_initial_gold_multiplier"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_battle_earnings_multiplier", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_34", 0, 1001),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_34", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_34", "$g_multiplayer_battle_earnings_multiplier"),
#
# (try_begin),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_battle),
# (this_or_next|eq, "$g_multiplayer_game_type", multiplayer_game_type_destroy),
# (eq, "$g_multiplayer_game_type", multiplayer_game_type_siege),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_round_earnings_multiplier", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_35", 0, 1001),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_35", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_35", "$g_multiplayer_round_earnings_multiplier"),
# (else_try),
# (assign, "$g_presentation_obj_admin_panel_35", -1),
# (try_end),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_make_kick_voteable", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_28", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_28", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_28", "$g_multiplayer_kick_voteable"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_make_ban_voteable", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_29", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_29", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_29", "$g_multiplayer_ban_voteable"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_make_maps_voteable", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_24", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_24", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_24", "$g_multiplayer_maps_voteable"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_make_factions_voteable", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_23", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_23", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_23", "$g_multiplayer_factions_voteable"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_bots_upper_limit_for_votes", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (assign, ":upper_limit", 51),
# (val_min, ":upper_limit", "$g_multiplayer_max_num_bots"),
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_22", 0, ":upper_limit"),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_22", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_22", "$g_multiplayer_num_bots_voteable"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_valid_vote_ratio", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_number_box_overlay, "$g_presentation_obj_admin_panel_30", 50, 101),
# (position_set_x, pos1, 390),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_30", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_30", "$g_multiplayer_valid_vote_ratio"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_auto_team_balance_limit", 0),
# (position_set_x, pos1, 0),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_combo_button_overlay, "$g_presentation_obj_admin_panel_31"),
# (position_set_x, pos1, 800),
# (position_set_y, pos1, 800),
# (overlay_set_size, "$g_presentation_obj_admin_panel_31", pos1),
# (position_set_x, pos1, 490),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_31", pos1),
# (overlay_add_item, "$g_presentation_obj_admin_panel_31", "str_unlimited"),
# (assign, reg0, 6),
# (overlay_add_item, "$g_presentation_obj_admin_panel_31", "str_reg0"),
# (assign, reg0, 5),
# (overlay_add_item, "$g_presentation_obj_admin_panel_31", "str_reg0"),
# (assign, reg0, 4),
# (overlay_add_item, "$g_presentation_obj_admin_panel_31", "str_reg0"),
# (assign, reg0, 3),
# (overlay_add_item, "$g_presentation_obj_admin_panel_31", "str_reg0"),
# (assign, reg0, 2),
# (overlay_add_item, "$g_presentation_obj_admin_panel_31", "str_reg0"),
# (try_begin),
# (ge, "$g_multiplayer_auto_team_balance_limit", 1000), # 1000 and above is treated as "unlimited"
# (overlay_set_val, "$g_presentation_obj_admin_panel_31", 0),
# (else_try),
# (store_sub, ":set_value", 7, "$g_multiplayer_auto_team_balance_limit"),
# (overlay_set_val, "$g_presentation_obj_admin_panel_31", ":set_value"),
# (try_end),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_allow_player_banners", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_39", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_39", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_39", "$g_multiplayer_allow_player_banners"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_disallow_ranged_weapons", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_42", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_42", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_42", "$g_multiplayer_disallow_ranged_weapons"),
#
# (val_sub, ":cur_y", ":cur_y_adder"),
#
# (create_text_overlay, reg0, "str_force_default_armor", 0),
# (position_set_x, pos1, 30),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, reg0, pos1),
#
# (create_check_box_overlay, "$g_presentation_obj_admin_panel_40", "mesh_checkbox_off", "mesh_checkbox_on"),
# (position_set_x, pos1, 7),
# (store_add, ":special_cur_y", ":cur_y", 7),
# (position_set_y, pos1, ":special_cur_y"),
# (overlay_set_position, "$g_presentation_obj_admin_panel_40", pos1),
# (overlay_set_val, "$g_presentation_obj_admin_panel_40", "$g_multiplayer_force_default_armor"),
#
# (set_container_overlay, -1),
#
# (create_button_overlay, "$g_presentation_obj_admin_panel_13", "str_back", tf_center_justify),
# (position_set_x, pos1, 825),
# (position_set_y, pos1, 50),
# (overlay_set_position, "$g_presentation_obj_admin_panel_13", pos1),
# (position_set_x, pos1, 1500),
# (position_set_y, pos1, 1500),
# (overlay_set_size, "$g_presentation_obj_admin_panel_13", pos1),
#
# (create_button_overlay, "$g_presentation_obj_admin_panel_2", "str_start_map", tf_center_justify),
# (position_set_x, pos1, 825),
# (position_set_y, pos1, 90),
# (overlay_set_position, "$g_presentation_obj_admin_panel_2", pos1),
# (position_set_x, pos1, 1500),
# (position_set_y, pos1, 1500),
# (overlay_set_size, "$g_presentation_obj_admin_panel_2", pos1),
#
# (presentation_set_duration, 999999),
#
# (try_begin),
# (eq, ":selected_map_available", 0),
# (assign, "$g_multiplayer_selected_map", ":first_map"),
# (presentation_set_duration, 0),
# (start_presentation, "prsnt_game_multiplayer_admin_panel"),
# (try_end),
# ]),
#
# (ti_on_presentation_event_state_change,
# [(store_trigger_param_1, ":object"),
# (store_trigger_param_2, ":value"),
# (try_begin),
# (eq, ":object", "$g_presentation_obj_admin_panel_1"),
# (store_add, ":slot_no", ":value", multi_data_maps_for_game_type_begin),
# (troop_get_slot, "$g_multiplayer_selected_map", "trp_multiplayer_data", ":slot_no"),
# (presentation_set_duration, 0),
# (start_presentation, "prsnt_game_multiplayer_admin_panel"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_2"),
# (multiplayer_send_2_int_to_server, multiplayer_event_admin_start_map, "$g_multiplayer_selected_map", "$g_multiplayer_game_type"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_3"),
# (multiplayer_send_2_int_to_server, multiplayer_event_admin_set_num_bots_in_team, 1, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_4"),
# (multiplayer_send_2_int_to_server, multiplayer_event_admin_set_num_bots_in_team, 2, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_5"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_friendly_fire, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_6"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_respawn_period, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_7"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_game_max_minutes, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_8"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_game_max_points, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_9"),
# (multiplayer_send_string_to_server, multiplayer_event_admin_set_game_password, s0),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_10"),
# (assign, "$g_multiplayer_game_type", ":value"),
# (presentation_set_duration, 0),
# (start_presentation, "prsnt_game_multiplayer_admin_panel"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_11"),
# (store_add, "$g_multiplayer_next_team_1_faction", ":value", multiplayer_factions_begin),
### (try_begin),
### (ge, "$g_multiplayer_next_team_1_faction", "$g_multiplayer_next_team_2_faction"),
### (val_add, "$g_multiplayer_next_team_1_faction", 1),
### (try_end),
# (multiplayer_send_2_int_to_server, multiplayer_event_admin_set_team_faction, 1, "$g_multiplayer_next_team_1_faction"),
# (presentation_set_duration, 0),
# (start_presentation, "prsnt_game_multiplayer_admin_panel"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_12"),
# (store_add, "$g_multiplayer_next_team_2_faction", ":value", multiplayer_factions_begin),
### (try_begin),
### (ge, "$g_multiplayer_next_team_2_faction", "$g_multiplayer_next_team_1_faction"),
### (val_add, "$g_multiplayer_next_team_2_faction", 1),
### (try_end),
# (multiplayer_send_2_int_to_server, multiplayer_event_admin_set_team_faction, 2, "$g_multiplayer_next_team_2_faction"),
# (presentation_set_duration, 0),
# (start_presentation, "prsnt_game_multiplayer_admin_panel"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_13"),
# (presentation_set_duration, 0),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_14"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_add_to_servers_list, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_15"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_control_block_dir, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_16"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_round_max_seconds, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_17"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_point_gained_from_flags, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_18"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_point_gained_from_capturing_flag, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_19"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_ghost_mode, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_20"),
# (multiplayer_send_string_to_server, multiplayer_event_admin_set_server_name, s0),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_21"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_max_num_players, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_22"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_num_bots_voteable, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_23"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_factions_voteable, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_24"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_maps_voteable, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_25"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_player_respawn_as_bot, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_26"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_combat_speed, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_27"),
# (store_sub, ":value_to_send", 5, ":value"), # the combo box lists respawn counts in descending order, so invert the index before sending
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_respawn_count, ":value_to_send"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_28"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_kick_voteable, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_29"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_ban_voteable, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_30"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_valid_vote_ratio, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_31"),
# (try_begin),
# (eq, ":value", 0),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_auto_team_balance_limit, 1000),
# (else_try),
# (store_sub, ":value_to_send", 7, ":value"), # invert: the combo items run "unlimited", 6, 5, ..., 2
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_auto_team_balance_limit, ":value_to_send"),
# (try_end),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_32"),
# (server_set_welcome_message, s0),
# (multiplayer_send_string_to_server, multiplayer_event_admin_set_welcome_message, s0),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_33"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_initial_gold_multiplier, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_34"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_battle_earnings_multiplier, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_35"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_round_earnings_multiplier, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_36"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_melee_friendly_fire, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_37"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_friendly_fire_damage_self_ratio, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_38"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_friendly_fire_damage_friend_ratio, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_39"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_allow_player_banners, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_40"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_force_default_armor, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_41"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_anti_cheat, ":value"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_admin_panel_42"),
# (multiplayer_send_int_to_server, multiplayer_event_admin_set_disallow_ranged_weapons, ":value"),
# (try_end),
# ]),
# (ti_on_presentation_run,
# [
# (try_begin),
# (this_or_next|key_clicked, key_escape),
# (key_clicked, key_xbox_start),
# (presentation_set_duration, 0),
# (try_end),
# ]),
# ]),
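The commented-out admin panel above builds every settings row the same way: a text label at the left edge, a control (number box, checkbox, or combo button) at a fixed x offset around 390-490, and `:cur_y` stepped down by `:cur_y_adder` between rows. A minimal Python sketch of that layout loop, with hypothetical helper names rather than actual module-system opcodes:

```python
# Illustrative sketch of the admin panel's repeating row layout
# (hypothetical helpers, not module-system opcodes).

def layout_rows(rows, start_y=600, row_height=40, control_x=390):
    """Return (name, x, y) placements: label at x=0, control at control_x.

    Each iteration mirrors one (create_text_overlay ... create_*_overlay)
    pair in the presentation, and the final decrement mirrors
    (val_sub, ":cur_y", ":cur_y_adder").
    """
    placements = []
    y = start_y
    for label, control in rows:
        placements.append((label, 0, y))          # label overlay
        placements.append((control, control_x, y))  # control overlay
        y -= row_height                           # step down to the next row
    return placements

if __name__ == "__main__":
    rows = [("str_map_time_limit", "admin_panel_7"),
            ("str_team_points_limit", "admin_panel_8")]
    for name, x, y in layout_rows(rows):
        print(name, x, y)
```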
# Shows the server's welcome message (or, on request, the server rules) in a
# scrollable panel when the player joins a multiplayer game.
("multiplayer_welcome_message", prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[(set_fixed_point_multiplier, 1000),
(str_store_welcome_message, s0),
(try_begin),
# First case: a welcome message is set and has not been shown yet.
(neg|str_is_empty, s0),
(eq, "$g_multiplayer_welcome_message_shown", 0),
(create_mesh_overlay, reg0, "mesh_mp_ui_welcome_panel"),
(position_set_x, pos1, 200),
(position_set_y, pos1, 400),
(overlay_set_position, reg0, pos1),
(create_text_overlay, reg0, s0, tf_scrollable),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, 230),
(position_set_y, pos1, 425),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 540),
(position_set_y, pos1, 150),
(overlay_set_area_size, reg0, pos1),
(presentation_set_duration, 999999),
(else_try),
# Second case: the player asked to see the server rules.
(eq, "$g_multiplayer_show_server_rules", 1),
(create_mesh_overlay, reg0, "mesh_mp_ui_welcome_panel"),
(position_set_x, pos1, 200),
(position_set_y, pos1, 400),
(overlay_set_position, reg0, pos1),
(try_begin),
(neg|str_is_empty, s0),
(str_clear, s3),
(str_store_string, s2, s0),
(str_store_string, s2, "str_s2_s3"),
(str_store_string, s2, "str_s2_s3"),
(else_try),
(str_clear, s2),
(try_end),
(str_store_string, s3, "@Game Rules:^"),
(str_store_string, s2, "str_s2_s3"),
(assign, ":end_cond", 1000), # upper bound only; the loop below stops at the first unused option
(call_script, "script_game_multiplayer_get_game_type_mission_template"),
(assign, ":cur_mt", reg0),
(str_store_server_name, s0),
(str_store_string, s3, "str_server_name_s0"),
(str_store_string, s2, "str_s2_s3"),
(str_store_string, s3, "str_game_type_s0"),
(str_store_string, s2, "str_s2_s3"),
(store_current_scene, ":cur_scene"),
# Scene ids and their name strings are registered in the same order, so the
# matching string can be found by offset from the first entry.
(val_sub, ":cur_scene", "scn_multi_scene_1"),
(val_add, ":cur_scene", "str_multi_scene_1"),
(str_store_string, s0, ":cur_scene"),
(str_store_string, s3, "str_map_name_s0"),
(str_store_string, s2, "str_s2_s3"),
(store_mission_timer_a, ":mission_timer"),
(val_add, ":mission_timer", "$server_mission_timer_while_player_joined"),
(assign, reg0, ":mission_timer"),
(store_mul, "$g_multiplayer_game_max_seconds", "$g_multiplayer_game_max_minutes", 60),
(store_sub, ":remaining_seconds", "$g_multiplayer_game_max_seconds", ":mission_timer"),
(store_div, reg0, ":remaining_seconds", 60), # minutes
(store_mod, reg1, ":remaining_seconds", 60), # seconds
(try_begin), # build "0" padding prefixes in s0/s1 for single-digit values
(ge, reg0, 10),
(ge, reg1, 10),
(str_clear, s0),
(str_clear, s1),
(else_try),
(ge, reg0, 10),
(str_clear, s0),
(str_store_string, s1, "@0"),
(else_try),
(ge, reg1, 10),
(str_store_string, s0, "@0"),
(str_clear, s1),
(else_try),
(str_store_string, s0, "@0"),
(str_store_string, s1, "@0"),
(try_end),
(str_store_string, s3, "str_remaining_time_s0reg0_s1reg1"),
(str_store_string, s2, "str_s2_s3"),
(try_for_range, ":cur_option", 0, ":end_cond"),
(assign, reg0, -12345), # sentinel: lets us detect when the script below leaves reg0 unset
(call_script, "script_game_get_multiplayer_server_option_for_mission_template", ":cur_mt", ":cur_option"),
(try_begin),
(eq, reg0, -12345), # sentinel untouched: no option at this index
(assign, ":end_cond", 0), #break
(else_try),
(call_script, "script_game_multiplayer_server_option_for_mission_template_to_string", ":cur_mt", ":cur_option", reg0),
(str_store_string, s3, s0),
(str_store_string, s2, "str_s2_s3"),
(try_end),
(try_end),
(create_text_overlay, reg0, s2, tf_scrollable),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, 230),
(position_set_y, pos1, 425),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 540),
(position_set_y, pos1, 150),
(overlay_set_area_size, reg0, pos1),
(presentation_set_duration, 999999),
(try_end),
]),
(ti_on_presentation_run,
[
(str_store_welcome_message, s0),
(try_begin),
(neq, "$g_multiplayer_show_server_rules", 1),
(this_or_next|str_is_empty, s0),
(eq, "$g_multiplayer_welcome_message_shown", 1),
(presentation_set_duration, 0),
(neg|is_presentation_active, "prsnt_multiplayer_escape_menu"),
# (neg|is_presentation_active, "prsnt_multiplayer_team_select"),
(multiplayer_get_my_player, ":my_player_no"),
(player_set_troop_id, ":my_player_no", -1),
(multiplayer_send_int_to_server, multiplayer_event_change_team_no, multi_team_spectator),
(player_set_team_no, ":my_player_no", multi_team_spectator),
(start_presentation, "prsnt_multiplayer_escape_menu"),
(else_try),
(store_mission_timer_a, ":mission_timer"),
(gt, ":mission_timer", 1),
(this_or_next|key_clicked, key_escape),
(this_or_next|key_clicked, key_space),
(this_or_next|key_clicked, key_enter),
(this_or_next|key_clicked, key_left_mouse_button),
(this_or_next|key_clicked, key_right_mouse_button),
(this_or_next|key_clicked, key_xbox_ltrigger),
(key_clicked, key_xbox_rtrigger),
(assign, "$g_multiplayer_welcome_message_shown", 1),
(presentation_set_duration, 0),
(neg|is_presentation_active, "prsnt_multiplayer_escape_menu"),
# (neg|is_presentation_active, "prsnt_multiplayer_team_select"),
(try_begin),
(eq, "$g_multiplayer_show_server_rules", 1),
(assign, "$g_multiplayer_show_server_rules", 0),
(start_presentation, "prsnt_multiplayer_escape_menu"),
(else_try),
(multiplayer_get_my_player, ":my_player_no"),
(player_set_troop_id, ":my_player_no", -1),
(multiplayer_send_int_to_server, multiplayer_event_change_team_no, multi_team_spectator),
(player_set_team_no, ":my_player_no", multi_team_spectator),
(start_presentation, "prsnt_multiplayer_escape_menu"),
(try_end),
(try_end),
]),
]),
("multiplayer_item_select", prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[(set_fixed_point_multiplier, 1000),
(multiplayer_get_my_player, ":my_player_no"),
(assign, "$g_presentation_obj_item_select_1", -1),
(assign, "$g_presentation_obj_item_select_2", -1),
(assign, "$g_presentation_obj_item_select_3", -1),
(assign, "$g_presentation_obj_item_select_4", -1),
(assign, "$g_presentation_obj_item_select_5", -1),
(assign, "$g_presentation_obj_item_select_6", -1),
(assign, "$g_presentation_obj_item_select_7", -1),
(assign, "$g_presentation_obj_item_select_8", -1),
(assign, "$g_presentation_obj_item_select_9", -1),
(assign, "$g_presentation_obj_item_select_10", -1),
(assign, "$g_presentation_obj_item_select_11", -1),
(assign, "$g_presentation_obj_item_select_12", -1),
(assign, "$g_presentation_obj_item_select_13", -1),
(assign, "$g_presentation_obj_item_select_14", -1),
(assign, "$g_presentation_obj_item_select_15", -1),
(assign, "$g_presentation_obj_item_select_16", -1),
(try_begin),
(neq, "$g_current_opened_item_details", -1),
(close_item_details),
(assign, "$g_current_opened_item_details", -1),
(try_end),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 0),
(player_get_slot, ":selected_item_id", ":my_player_no", ":selected_item_index"),
(try_begin),
(ge, ":selected_item_id", 0),
(create_image_button_overlay, "$g_presentation_obj_item_select_1", "mesh_mp_inventory_slot_empty", "mesh_mp_inventory_slot_empty"),
(create_mesh_overlay_with_item_id, reg0, ":selected_item_id"),
(position_set_x, pos1, 950),
(position_set_y, pos1, 526),
(overlay_set_position, reg0, pos1),
(assign, "$g_inside_obj_1", reg0),
(else_try),
(create_image_button_overlay, "$g_presentation_obj_item_select_1", "mesh_mp_inventory_slot_equip", "mesh_mp_inventory_slot_equip"),
(try_end),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, "$g_presentation_obj_item_select_1", pos1),
(position_set_x, pos1, 899),
(position_set_y, pos1, 475),
(overlay_set_position, "$g_presentation_obj_item_select_1", pos1),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 1),
(player_get_slot, ":selected_item_id", ":my_player_no", ":selected_item_index"),
(try_begin),
(ge, ":selected_item_id", 0),
(create_image_button_overlay, "$g_presentation_obj_item_select_2", "mesh_mp_inventory_slot_empty", "mesh_mp_inventory_slot_empty"),
(create_mesh_overlay_with_item_id, reg0, ":selected_item_id"),
(position_set_x, pos1, 950),
(position_set_y, pos1, 426),
(overlay_set_position, reg0, pos1),
(assign, "$g_inside_obj_2", reg0),
(else_try),
(create_image_button_overlay, "$g_presentation_obj_item_select_2", "mesh_mp_inventory_slot_equip", "mesh_mp_inventory_slot_equip"),
(try_end),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, "$g_presentation_obj_item_select_2", pos1),
(position_set_x, pos1, 899),
(position_set_y, pos1, 375),
(overlay_set_position, "$g_presentation_obj_item_select_2", pos1),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 2),
(player_get_slot, ":selected_item_id", ":my_player_no", ":selected_item_index"),
(try_begin),
(ge, ":selected_item_id", 0),
(create_image_button_overlay, "$g_presentation_obj_item_select_3", "mesh_mp_inventory_slot_empty", "mesh_mp_inventory_slot_empty"),
(create_mesh_overlay_with_item_id, reg0, ":selected_item_id"),
(position_set_x, pos1, 950),
(position_set_y, pos1, 326),
(overlay_set_position, reg0, pos1),
(assign, "$g_inside_obj_3", reg0),
(else_try),
(create_image_button_overlay, "$g_presentation_obj_item_select_3", "mesh_mp_inventory_slot_equip", "mesh_mp_inventory_slot_equip"),
(try_end),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, "$g_presentation_obj_item_select_3", pos1),
(position_set_x, pos1, 899),
(position_set_y, pos1, 275),
(overlay_set_position, "$g_presentation_obj_item_select_3", pos1),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 3),
(player_get_slot, ":selected_item_id", ":my_player_no", ":selected_item_index"),
(try_begin),
(ge, ":selected_item_id", 0),
(create_image_button_overlay, "$g_presentation_obj_item_select_4", "mesh_mp_inventory_slot_empty", "mesh_mp_inventory_slot_empty"),
(create_mesh_overlay_with_item_id, reg0, ":selected_item_id"),
(position_set_x, pos1, 950),
(position_set_y, pos1, 226),
(overlay_set_position, reg0, pos1),
(assign, "$g_inside_obj_4", reg0),
(else_try),
(create_image_button_overlay, "$g_presentation_obj_item_select_4", "mesh_mp_inventory_slot_equip", "mesh_mp_inventory_slot_equip"),
(try_end),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, "$g_presentation_obj_item_select_4", pos1),
(position_set_x, pos1, 899),
(position_set_y, pos1, 175),
(overlay_set_position, "$g_presentation_obj_item_select_4", pos1),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 4),
(player_get_slot, ":selected_item_id", ":my_player_no", ":selected_item_index"),
(try_begin),
(ge, ":selected_item_id", 0),
(create_image_button_overlay, "$g_presentation_obj_item_select_5", "mesh_mp_inventory_slot_empty", "mesh_mp_inventory_slot_empty"),
(create_mesh_overlay_with_item_id, reg0, ":selected_item_id"),
(position_set_x, pos1, 53),
(position_set_y, pos1, 576),
(overlay_set_position, reg0, pos1),
(assign, "$g_inside_obj_5", reg0),
(else_try),
(create_image_button_overlay, "$g_presentation_obj_item_select_5", "mesh_mp_inventory_slot_helmet", "mesh_mp_inventory_slot_helmet"),
(try_end),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, "$g_presentation_obj_item_select_5", pos1),
(position_set_x, pos1, 2),
(position_set_y, pos1, 525),
(overlay_set_position, "$g_presentation_obj_item_select_5", pos1),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 5),
(player_get_slot, ":selected_item_id", ":my_player_no", ":selected_item_index"),
(try_begin),
(ge, ":selected_item_id", 0),
(create_image_button_overlay, "$g_presentation_obj_item_select_6", "mesh_mp_inventory_slot_empty", "mesh_mp_inventory_slot_empty"),
(create_mesh_overlay_with_item_id, reg0, ":selected_item_id"),
(position_set_x, pos1, 53),
(position_set_y, pos1, 476),
(overlay_set_position, reg0, pos1),
(assign, "$g_inside_obj_6", reg0),
(else_try),
(create_image_button_overlay, "$g_presentation_obj_item_select_6", "mesh_mp_inventory_slot_armor", "mesh_mp_inventory_slot_armor"),
(try_end),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, "$g_presentation_obj_item_select_6", pos1),
(position_set_x, pos1, 2),
(position_set_y, pos1, 425),
(overlay_set_position, "$g_presentation_obj_item_select_6", pos1),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 6),
(player_get_slot, ":selected_item_id", ":my_player_no", ":selected_item_index"),
(try_begin),
(ge, ":selected_item_id", 0),
(create_image_button_overlay, "$g_presentation_obj_item_select_7", "mesh_mp_inventory_slot_empty", "mesh_mp_inventory_slot_empty"),
(create_mesh_overlay_with_item_id, reg0, ":selected_item_id"),
(position_set_x, pos1, 53),
(position_set_y, pos1, 376),
(overlay_set_position, reg0, pos1),
(assign, "$g_inside_obj_7", reg0),
(else_try),
(create_image_button_overlay, "$g_presentation_obj_item_select_7", "mesh_mp_inventory_slot_boot", "mesh_mp_inventory_slot_boot"),
(try_end),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, "$g_presentation_obj_item_select_7", pos1),
(position_set_x, pos1, 2),
(position_set_y, pos1, 325),
(overlay_set_position, "$g_presentation_obj_item_select_7", pos1),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 7),
(player_get_slot, ":selected_item_id", ":my_player_no", ":selected_item_index"),
(try_begin),
(ge, ":selected_item_id", 0),
(create_image_button_overlay, "$g_presentation_obj_item_select_8", "mesh_mp_inventory_slot_empty", "mesh_mp_inventory_slot_empty"),
(create_mesh_overlay_with_item_id, reg0, ":selected_item_id"),
(position_set_x, pos1, 53),
(position_set_y, pos1, 276),
(overlay_set_position, reg0, pos1),
(assign, "$g_inside_obj_8", reg0),
(else_try),
(create_image_button_overlay, "$g_presentation_obj_item_select_8", "mesh_mp_inventory_slot_glove", "mesh_mp_inventory_slot_glove"),
(try_end),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, "$g_presentation_obj_item_select_8", pos1),
(position_set_x, pos1, 2),
(position_set_y, pos1, 225),
(overlay_set_position, "$g_presentation_obj_item_select_8", pos1),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 8),
(player_get_slot, ":selected_item_id", ":my_player_no", ":selected_item_index"),
(try_begin),
(ge, ":selected_item_id", 0),
(eq, "$g_horses_are_avaliable", 1),
(create_image_button_overlay, "$g_presentation_obj_item_select_9", "mesh_mp_inventory_slot_empty", "mesh_mp_inventory_slot_empty"),
(create_mesh_overlay_with_item_id, reg0, ":selected_item_id"),
(position_set_x, pos1, 53),
(position_set_y, pos1, 176),
(overlay_set_position, reg0, pos1),
(assign, "$g_inside_obj_9", reg0),
(else_try),
(create_image_button_overlay, "$g_presentation_obj_item_select_9", "mesh_mp_inventory_slot_horse", "mesh_mp_inventory_slot_horse"),
(try_end),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, "$g_presentation_obj_item_select_9", pos1),
(position_set_x, pos1, 2),
(position_set_y, pos1, 125),
(overlay_set_position, "$g_presentation_obj_item_select_9", pos1),
(create_mesh_overlay, reg0, "mesh_mp_inventory_left"),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, reg0, pos1),
(position_set_x, pos1, 0),
(position_set_y, pos1, 14),
(overlay_set_position, reg0, pos1),
(create_mesh_overlay, reg0, "mesh_mp_inventory_right"),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, reg0, pos1),
(position_set_x, pos1, 894),
(position_set_y, pos1, 65),
(overlay_set_position, reg0, pos1),
(create_in_game_button_overlay, "$g_presentation_obj_item_select_10", "str_reset_to_default", 0),
(overlay_set_color, "$g_presentation_obj_item_select_10", 0xFFFFFF),
(position_set_x, pos1, 605),
(position_set_y, pos1, 25),
(overlay_set_position, "$g_presentation_obj_item_select_10", pos1),
(create_in_game_button_overlay, "$g_presentation_obj_item_select_11", "str_done", 0),
(overlay_set_color, "$g_presentation_obj_item_select_11", 0xFFFFFF),
(position_set_x, pos1, 395),
(position_set_y, pos1, 25),
(overlay_set_position, "$g_presentation_obj_item_select_11", pos1),
(assign, ":cur_y", 725),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_team_no, ":my_team_no", ":my_player_no"),
(assign, ":has_bots", 0),
(try_begin),
(eq, ":my_team_no", 0),
(try_begin),
(gt, "$g_multiplayer_num_bots_team_1", 0),
(assign, ":has_bots", 1),
(try_end),
(else_try),
(try_begin),
(gt, "$g_multiplayer_num_bots_team_2", 0),
(assign, ":has_bots", 1),
(try_end),
(try_end),
(team_get_faction, ":my_faction_no", ":my_team_no"),
(try_begin),
(eq, ":has_bots", 1),
(assign, ":num_lines", 0),
(try_for_range, ":ai_troop_no", multiplayer_ai_troops_begin, multiplayer_ai_troops_end),
(store_troop_faction, ":ai_troop_faction", ":ai_troop_no"),
(eq, ":ai_troop_faction", ":my_faction_no"),
(val_add, ":num_lines", 1),
(try_end),
(store_mul, ":board_height", ":num_lines", 20),
(val_add, ":board_height", 40),
(create_mesh_overlay, reg0, "mesh_mp_ui_command_border_r"),
(position_set_x, pos1, 280),
(position_set_y, pos1, 680),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 2500),
(position_set_y, pos1, 2500),
(overlay_set_size, reg0, pos1),
(create_mesh_overlay, reg0, "mesh_mp_ui_command_border_l"),
(position_set_x, pos1, 650),
(position_set_y, pos1, 680),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 2500),
(position_set_y, pos1, 2500),
(overlay_set_size, reg0, pos1),
(create_mesh_overlay, reg0, "mesh_mp_ui_command_panel"),
(position_set_x, pos1, 350),
(store_sub, ":board_pos_y", 750, ":board_height"),
(position_set_y, pos1, ":board_pos_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 3000),
(position_set_y, pos1, 3000),
(overlay_set_size, reg0, pos1),
(create_text_overlay, reg0, "str_command", 0),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, reg0, pos1),
(position_set_x, pos1, 370),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", 20),
(assign, ":cur_ai_troop_index", 0),
(try_for_range, ":ai_troop_no", multiplayer_ai_troops_begin, multiplayer_ai_troops_end),
(store_troop_faction, ":ai_troop_faction", ":ai_troop_no"),
(eq, ":ai_troop_faction", ":my_faction_no"),
(create_check_box_overlay, reg0, "mesh_checkbox_off", "mesh_checkbox_on"),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, reg0, pos1),
(position_set_x, pos1, 377),
(store_add, ":special_cur_y", ":cur_y", 2),
(position_set_y, pos1, ":special_cur_y"),
(overlay_set_position, reg0, pos1),
(try_begin),
(eq, ":cur_ai_troop_index", 0),
(overlay_set_val, reg0, "$g_multiplayer_bot_type_1_wanted"),
(assign, "$g_presentation_obj_item_select_13", reg0),
(else_try),
(eq, ":cur_ai_troop_index", 1),
(overlay_set_val, reg0, "$g_multiplayer_bot_type_2_wanted"),
(assign, "$g_presentation_obj_item_select_14", reg0),
(else_try),
(eq, ":cur_ai_troop_index", 2),
(overlay_set_val, reg0, "$g_multiplayer_bot_type_3_wanted"),
(assign, "$g_presentation_obj_item_select_15", reg0),
(else_try),
(overlay_set_val, reg0, "$g_multiplayer_bot_type_4_wanted"),
(assign, "$g_presentation_obj_item_select_16", reg0),
(try_end),
(str_store_troop_name, s0, ":ai_troop_no"),
(create_text_overlay, reg0, "str_s0", 0),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, 800),
(position_set_y, pos1, 800),
(overlay_set_size, reg0, pos1),
(position_set_x, pos1, 397),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", 20),
(val_add, ":cur_ai_troop_index", 1),
(try_end),
(val_sub, ":cur_y", 20),
(try_end),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_gold, ":player_gold", ":my_player_no"),
(call_script, "script_multiplayer_calculate_cur_selected_items_cost", ":my_player_no", 1),
(create_text_overlay, "$g_presentation_obj_item_select_12", "str_total_item_cost_reg0", tf_left_align|tf_single_line|tf_with_outline),
(try_begin),
(ge, ":player_gold", reg0),
(overlay_set_color, "$g_presentation_obj_item_select_12", 0xFFFFFF),
(else_try),
(overlay_set_color, "$g_presentation_obj_item_select_12", 0xFF0000),
(try_end),
(position_set_x, pos1, 680),
(position_set_y, pos1, 652),
(overlay_set_position, "$g_presentation_obj_item_select_12", pos1),
(store_add, "$g_presentation_obj_item_select_next", "$g_presentation_obj_item_select_12", 1), # overlays created past this id are the selectable item buttons (two overlays per item)
(presentation_set_duration, 999999),
]),
(ti_on_presentation_mouse_enter_leave,
[(store_trigger_param_1, ":object"),
(store_trigger_param_2, ":enter_leave"),
(try_begin),
(eq, "$g_close_equipment_selection", 0),
(try_begin),
(eq, ":enter_leave", 0),
(assign, ":item_no", -1),
(try_begin),
(ge, ":object", "$g_presentation_obj_item_select_next"),
(store_sub, ":tested_object", ":object", "$g_presentation_obj_item_select_next"),
(store_mod, ":mod_value", ":tested_object", 2), # each item uses two overlays: button (even offset) followed by its item mesh (odd)
(store_sub, ":mod_value", 1, ":mod_value"), # step from the button overlay to its paired item-mesh overlay
(val_div, ":tested_object", 2), # two overlays per item: recover the item button index
(store_add, ":cur_slot", multi_data_item_button_indices_begin, ":tested_object"),
(troop_get_slot, ":item_no", "trp_multiplayer_data", ":cur_slot"),
(assign, ":target_obj", ":object"),
(val_add, ":target_obj", ":mod_value"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_1"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 1),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_1"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_2"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 2),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_2"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_3"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 3),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_3"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_4"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 4),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_4"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_5"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 5),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_5"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_6"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 6),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_6"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_7"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 7),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_7"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_8"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 8),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_8"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_9"),
(eq, "$g_horses_are_avaliable", 1),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 9),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_9"),
(try_end),
(try_begin),
(ge, ":item_no", 0),
(overlay_get_position, pos0, ":target_obj"),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_troop_id, ":my_player_troop_no", ":my_player_no"),
(try_begin),
(call_script, "script_cf_multiplayer_is_item_default_for_troop", ":item_no", ":my_player_troop_no"),
(show_item_details, ":item_no", pos0, 0),
(else_try),
(store_troop_faction, ":my_player_faction_no", ":my_player_troop_no"),
(store_sub, ":faction_slot", ":my_player_faction_no", multiplayer_factions_begin),
(val_add, ":faction_slot", slot_item_multiplayer_faction_price_multipliers_begin),
(item_get_slot, ":price_multiplier", ":item_no", ":faction_slot"),
(show_item_details, ":item_no", pos0, ":price_multiplier"),
(try_end),
(assign, "$g_current_opened_item_details", ":item_no"),
(try_end),
(else_try),
(assign, ":item_no", -1),
(try_begin),
(ge, ":object", "$g_presentation_obj_item_select_next"),
(store_sub, ":tested_object", ":object", "$g_presentation_obj_item_select_next"),
(val_div, ":tested_object", 2), # two overlays per item button
(store_add, ":cur_slot", multi_data_item_button_indices_begin, ":tested_object"),
(troop_get_slot, ":item_no", "trp_multiplayer_data", ":cur_slot"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_1"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 1),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_1"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_2"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 2),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_2"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_3"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 3),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_3"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_4"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 4),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_4"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_5"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 5),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_5"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_6"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 6),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_6"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_7"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 7),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_7"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_8"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 8),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_8"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_9"),
(eq, "$g_horses_are_avaliable", 1),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, 9),
(val_sub, ":player_slot_index", 1),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_slot, ":item_no", ":my_player_no", ":player_slot_index"),
(assign, ":target_obj", "$g_inside_obj_9"),
(try_end),
(try_begin),
(eq, "$g_current_opened_item_details", ":item_no"),
(close_item_details),
(assign, "$g_current_opened_item_details", -1),
(try_end),
(try_end),
(else_try),
(assign, "$g_close_equipment_selection", 0),
(presentation_set_duration, 0),
(try_end),
]),
(ti_on_presentation_event_state_change,
[(store_trigger_param_1, ":object"),
(store_trigger_param_2, ":value"),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_troop_id, ":my_troop_no", ":my_player_no"),
(try_begin),
(eq, "$g_close_equipment_selection", 0),
(try_begin),
(eq, "$g_presentation_state", 0),
(try_begin),
(eq, ":object", "$g_presentation_obj_item_select_1"),
(assign, "$g_presentation_state", 1),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_2"),
(assign, "$g_presentation_state", 2),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_3"),
(assign, "$g_presentation_state", 3),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_4"),
(assign, "$g_presentation_state", 4),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_5"),
(assign, "$g_presentation_state", 5),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_6"),
(assign, "$g_presentation_state", 6),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_7"),
(assign, "$g_presentation_state", 7),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_8"),
(assign, "$g_presentation_state", 8),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_9"),
(eq, "$g_horses_are_avaliable", 1),
(assign, "$g_presentation_state", 9),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(try_end),
(else_try),
(gt, "$g_presentation_state", 0),
(store_sub, ":tested_object", ":object", "$g_presentation_obj_item_select_next"),
(val_div, ":tested_object", 2), # two overlays per item button
(assign, ":end_cond", multi_data_item_button_indices_end),
(try_for_range, ":cur_slot", multi_data_item_button_indices_begin, ":end_cond"),
(neg|troop_slot_eq, "trp_multiplayer_data", ":cur_slot", -1),
(store_sub, ":button_id", ":cur_slot", multi_data_item_button_indices_begin),
(eq, ":tested_object", ":button_id"),
(troop_get_slot, ":item_no", "trp_multiplayer_data", ":cur_slot"),
(store_add, ":player_slot_index", slot_player_selected_item_indices_begin, "$g_presentation_state"),
(val_sub, ":player_slot_index", 1),
(player_set_slot, ":my_player_no", ":player_slot_index", ":item_no"),
(player_get_gold, ":player_gold", ":my_player_no"),
(call_script, "script_multiplayer_calculate_cur_selected_items_cost", ":my_player_no", 1),
(overlay_set_text, "$g_presentation_obj_item_select_12", "str_total_item_cost_reg0"),
(try_begin),
(ge, ":player_gold", reg0),
(overlay_set_color, "$g_presentation_obj_item_select_12", 0xFFFFFF),
(else_try),
(overlay_set_color, "$g_presentation_obj_item_select_12", 0xFF0000),
(try_end),
(assign, ":end_cond", 0), #break
(try_end),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(try_end),
(try_begin),
(eq, ":object", "$g_presentation_obj_item_select_10"),
(call_script, "script_multiplayer_set_default_item_selections_for_troop", ":my_troop_no"),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_11"),
(call_script, "script_multiplayer_send_item_selections"),
(presentation_set_duration, 0),
(try_begin),
(try_begin),
(assign, "$g_show_no_more_respawns_remained", 0),
(try_end),
(eq, "$g_show_no_more_respawns_remained", 1), # never true after the unconditional reset above; the respawn counter below is effectively disabled
(store_mission_timer_a, "$g_multiplayer_respawn_start_time"),
(start_presentation, "prsnt_multiplayer_respawn_time_counter"),
(try_end),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_13"),
(assign, "$g_multiplayer_bot_type_1_wanted", ":value"),
(multiplayer_send_2_int_to_server, multiplayer_event_set_bot_selection, slot_player_bot_type_1_wanted, ":value"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_14"),
(assign, "$g_multiplayer_bot_type_2_wanted", ":value"),
(multiplayer_send_2_int_to_server, multiplayer_event_set_bot_selection, slot_player_bot_type_2_wanted, ":value"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_15"),
(assign, "$g_multiplayer_bot_type_3_wanted", ":value"),
(multiplayer_send_2_int_to_server, multiplayer_event_set_bot_selection, slot_player_bot_type_3_wanted, ":value"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_16"),
(assign, "$g_multiplayer_bot_type_4_wanted", ":value"),
(multiplayer_send_2_int_to_server, multiplayer_event_set_bot_selection, slot_player_bot_type_4_wanted, ":value"),
(try_end),
(else_try),
(assign, "$g_close_equipment_selection", 0),
(presentation_set_duration, 0),
(try_end),
]),
(ti_on_presentation_mouse_press,
[(store_trigger_param_1, ":object"),
(store_trigger_param_2, ":mouse_state"),
(try_begin),
(eq, "$g_close_equipment_selection", 0),
(try_begin),
(eq, ":mouse_state", 1), #right click (clears the item slot)
(try_begin),
(eq, "$g_presentation_state", 0),
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(eq, ":object", "$g_presentation_obj_item_select_1"),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 0),
(player_set_slot, ":my_player_no", ":selected_item_index", -1),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_2"),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 1),
(player_set_slot, ":my_player_no", ":selected_item_index", -1),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_3"),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 2),
(player_set_slot, ":my_player_no", ":selected_item_index", -1),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_4"),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 3),
(player_set_slot, ":my_player_no", ":selected_item_index", -1),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_5"),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 4),
(player_set_slot, ":my_player_no", ":selected_item_index", -1),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_6"),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 5),
(player_set_slot, ":my_player_no", ":selected_item_index", -1),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_7"),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 6),
(player_set_slot, ":my_player_no", ":selected_item_index", -1),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_8"),
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 7),
(player_set_slot, ":my_player_no", ":selected_item_index", -1),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(else_try),
(eq, ":object", "$g_presentation_obj_item_select_9"),
(eq, "$g_horses_are_avaliable", 1), # "avaliable" [sic]: the global is spelled this way throughout the module system
(store_add, ":selected_item_index", slot_player_selected_item_indices_begin, 8),
(player_set_slot, ":my_player_no", ":selected_item_index", -1),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(try_end),
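# Right-click mapping: item-select overlay N corresponds to player slot
# slot_player_selected_item_indices_begin + (N - 1); writing -1 clears that
# equipment choice, and the presentation is restarted to redraw the screen.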
(else_try),
(gt, "$g_presentation_state", 0),
(presentation_set_duration, 0),
(assign, "$g_presentation_state", 0),
(start_presentation, "prsnt_multiplayer_item_select"),
(try_end),
(try_end),
(else_try),
(assign, "$g_close_equipment_selection", 0),
(presentation_set_duration, 0),
(try_end),
]),
(ti_on_presentation_run,
[(store_trigger_param_1, ":cur_time"),
## disabled: this block sometimes causes a runtime error
## (multiplayer_get_my_player, ":my_player_no"),
## (player_get_gold, ":player_gold", ":my_player_no"),
## (call_script, "script_multiplayer_calculate_cur_selected_items_cost", ":my_player_no", 1),
## (try_begin),
## (ge, ":player_gold", reg0),
## (overlay_set_color, "$g_presentation_obj_item_select_12", 0xFFFFFF),
## (else_try),
## (overlay_set_color, "$g_presentation_obj_item_select_12", 0xFF0000),
## (try_end),
(try_begin),
(eq, "$g_close_equipment_selection", 0),
(try_begin),
(this_or_next|key_clicked, key_escape),
(key_clicked, key_xbox_start),
(try_begin),
(neq, "$g_current_opened_item_details", -1),
(close_item_details),
(assign, "$g_current_opened_item_details", -1),
(try_end),
(gt, ":cur_time", 200),
(presentation_set_duration, 0),
(try_end),
(else_try),
(assign, "$g_close_equipment_selection", 0),
# TODO: later, if the player is in siege mode and cannot spawn because they just changed teams, add a start_presentation, spawn_counter line here.
(presentation_set_duration, 0),
(try_end),
]),
]),
("multiplayer_message_1", prsntf_read_only|prsntf_manual_end_only, 0, [
(ti_on_presentation_load, [
(set_fixed_point_multiplier, 1000),
(try_begin),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_round_result_in_battle_mode),
(assign, ":winner_agent_team", "$g_multiplayer_message_value_1"),
(try_begin),
(eq, ":winner_agent_team", -1),
(assign, ":text_font_color", 0xFFFFFFFF),
(str_store_string, s0, "str_round_draw_no_one_remained"),
(else_try),
(try_begin), # spectator initialization: assume spectators side with team 0, so the message is colored as if the viewer were on team 0.
(eq, ":winner_agent_team", 0),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end), #initializing ends
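# Color convention (inferred from usage in this file): 0xFF33DD11 is the
# friendly green, 0xFFFF4422 the hostile red, and 0xFFFFFFFF neutral white,
# all in ARGB.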
(try_begin),
(lt, "$my_team_at_start_of_round", 2),
(try_begin),
(eq, "$my_team_at_start_of_round", ":winner_agent_team"),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end),
(try_end),
(team_get_faction, ":faction_of_winner_team", ":winner_agent_team"),
(str_store_faction_name, s1, ":faction_of_winner_team"),
(str_store_string, s0, "str_s1_won_round"),
(try_end),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 375/400 x positions
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 300),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_capture_the_flag_score),
(agent_get_team, ":winner_agent_team", "$g_multiplayer_message_value_1"), # value_1 holds the scoring agent's id; that agent's team is the winner team.
(team_get_faction, ":winner_agent_faction", ":winner_agent_team"),
(str_store_faction_name, s1, ":winner_agent_faction"),
(try_begin), # spectator initialization: assume spectators side with team 0, so the message is colored as if the viewer were on team 0.
(eq, ":winner_agent_team", 0),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end), #initializing ends
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_agent_id, ":my_player_agent", ":my_player_no"),
(try_begin),
(ge, ":my_player_agent", 0),
(agent_get_team, ":my_player_team", ":my_player_agent"),
(try_begin),
(eq, ":my_player_team", ":winner_agent_team"),
(assign, ":text_font_color", 0xFF33DD11),
(play_sound, "snd_team_scored_a_point"),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(play_sound, "snd_enemy_scored_a_point"),
(try_end),
(try_end),
(try_end),
(str_store_string, s0, "str_s1_captured_flag"),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 350 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_flag_returned_home),
(try_begin),
(ge, "$g_multiplayer_message_value_1", 0),
(agent_get_team, ":returned_flag_agent_team", "$g_multiplayer_message_value_1"),
(team_get_faction, ":returned_flag_agent_faction", ":returned_flag_agent_team"),
(str_store_faction_name, s1, ":returned_flag_agent_faction"),
(str_store_string, s0, "str_s1_returned_flag"),
(else_try),
(val_add, "$g_multiplayer_message_value_1", 1),
(val_mul, "$g_multiplayer_message_value_1", -1),
(assign, ":returned_flag_agent_team", "$g_multiplayer_message_value_1"),
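# Encoding sketch (inferred from the arithmetic above): an auto-returned flag
# arrives as value = -(team + 1), so -1 decodes to team 0 and -2 to team 1
# via (value + 1) * -1.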
(team_get_faction, ":returned_flag_agent_faction", ":returned_flag_agent_team"),
(str_store_faction_name, s1, ":returned_flag_agent_faction"),
(str_store_string, s0, "str_s1_auto_returned_flag"),
(try_end),
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_agent_id, ":my_player_agent", ":my_player_no"),
(try_begin),
(ge, ":my_player_agent", 0),
(play_sound, "snd_flag_returned"),
(try_end),
(try_end),
(assign, ":text_font_color", 0xFFFFFFFF),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 325 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_capture_the_flag_stole),
(agent_get_team, ":stolen_flag_agent_team", "$g_multiplayer_message_value_1"),
(team_get_faction, ":stolen_flag_agent_faction", ":stolen_flag_agent_team"),
(str_store_faction_name, s1, ":stolen_flag_agent_faction"),
(assign, ":text_font_color", 0xFFFFFFFF),
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_agent_id, ":my_player_agent", ":my_player_no"),
(try_begin),
(ge, ":my_player_agent", 0),
(agent_get_team, ":my_player_team", ":my_player_agent"),
(try_begin),
(eq, ":my_player_team", ":stolen_flag_agent_team"),
(play_sound, "snd_enemy_flag_taken"),
(else_try),
(play_sound, "snd_your_flag_taken"),
(try_end),
(try_end),
(try_end),
(str_store_string, s0, "str_s1_taken_flag"),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 365 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_flag_captured),
(store_div, ":winner_agent_team", "$g_multiplayer_message_value_1", 100),
(store_mod, reg0, "$g_multiplayer_message_value_1", 100),
(val_sub, ":winner_agent_team", 1),
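# Packing sketch (inferred from the div/mod/sub above): the server sends
# value = (winner_team + 1) * 100 + flag_count; e.g. 237 decodes to winner
# team 1 with reg0 = 37 captured flags.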
(try_begin), # spectator initialization: assume spectators side with team 0, so the message is colored as if the viewer were on team 0.
(eq, ":winner_agent_team", 0),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end), #initializing ends
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_agent_id, ":my_player_agent", ":my_player_no"),
(try_begin),
(ge, ":my_player_agent", 0),
(agent_get_team, ":my_player_team", ":my_player_agent"),
(try_begin),
(eq, ":my_player_team", ":winner_agent_team"),
(assign, ":text_font_color", 0xFF33DD11),
(play_sound, "snd_team_scored_a_point"),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(play_sound, "snd_enemy_scored_a_point"),
(try_end),
(try_end),
(try_end),
(team_get_faction, ":winner_agent_faction", ":winner_agent_team"),
(str_store_faction_name, s1, ":winner_agent_faction"),
(str_store_string, s0, "str_s1_captured_flag_reg0"),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 345 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_flag_is_pulling),
(store_div, ":winner_agent_team", "$g_multiplayer_message_value_1", 100),
(store_mod, reg0, "$g_multiplayer_message_value_1", 100),
(val_sub, ":winner_agent_team", 1),
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_agent_id, ":my_player_agent", ":my_player_no"),
(try_begin),
(ge, ":my_player_agent", 0),
(agent_get_team, ":my_player_team", ":my_player_agent"),
(try_begin),
(eq, ":my_player_team", ":winner_agent_team"),
(play_sound, "snd_enemy_flag_taken"),
(else_try),
(play_sound, "snd_your_flag_taken"),
(try_end),
(try_end),
(try_end),
(assign, ":text_font_color", 0xFFFFFFFF),
(team_get_faction, ":winner_agent_faction", ":winner_agent_team"),
(str_store_faction_name, s1, ":winner_agent_faction"),
(str_store_string, s0, "str_s1_pulling_flag_reg0"),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 345 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_flag_neutralized),
(store_div, ":winner_agent_team", "$g_multiplayer_message_value_1", 100),
(store_mod, reg0, "$g_multiplayer_message_value_1", 100),
(val_sub, ":winner_agent_team", 1),
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_agent_id, ":my_player_agent", ":my_player_no"),
(try_begin),
(ge, ":my_player_agent", 0),
(play_sound, "snd_flag_returned"),
(try_end),
(try_end),
(try_begin), # spectator initialization: assume spectators side with team 0, so the message is colored as if the viewer were on team 0.
(eq, ":winner_agent_team", 0),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end), #initializing ends
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_agent_id, ":my_player_agent", ":my_player_no"),
(try_begin),
(ge, ":my_player_agent", 0),
(agent_get_team, ":my_player_team", ":my_player_agent"),
(try_begin),
(eq, ":my_player_team", ":winner_agent_team"),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end),
(try_end),
(try_end),
(team_get_faction, ":winner_agent_faction", ":winner_agent_team"),
(str_store_faction_name, s1, ":winner_agent_faction"),
(str_store_string, s0, "str_s1_neutralized_flag_reg0"),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 345 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_round_result_in_siege_mode),
(assign, ":winner_agent_team", "$g_multiplayer_message_value_1"),
(try_begin), # spectator initialization: assume spectators side with team 0, so the message is colored as if the viewer were on team 0.
(eq, ":winner_agent_team", 0),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end), #initializing ends
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_agent_id, ":my_player_agent", ":my_player_no"),
(try_begin),
(ge, ":my_player_agent", 0),
(agent_get_team, ":my_player_team", ":my_player_agent"),
(try_begin),
(eq, ":my_player_team", ":winner_agent_team"),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end),
(try_end),
(try_end),
(try_begin),
(eq, "$g_multiplayer_message_value_1", 0),
(str_store_string, s0, "str_s1_defended_castle"),
(else_try),
(eq, "$g_multiplayer_message_value_1", 1),
(str_store_string, s0, "str_s1_captured_castle"),
(else_try),
(str_store_string, s0, "str_round_draw"),
(assign, ":text_font_color", 0xFFFFFFFF),
(try_end),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 325/400 x positions
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_round_draw),
(assign, ":text_font_color", 0xFFFFFFFF),
(str_store_string, s0, "str_round_draw"),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 375 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_start_death_mode),
(assign, ":text_font_color", 0xFFFFFFFF),
(str_store_string, s0, "str_death_mode_started"),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 350 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_target_destroyed),
(try_begin),
(lt, "$g_multiplayer_message_value_1", 0),
(val_mul, "$g_multiplayer_message_value_1", -1),
(assign, ":scene_prop_team", 0),
(team_get_faction, ":faction_of_winner_team", 1),
(str_store_faction_name, s1, ":faction_of_winner_team"),
(else_try),
(assign, ":scene_prop_team", 1),
(team_get_faction, ":faction_of_winner_team", 0),
(str_store_faction_name, s1, ":faction_of_winner_team"),
(try_end),
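# Sign encoding sketch (inferred from the branch above): a negative value
# means the destroyed target belonged to team 0, so the sign identifies the
# owning team and the absolute value identifies which target was destroyed.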
(try_begin), # spectator initialization: assume spectators side with team 0, so the message is colored as if the viewer were on team 0.
(eq, "$g_multiplayer_message_value_1", 1),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end), #initializing ends
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(try_begin),
(player_get_team_no, ":my_team_no", ":my_player_no"), # :my_player_no was already fetched above; the duplicate multiplayer_get_my_player call was removed
(neq, ":scene_prop_team", ":my_team_no"), # the destroyed prop belonged to the other team, so we won
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end),
(try_end),
(try_begin),
(eq, "$g_multiplayer_message_value_1", 9),
(str_store_string, s0, "str_s1_destroyed_all_targets"),
(else_try),
(eq, "$g_multiplayer_message_value_1", 1),
(str_store_string, s0, "str_s1_destroyed_catapult"),
(else_try),
(eq, "$g_multiplayer_message_value_1", 2),
(str_store_string, s0, "str_s1_destroyed_trebuchet"),
(try_end),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 350 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_defenders_saved_n_targets),
(try_begin), # spectator initialization: assume spectators side with team 0, so the message is colored as if the viewer were on team 0.
(eq, "$g_defender_team", 0),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end), #initializing ends
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_agent_id, ":my_player_agent", ":my_player_no"),
(try_begin),
(ge, ":my_player_agent", 0),
(agent_get_team, ":my_player_team", ":my_player_agent"),
(try_begin),
(eq, ":my_player_team", "$g_defender_team"),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end),
(try_end),
(try_end),
(assign, ":num_targets_saved", "$g_multiplayer_message_value_1"),
(team_get_faction, ":faction_of_winner_team", "$g_defender_team"),
(str_store_faction_name, s1, ":faction_of_winner_team"),
(try_begin),
(eq, ":num_targets_saved", 1),
(str_store_string, s0, "str_s1_saved_1_target"),
(else_try),
(eq, ":num_targets_saved", 2),
(str_store_string, s0, "str_s1_saved_2_targets"),
(try_end),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 350 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_attackers_won_the_round),
(assign, ":winner_agent_team", "$g_multiplayer_message_value_1"),
(try_begin), # spectator initialization: assume spectators side with team 0, so the message is colored as if the viewer were on team 0.
(eq, ":winner_agent_team", 0),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end), #initializing ends
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_agent_id, ":my_player_agent", ":my_player_no"),
(try_begin),
(ge, ":my_player_agent", 0),
(agent_get_team, ":my_player_team", ":my_player_agent"),
(try_begin),
(eq, ":my_player_team", ":winner_agent_team"),
(assign, ":text_font_color", 0xFF33DD11),
(else_try),
(assign, ":text_font_color", 0xFFFF4422),
(try_end),
(try_end),
(try_end),
(try_begin),
(eq, "$g_defender_team", 0),
(team_get_faction, ":faction_of_winner_team", 1),
(else_try),
(team_get_faction, ":faction_of_winner_team", 0),
(try_end),
(str_store_faction_name, s1, ":faction_of_winner_team"),
(str_store_string, s0, "str_s1_won_round"),
(create_text_overlay, "$g_multiplayer_message_1", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_1", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 350 x position
(position_set_y, pos1, 400),
(overlay_set_position, "$g_multiplayer_message_1", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_1", pos1),
(presentation_set_duration, 400),
(try_end),
]),
(ti_on_presentation_run,
[
]),
]),
("multiplayer_message_2", prsntf_read_only|prsntf_manual_end_only, 0, [
(ti_on_presentation_load, [
(set_fixed_point_multiplier, 1000),
(try_begin),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_auto_team_balance_done),
(assign, ":text_font_color", 0xFFFFFFFF),
(str_store_string, s0, "str_auto_team_balance_done"),
(create_text_overlay, "$g_multiplayer_message_2", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_2", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 375 x position
(position_set_y, pos1, 550),
(overlay_set_position, "$g_multiplayer_message_2", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_2", pos1),
(presentation_set_duration, 300),
(else_try),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_auto_team_balance_next),
(assign, ":text_font_color", 0xFFFFFFFF),
(try_begin),
# note: this first branch has no condition operation, so it always succeeds and
# the else_try below is unreachable; "str_auto_team_balance_in_20_seconds" is
# always shown. The dead x = 375 assignments (overwritten below) were removed.
(str_store_string, s0, "str_auto_team_balance_in_20_seconds"),
(else_try),
(str_store_string, s0, "str_auto_team_balance_next_round"),
(try_end),
(create_text_overlay, "$g_multiplayer_message_2", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_2", ":text_font_color"),
(position_set_y, pos1, 550),
(position_set_x, pos1, 500), #new
(overlay_set_position, "$g_multiplayer_message_2", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_2", pos1),
(presentation_set_duration, 300),
(try_end),
]),
(ti_on_presentation_run,
[
]),
]),
("multiplayer_message_3", prsntf_read_only|prsntf_manual_end_only, 0, [
(ti_on_presentation_load, [
(set_fixed_point_multiplier, 1000),
(try_begin),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_poll_result),
(assign, ":text_font_color", 0xFFFFFFFF),
(try_begin),
(eq, "$g_multiplayer_message_value_3", 1),
(str_store_string, s0, "str_poll_result_yes"),
(else_try),
(str_store_string, s0, "str_poll_result_no"),
(try_end),
(create_text_overlay, "$g_multiplayer_message_3", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_message_3", ":text_font_color"),
(position_set_x, pos1, 500), # overrides the old 380 x position
(position_set_y, pos1, 475),
(overlay_set_position, "$g_multiplayer_message_3", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_message_3", pos1),
(presentation_set_duration, 400),
(try_end),
]),
(ti_on_presentation_run,
[
]),
]),
("multiplayer_round_time_counter", prsntf_read_only|prsntf_manual_end_only, 0, [
(ti_on_presentation_load, [
(set_fixed_point_multiplier, 1000),
(assign, "$g_multiplayer_last_round_time_counter_value", -1),
(str_clear, s0),
(create_text_overlay, "$g_multiplayer_round_time_counter_overlay", s0, tf_left_align|tf_single_line|tf_with_outline),
(overlay_set_color, "$g_multiplayer_round_time_counter_overlay", 0xFFFFFF),
(position_set_x, pos1, 900),
(position_set_y, pos1, 690),
(overlay_set_position, "$g_multiplayer_round_time_counter_overlay", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_round_time_counter_overlay", pos1),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_run,
[(store_mission_timer_a, ":current_time"),
(store_sub, ":seconds_past_in_round", ":current_time", "$g_round_start_time"),
(store_sub, ":seconds_left_in_round", "$g_multiplayer_round_max_seconds", ":seconds_past_in_round"),
(val_max, ":seconds_left_in_round", 0),
(try_begin),
(neq, "$g_multiplayer_last_round_time_counter_value", ":seconds_left_in_round"),
(assign, "$g_multiplayer_last_round_time_counter_value", ":seconds_left_in_round"),
(store_div, reg0, ":seconds_left_in_round", 60),
(store_div, reg1, ":seconds_left_in_round", 10),
(val_mod, reg1, 6),
(assign, reg2, ":seconds_left_in_round"),
(val_mod, reg2, 10),
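# Digit split example: with 125 seconds left, reg0 = 125 / 60 = 2 (minutes),
# reg1 = (125 / 10) mod 6 = 0 (tens of seconds), reg2 = 125 mod 10 = 5,
# so str_reg0_dd_reg1reg2 renders "2:05".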
(str_store_string, s0, "str_reg0_dd_reg1reg2"),
(overlay_set_text, "$g_multiplayer_round_time_counter_overlay", s0),
(try_end),
]),
]),
("multiplayer_team_score_display", prsntf_read_only|prsntf_manual_end_only, 0, [
(ti_on_presentation_load, [
(set_fixed_point_multiplier, 1000),
(assign, "$g_multiplayer_team_1_last_displayed_score", -1),
(assign, "$g_multiplayer_team_2_last_displayed_score", -1),
(str_clear, s0),
(create_text_overlay, "$g_multiplayer_team_1_score_display_overlay", s0, tf_left_align|tf_single_line|tf_with_outline),
(overlay_set_color, "$g_multiplayer_team_1_score_display_overlay", 0xFFFFFF),
(position_set_x, pos1, 40),
(position_set_y, pos1, 700),
(overlay_set_position, "$g_multiplayer_team_1_score_display_overlay", pos1),
(position_set_x, pos1, 1500),
(position_set_y, pos1, 1500),
(overlay_set_size, "$g_multiplayer_team_1_score_display_overlay", pos1),
(create_text_overlay, "$g_multiplayer_team_2_score_display_overlay", s0, tf_left_align|tf_single_line|tf_with_outline),
(overlay_set_color, "$g_multiplayer_team_2_score_display_overlay", 0xFFFFFF),
(position_set_x, pos1, 40),
(position_set_y, pos1, 645),
(overlay_set_position, "$g_multiplayer_team_2_score_display_overlay", pos1),
(position_set_x, pos1, 1500),
(position_set_y, pos1, 1500),
(overlay_set_size, "$g_multiplayer_team_2_score_display_overlay", pos1),
# (try_begin),
# (eq, "$g_multiplayer_team_1_faction", "fac_kingdom_4"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_1"),
# (else_try),
# (eq, "$g_multiplayer_team_1_faction", "fac_kingdom_2"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_2"),
# (else_try),
# (eq, "$g_multiplayer_team_1_faction", "fac_kingdom_3"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_3"),
# (else_try),
# (eq, "$g_multiplayer_team_1_faction", "fac_kingdom_5"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_4"),
# (else_try),
# (eq, "$g_multiplayer_team_1_faction", "fac_kingdom_6"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_5"),
# (else_try),
# (eq, "$g_multiplayer_team_1_faction", "fac_kingdom_1"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_6"),
# (try_end),
# disabled along with the shield mesh block above: with create_mesh_overlay
# commented out, reg0 no longer holds a valid overlay id at this point
# (position_set_x, pos3, 25),
# (position_set_y, pos3, 715),
# (overlay_set_position, reg0, pos3),
# (position_set_x, pos1, 50),
# (position_set_y, pos1, 50),
# (overlay_set_size, reg0, pos1),
# (try_begin),
# (eq, "$g_multiplayer_team_1_faction", "$g_multiplayer_team_2_faction"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_7"),
# (else_try),
# (eq, "$g_multiplayer_team_2_faction", "fac_kingdom_4"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_1"),
# (else_try),
# (eq, "$g_multiplayer_team_2_faction", "fac_kingdom_2"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_2"),
# (else_try),
# (eq, "$g_multiplayer_team_2_faction", "fac_kingdom_3"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_3"),
# (else_try),
# (eq, "$g_multiplayer_team_2_faction", "fac_kingdom_5"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_4"),
# (else_try),
# (eq, "$g_multiplayer_team_2_faction", "fac_kingdom_6"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_5"),
# (else_try),
# (eq, "$g_multiplayer_team_2_faction", "fac_kingdom_1"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_6"),
# (try_end),
# disabled along with the shield mesh block above: with create_mesh_overlay
# commented out, reg0 no longer holds a valid overlay id at this point
# (position_set_x, pos3, 25),
# (position_set_y, pos3, 660),
# (overlay_set_position, reg0, pos3),
# (position_set_x, pos1, 50),
# (position_set_y, pos1, 50),
# (overlay_set_size, reg0, pos1),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_run, [
(team_get_score, ":team_1_score", 0),
(team_get_score, ":team_2_score", 1),
(try_begin),
(this_or_next|neq, ":team_1_score", "$g_multiplayer_team_1_last_displayed_score"),
(neq, ":team_2_score", "$g_multiplayer_team_2_last_displayed_score"),
(assign, "$g_multiplayer_team_1_last_displayed_score", ":team_1_score"),
(assign, "$g_multiplayer_team_2_last_displayed_score", ":team_2_score"),
(str_store_faction_name, s0, "$g_multiplayer_team_1_faction"),
(assign, reg0, ":team_1_score"),
(overlay_set_text, "$g_multiplayer_team_1_score_display_overlay", "str_reg0"),
(str_store_faction_name, s0, "$g_multiplayer_team_2_faction"),
(assign, reg0, ":team_2_score"),
(overlay_set_text, "$g_multiplayer_team_2_score_display_overlay", "str_reg0"),
## (str_store_faction_name, s0, "$g_multiplayer_team_1_faction"),
## (assign, reg0, ":team_1_score"),
## (overlay_set_text, "$g_multiplayer_team_1_score_display_overlay", "str_s0_dd_reg0"),
## (str_store_faction_name, s0, "$g_multiplayer_team_2_faction"),
## (assign, reg0, ":team_2_score"),
## (overlay_set_text, "$g_multiplayer_team_2_score_display_overlay", "str_s0_dd_reg0"),
(try_end),
]),
]),
("multiplayer_flag_projection_display", prsntf_read_only|prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[
(set_fixed_point_multiplier, 1000),
(store_sub, ":flag_mesh", "$g_multiplayer_team_1_faction", multiplayer_factions_begin),
(val_add, ":flag_mesh", multiplayer_flag_projections_begin),
(create_mesh_overlay, "$g_presentation_obj_flag_projection_display_1", ":flag_mesh"),
(val_sub, ":flag_mesh", multiplayer_flag_projections_begin),
(val_add, ":flag_mesh", multiplayer_flag_taken_projections_begin),
(create_mesh_overlay, "$g_presentation_obj_flag_projection_display_2", ":flag_mesh"),
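# Note: the mesh id above is derived by index arithmetic, which assumes the
# multiplayer flag projection meshes (and their "taken" variants) are declared
# in the same order as the multiplayer factions. E.g. if team 1's faction is
# the third multiplayer faction:
#   ":flag_mesh" = faction - multiplayer_factions_begin = 2
#   2 + multiplayer_flag_projections_begin       -> third projection mesh
#   2 + multiplayer_flag_taken_projections_begin -> third "taken" mesh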
(try_begin),
(neq, "$g_multiplayer_team_1_faction", "$g_multiplayer_team_2_faction"),
(store_sub, ":flag_mesh", "$g_multiplayer_team_2_faction", multiplayer_factions_begin),
(val_add, ":flag_mesh", multiplayer_flag_projections_begin),
(create_mesh_overlay, "$g_presentation_obj_flag_projection_display_3", ":flag_mesh"),
(val_sub, ":flag_mesh", multiplayer_flag_projections_begin),
(val_add, ":flag_mesh", multiplayer_flag_taken_projections_begin),
(create_mesh_overlay, "$g_presentation_obj_flag_projection_display_4", ":flag_mesh"),
(else_try),
(assign, ":flag_mesh", "mesh_flag_project_rb"),
(create_mesh_overlay, "$g_presentation_obj_flag_projection_display_3", ":flag_mesh"),
(assign, ":flag_mesh", "mesh_flag_project_rb_miss"),
(create_mesh_overlay, "$g_presentation_obj_flag_projection_display_4", ":flag_mesh"),
(try_end),
(position_set_x, pos1, 250),
(position_set_y, pos1, 250),
(overlay_set_size, "$g_presentation_obj_flag_projection_display_1", pos1),
(overlay_set_size, "$g_presentation_obj_flag_projection_display_2", pos1),
(overlay_set_size, "$g_presentation_obj_flag_projection_display_3", pos1),
(overlay_set_size, "$g_presentation_obj_flag_projection_display_4", pos1),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 0),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_2", 0),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_3", 0),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_4", 0),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_run,
[
(set_fixed_point_multiplier, 1000),
(scene_prop_get_instance, ":flag_red_id", "$team_1_flag_scene_prop", 0),
(team_get_slot, ":team_0_flag_situation", 0, slot_team_flag_situation),
(try_begin),
(neq, ":team_0_flag_situation", 1),
(prop_instance_get_position, pos1, ":flag_red_id"), #hold position of flag of team 1 (red flag) at pos1
(else_try),
(entry_point_get_position, pos1, multi_base_point_team_1), #moved from above to here after auto-set position
(try_end),
(position_move_z, pos1, 200, 1),
(scene_prop_get_instance, ":flag_blue_id", "$team_2_flag_scene_prop", 0),
(team_get_slot, ":team_1_flag_situation", 1, slot_team_flag_situation),
(try_begin),
(neq, ":team_1_flag_situation", 1),
(prop_instance_get_position, pos2, ":flag_blue_id"), #hold position of flag of team 2 (blue flag) at pos2
(else_try),
(entry_point_get_position, pos2, multi_base_point_team_2), #moved from above to here after auto-set position
(try_end),
(position_move_z, pos2, 200, 1),
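# Both flag positions are now projected into presentation screen space
# (fixed point, nominally 1000 x 750). The is_between windows below
# (-100..1100, -100..850) keep markers visible slightly past the screen
# edge and hide positions that project far off-screen.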
(position_get_screen_projection, pos3, pos1),
(position_get_x, ":x_pos", pos3),
(position_get_y, ":y_pos", pos3),
(position_set_y, pos3, ":y_pos"),
(try_begin),
(is_between, ":x_pos", -100, 1100),
(is_between, ":y_pos", -100, 850),
(multiplayer_get_my_player, ":my_player_number"),
(try_begin),
(ge, ":my_player_number", 0),
(player_get_team_no, ":my_player_team", ":my_player_number"),
(else_try),
(assign, ":my_player_team", multi_team_spectator),
(try_end),
(try_begin),
(neq, ":my_player_team", 1), #if I am at team 0 or spectator
(try_begin),
(neq, ":team_0_flag_situation", 1), #if our flag is not stolen
(overlay_set_position, "$g_presentation_obj_flag_projection_display_1", pos3),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 1),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_2", 0),
(else_try), #if our flag is stolen
(try_begin),
(eq, ":my_player_team", 0),
(assign, ":our_base_entry_id", multi_base_point_team_1),
(else_try),
(assign, ":our_base_entry_id", multi_base_point_team_2),
(try_end),
(entry_point_get_position, pos5, ":our_base_entry_id"), #moved from above to here after auto-set position
(position_get_screen_projection, pos3, pos5),
(overlay_set_position, "$g_presentation_obj_flag_projection_display_2", pos3),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_2", 1),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 0),
(try_end),
(else_try),
(try_begin),
(neq, ":team_0_flag_situation", 1),
(overlay_set_position, "$g_presentation_obj_flag_projection_display_1", pos3),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 1),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_2", 0),
(try_end),
(try_end),
(else_try),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 0),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_2", 0),
(try_end),
(position_get_screen_projection, pos3, pos2),
(position_get_x, ":x_pos", pos3),
(position_get_y, ":y_pos", pos3),
(position_set_y, pos3, ":y_pos"),
(try_begin),
(is_between, ":x_pos", -100, 1100),
(is_between, ":y_pos", -100, 850),
(team_get_slot, ":team_1_flag_situation", 1, slot_team_flag_situation),
(multiplayer_get_my_player, ":my_player_number"),
(try_begin),
(ge, ":my_player_number", 0),
(player_get_team_no, ":my_player_team", ":my_player_number"),
(else_try),
(assign, ":my_player_team", multi_team_spectator),
(try_end),
(try_begin),
(neq, ":my_player_team", 0), #if I am at team 1 or spectator
(try_begin),
(neq, ":team_1_flag_situation", 1), #if our flag is not stolen
(overlay_set_position, "$g_presentation_obj_flag_projection_display_3", pos3),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_3", 1),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_4", 0),
(else_try), #if our flag is stolen
(try_begin),
(eq, ":my_player_team", 0),
(assign, ":our_base_entry_id", multi_base_point_team_1),
(else_try),
(assign, ":our_base_entry_id", multi_base_point_team_2),
(try_end),
(entry_point_get_position, pos5, ":our_base_entry_id"), #moved from above to here after auto-set position
(position_get_screen_projection, pos3, pos5),
(overlay_set_position, "$g_presentation_obj_flag_projection_display_4", pos3),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_4", 1),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_3", 0),
(try_end),
(else_try),
(try_begin),
(neq, ":team_1_flag_situation", 1),
(overlay_set_position, "$g_presentation_obj_flag_projection_display_3", pos3),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_3", 1),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_4", 0),
(try_end),
(try_end),
(else_try),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_3", 0),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_4", 0),
(try_end),
]),
]),
("multiplayer_flag_projection_display_bt", prsntf_read_only|prsntf_manual_end_only, 0, [ #this is for battle mode flags.
(ti_on_presentation_load, [
(set_fixed_point_multiplier, 1000),
(store_sub, ":flag_mesh", "$g_multiplayer_team_1_faction", multiplayer_factions_begin),
(val_add, ":flag_mesh", multiplayer_flag_projections_begin),
(create_mesh_overlay, "$g_presentation_obj_flag_projection_display_1", ":flag_mesh"),
(try_begin),
(neq, "$g_multiplayer_team_1_faction", "$g_multiplayer_team_2_faction"),
(store_sub, ":flag_mesh", "$g_multiplayer_team_2_faction", multiplayer_factions_begin),
(val_add, ":flag_mesh", multiplayer_flag_projections_begin),
(else_try),
(assign, ":flag_mesh", "mesh_flag_project_rb"),
(try_end),
(create_mesh_overlay, "$g_presentation_obj_flag_projection_display_3", ":flag_mesh"),
(position_set_x, pos1, 250),
(position_set_y, pos1, 250),
(overlay_set_size, "$g_presentation_obj_flag_projection_display_1", pos1),
(overlay_set_size, "$g_presentation_obj_flag_projection_display_3", pos1),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 0),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_3", 0),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_run, [
(try_begin),
(eq, "$g_round_ended", 0),
(set_fixed_point_multiplier, 1000),
(scene_prop_get_instance, ":flag_1_id", "$team_1_flag_scene_prop", 0),
(prop_instance_get_position, pos1, ":flag_1_id"), #hold position of flag of team 1 at pos1
(position_move_z, pos1, 250, 1),
(scene_prop_get_instance, ":flag_2_id", "$team_2_flag_scene_prop", 0),
(prop_instance_get_position, pos2, ":flag_2_id"), #hold position of flag of team 2 at pos2
(position_move_z, pos2, 250, 1),
(position_get_screen_projection, pos3, pos1),
(position_get_x, ":x_pos", pos3),
(position_get_y, ":y_pos", pos3),
(position_set_y, pos3, ":y_pos"),
(try_begin),
(is_between, ":x_pos", -100, 1100),
(is_between, ":y_pos", -100, 850),
(overlay_set_position, "$g_presentation_obj_flag_projection_display_1", pos3),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 1),
(else_try),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 0),
(try_end),
(position_get_screen_projection, pos3, pos2),
(position_get_x, ":x_pos", pos3),
(position_get_y, ":y_pos", pos3),
(position_set_y, pos3, ":y_pos"),
(try_begin),
(is_between, ":x_pos", -100, 1100),
(is_between, ":y_pos", -100, 850),
(overlay_set_position, "$g_presentation_obj_flag_projection_display_3", pos3),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_3", 1),
(else_try),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_3", 0),
(try_end),
(else_try),
(presentation_set_duration, 0),
(try_end),
]),
]),
("multiplayer_destructible_targets_display", prsntf_read_only|prsntf_manual_end_only, 0, [ #this is for search and destroy death mode flags.
(ti_on_presentation_load, [
(set_fixed_point_multiplier, 1000),
(try_begin),
(eq, "$g_defender_team", 0),
(store_sub, ":flag_mesh", "$g_multiplayer_team_1_faction", multiplayer_factions_begin),
(else_try),
(store_sub, ":flag_mesh", "$g_multiplayer_team_2_faction", multiplayer_factions_begin),
(try_end),
(val_add, ":flag_mesh", multiplayer_flag_projections_begin),
(create_mesh_overlay, "$g_presentation_obj_flag_projection_display_1", ":flag_mesh"),
(create_mesh_overlay, "$g_presentation_obj_flag_projection_display_2", ":flag_mesh"),
(position_set_x, pos1, 250),
(position_set_y, pos1, 250),
(overlay_set_size, "$g_presentation_obj_flag_projection_display_1", pos1),
(overlay_set_size, "$g_presentation_obj_flag_projection_display_2", pos1),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 0),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_2", 0),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_run, [
(try_begin),
(eq, "$g_round_ended", 0),
(set_fixed_point_multiplier, 1000),
(scene_prop_get_instance, ":target_1_id", "$g_destructible_target_1", 0),
(prop_instance_get_position, pos1, ":target_1_id"),
(position_move_z, pos1, 250, 1),
(scene_prop_get_instance, ":target_2_id", "$g_destructible_target_2", 0),
(prop_instance_get_position, pos2, ":target_2_id"),
(position_move_z, pos2, 250, 1),
(position_get_screen_projection, pos3, pos1),
(position_get_x, ":x_pos", pos3),
(position_get_y, ":y_pos", pos3),
(position_set_y, pos3, ":y_pos"),
(try_begin),
(is_between, ":x_pos", -100, 1100),
(is_between, ":y_pos", -100, 850),
(prop_instance_get_starting_position, pos0, ":target_1_id"),
(prop_instance_get_position, pos1, ":target_1_id"),
(get_sq_distance_between_positions_in_meters, ":dist", pos0, pos1),
(le, ":dist", 2), #squared distance in meters; hide the marker once the target has moved more than ~1.4m from its starting position (a threshold of 0 or 1 would also work).
(overlay_set_position, "$g_presentation_obj_flag_projection_display_1", pos3),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 1),
(else_try),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_1", 0),
(try_end),
(position_get_screen_projection, pos3, pos2),
(position_get_x, ":x_pos", pos3),
(position_get_y, ":y_pos", pos3),
(position_set_y, pos3, ":y_pos"),
(try_begin),
(is_between, ":x_pos", -100, 1100),
(is_between, ":y_pos", -100, 850),
(prop_instance_get_starting_position, pos0, ":target_2_id"),
(prop_instance_get_position, pos1, ":target_2_id"),
(get_sq_distance_between_positions_in_meters, ":dist", pos0, pos1),
(le, ":dist", 2), #squared distance in meters; hide the marker once the target has moved more than ~1.4m from its starting position (a threshold of 0 or 1 would also work).
(overlay_set_position, "$g_presentation_obj_flag_projection_display_2", pos3),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_2", 1),
(else_try),
(overlay_set_display, "$g_presentation_obj_flag_projection_display_2", 0),
(try_end),
(else_try),
(presentation_set_duration, 0),
(try_end),
]),
]),
("multiplayer_respawn_time_counter", prsntf_read_only|prsntf_manual_end_only, 0, [
(ti_on_presentation_load, [
(set_fixed_point_multiplier, 1000),
(assign, "$g_multiplayer_respawn_counter_overlay", -1),
(assign, "$g_multiplayer_respawn_remained_overlay", -1),
(assign, ":do_not_show_respawn_counter", 0),
(try_begin),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_round_result_in_siege_mode),
(this_or_next|eq, "$g_round_ended", 1),
(eq, "$g_flag_is_not_ready", 1),
(assign, ":do_not_show_respawn_counter", 1),
(try_end),
(eq, ":do_not_show_respawn_counter", 0),
(assign, "$g_multiplayer_last_respawn_counter_value", -1),
(str_clear, s0),
(create_text_overlay, "$g_multiplayer_respawn_counter_overlay", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_respawn_counter_overlay", 0xFFFFFF),
(position_set_x, pos1, 500),
(position_set_y, pos1, 600),
(overlay_set_position, "$g_multiplayer_respawn_counter_overlay", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_respawn_counter_overlay", pos1),
(str_clear, s0),
(create_text_overlay, "$g_multiplayer_respawn_remained_overlay", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_respawn_remained_overlay", 0xFFFFFF),
(position_set_x, pos1, 500),
(position_set_y, pos1, 570),
(overlay_set_position, "$g_multiplayer_respawn_remained_overlay", pos1),
(position_set_x, pos1, 1400),
(position_set_y, pos1, 1400),
(overlay_set_size, "$g_multiplayer_respawn_remained_overlay", pos1),
#(store_mission_timer_a, "$g_multiplayer_respawn_start_time"),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_run, [
(ge, "$g_multiplayer_respawn_counter_overlay", 0),
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_team_no, ":player_team", ":my_player_no"),
(try_begin),
(eq, ":player_team", multi_team_spectator),
(presentation_set_duration, 0),
(else_try),
(store_mission_timer_a, ":current_time"),
(store_sub, ":seconds_past_in_respawn", ":current_time", "$g_multiplayer_respawn_start_time"),
(try_begin),
(eq, "$g_show_no_more_respawns_remained", 0),
(assign, ":total_respawn_time", "$g_multiplayer_respawn_period"),
(else_try),
(assign, ":total_respawn_time", 6),
(try_end),
(store_sub, ":seconds_left_in_respawn", ":total_respawn_time", ":seconds_past_in_respawn"),
(try_begin),
(le, ":seconds_left_in_respawn", 0),
(presentation_set_duration, 0),
(else_try),
(neq, "$g_multiplayer_last_respawn_counter_value", ":seconds_left_in_respawn"),
(assign, "$g_multiplayer_last_respawn_counter_value", ":seconds_left_in_respawn"),
(try_begin),
(eq, "$g_show_no_more_respawns_remained", 0),
(assign, reg0, ":seconds_left_in_respawn"),
(str_store_string, s0, "str_respawning_in_reg0_seconds"),
(try_begin),
(gt, "$g_multiplayer_number_of_respawn_count", 0),
(store_sub, reg0, "$g_multiplayer_number_of_respawn_count", "$g_my_spawn_count"),
(multiplayer_get_my_player, ":my_player_no"),
(player_get_team_no, ":my_player_team", ":my_player_no"),
(eq, ":my_player_team", 0),
(try_begin),
(gt, reg0, 1),
(str_store_string, s1, "str_reg0_respawns_remained"),
(else_try),
(str_store_string, s1, "str_this_is_your_last_respawn"),
(try_end),
(else_try),
(str_clear, s1),
(try_end),
(else_try),
(eq, "$g_show_no_more_respawns_remained", 1),
##(assign, "$g_informed_about_no_more_respawns_remained", 1),
(str_store_string, s0, "str_no_more_respawns_remained_this_round"),
(str_clear, s1),
(str_store_string, s1, "str_wait_next_round"),
(try_end),
(overlay_set_text, "$g_multiplayer_respawn_counter_overlay", s0),
(overlay_set_text, "$g_multiplayer_respawn_remained_overlay", s1),
(try_end),
(try_end),
(else_try),
(presentation_set_duration, 0),
(try_end),
(try_begin),
(eq, "$g_multiplayer_message_type", multiplayer_message_type_round_result_in_siege_mode),
(this_or_next|eq, "$g_round_ended", 1),
(eq, "$g_flag_is_not_ready", 1),
(presentation_set_duration, 0),
(try_end),
]),
]),
("multiplayer_stats_chart", prsntf_read_only|prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[(set_fixed_point_multiplier, 1000),
(create_mesh_overlay, reg0, "mesh_mp_score_b"),
(position_set_x, pos1, 100),
(position_set_y, pos1, 100),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1000),
(position_set_y, pos1, 1000),
(overlay_set_size, reg0, pos1),
(assign, ":team_1_rows", 0),
(assign, ":team_2_rows", 0),
(assign, ":spectator_rows", 0),
(get_max_players, ":num_players"),
(try_for_range, ":player_no", 0, ":num_players"),
(store_add, ":slot_index", ":player_no", multi_data_player_index_list_begin),
(try_begin),
(player_is_active, ":player_no"),
(troop_set_slot, "trp_multiplayer_data", ":slot_index", 1),
(player_get_team_no, ":player_team", ":player_no"),
(try_begin),
(eq, ":player_team", 0),
(val_add, ":team_1_rows", 1),
(else_try),
(eq, ":player_team", 1),
(val_add, ":team_2_rows", 1),
(else_try),
(eq, ":player_team", multi_team_spectator),
(val_add, ":spectator_rows", 1),
(try_end),
(else_try),
(troop_set_slot, "trp_multiplayer_data", ":slot_index", 0),
(try_end),
(try_end),
(try_begin),
(gt, "$g_multiplayer_num_bots_team_1", 0),
(val_add, ":team_1_rows", 1),
(try_end),
(try_begin),
(gt, "$g_multiplayer_num_bots_team_2", 0),
(val_add, ":team_2_rows", 1),
(try_end),
(assign, ":total_rows", ":team_1_rows"),
(val_max, ":total_rows", ":team_2_rows"),
(val_add, ":total_rows", ":spectator_rows"),
(str_clear, s0),
(create_text_overlay, "$g_presentation_obj_stats_chart_container", s0, tf_scrollable_style_2),
(position_set_x, pos1, 100),
(position_set_y, pos1, 120),#120
(overlay_set_position, "$g_presentation_obj_stats_chart_container", pos1),
(position_set_x, pos1, 746),
(position_set_y, pos1, 530),#530
(overlay_set_area_size, "$g_presentation_obj_stats_chart_container", pos1),
(set_container_overlay, "$g_presentation_obj_stats_chart_container"),
(store_mul, ":y_needed", ":total_rows", 20),
(val_add, ":y_needed", 100),
(try_begin),
(gt, ":spectator_rows", 0),
(val_add, ":y_needed", 70),
(try_end),
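# Rough layout budget: 20 units per player row plus ~100 units of headers,
# plus 70 more when a spectator section is shown. When the chart would
# exceed the 490-unit visible area, the update period is raised to 8
# below, presumably to cut the cost of rebuilding a long scrollable list.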
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(gt, ":y_needed", 490),
(assign, "$g_stats_chart_update_period", 8),
(else_try),
(assign, "$g_stats_chart_update_period", 1),
(try_end),
#assuming only 2 teams in scene
(try_for_range, ":i_team", 0, multi_team_spectator),
(assign, ":number_of_players", 0),
(get_max_players, ":num_players"),
(try_for_range, ":player_no", 0, ":num_players"),
(player_is_active, ":player_no"),
(player_get_team_no, ":team_no", ":player_no"),
(eq, ":team_no", ":i_team"),
(val_add, ":number_of_players", 1),
(try_end),
(assign, reg0, ":number_of_players"),
(try_begin),
(neq, ":number_of_players", 1),
(create_text_overlay, reg1, "str_reg0_players", 0),
(else_try),
(create_text_overlay, reg1, "str_reg0_player", 0),
(try_end),
(assign, ":cur_y", ":y_needed"),
(team_get_faction, ":cur_faction", ":i_team"),
(str_store_faction_name, s1, ":cur_faction"),
(create_text_overlay, reg0, s1, 0),
(try_begin),
(eq, ":i_team", 0),
(overlay_set_color, reg0, 0xFF0000),
(overlay_set_color, reg1, 0xFF0000),
(else_try),
(overlay_set_color, reg0, 0x0099FF),
(overlay_set_color, reg1, 0x0099FF),
(try_end),
(assign, ":distance_between_teams", 373),
(store_mul, ":cur_x", ":distance_between_teams", ":i_team"),
(val_add, ":cur_x", 42),
(store_add, ":cur_x_add_15", ":cur_x", 15),
(position_set_x, pos3, ":cur_x_add_15"),
(position_set_y, pos3, ":cur_y"),
(store_add, ":cur_x_add_35", ":cur_x", 35),
(position_set_x, pos1, ":cur_x_add_35"),
(position_set_y, pos1, ":cur_y"),
(copy_position, pos2, pos1),
(store_sub, ":cur_y_sub_10", ":cur_y", 10),
(position_set_x, pos2, ":cur_x_add_35"),
(position_set_y, pos2, ":cur_y_sub_10"),
(overlay_set_position, reg0, pos1),
(overlay_set_position, reg1, pos2),
(position_set_x, pos1, 1000),
(position_set_y, pos1, 1000),
(position_set_x, pos2, 600),
(position_set_y, pos2, 600),
(overlay_set_size, reg0, pos1),
(overlay_set_size, reg1, pos2),
# (team_get_faction, ":faction_of_team_1", 0),
# (team_get_faction, ":faction_of_team_2", 1),
# (try_begin),
# (eq, ":faction_of_team_1", ":faction_of_team_2"),
# (eq, ":i_team", 1),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_7"),
# (else_try),
# (eq, ":cur_faction", "fac_kingdom_4"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_1"),
# (else_try),
# (eq, ":cur_faction", "fac_kingdom_2"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_2"),
# (else_try),
# (eq, ":cur_faction", "fac_kingdom_3"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_3"),
# (else_try),
# (eq, ":cur_faction", "fac_kingdom_5"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_4"),
# (else_try),
# (eq, ":cur_faction", "fac_kingdom_6"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_5"),
# (else_try),
# (eq, ":cur_faction", "fac_kingdom_1"),
# (create_mesh_overlay, reg0, "mesh_ui_kingdom_shield_6"),
# (try_end),
(position_set_x, pos1, 100),
(position_set_y, pos1, 100),
(overlay_set_position, reg0, pos3),
(position_set_x, pos1, 50),
(position_set_y, pos1, 50),
(overlay_set_size, reg0, pos1),
(team_get_score, reg0, ":i_team"),
(create_text_overlay, reg0, "str_score_reg0", tf_right_align),
(overlay_set_color, reg0, 0xFFFFFF),
(store_add, ":sub_cur_x", ":cur_x", 325), #325
(store_add, ":sub_cur_y", ":cur_y", 0),
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":sub_cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1200),
(position_set_y, pos1, 1200),
(overlay_set_size, reg0, pos1),
(val_sub, ":cur_y", 60),
(create_text_overlay, reg0, "str_player_name", 0),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, ":cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(create_text_overlay, reg0, "str_kills", tf_center_justify),
(overlay_set_color, reg0, 0xFFFFFF),
(store_add, ":sub_cur_x", ":cur_x", 206), #191
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(create_text_overlay, reg0, "str_deaths", tf_center_justify),
(overlay_set_color, reg0, 0xFFFFFF),
(store_add, ":sub_cur_x", ":cur_x", 260), #232
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(create_text_overlay, reg0, "str_ping", tf_center_justify),
(overlay_set_color, reg0, 0xFFFFFF),
(store_add, ":sub_cur_x", ":cur_x", 308), #291
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(create_mesh_overlay, reg0, "mesh_white_plane"),
(overlay_set_color, reg0, 0xFFFFFF),
(overlay_set_alpha, reg0, 0xD0),
(store_add, ":sub_cur_x", ":cur_x", 0),
(position_set_x, pos1, ":sub_cur_x"),
(store_add, ":sub_cur_y", ":cur_y", -10),
(position_set_y, pos1, ":sub_cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 16500),
(position_set_y, pos1, 50),
(overlay_set_size, reg0, pos1),
(val_sub, ":cur_y", 35),
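# The nested loops below are a selection sort over this team's players:
# each outer pass scans the players still flagged in trp_multiplayer_data,
# picks the one with the highest kills * 1000 - deaths, draws their row,
# then clears their slot so the next pass finds the runner-up. -30030 is
# the "no player left" sentinel.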
(store_add, ":end_cond", ":num_players", 1),
(try_for_range, ":unused", 0, ":end_cond"),
(assign, ":max_score_plus_death", -30030),
(assign, ":max_score_plus_death_player_no", -1),
(try_for_range, ":player_no", 0, ":num_players"),
(store_add, ":slot_index", ":player_no", multi_data_player_index_list_begin),
(troop_slot_eq, "trp_multiplayer_data", ":slot_index", 1),
(player_get_team_no, ":player_team", ":player_no"),
(eq, ":player_team", ":i_team"),
(try_begin),
(player_get_kill_count, ":kill_count", ":player_no"), #get kill count in "siege" or "battle" or "team deathmatch" or "deathmatch"
(try_end),
(player_get_death_count, ":death_count", ":player_no"), #get_death_count
(store_mul, ":player_score_plus_death", ":kill_count", 1000),
(val_sub, ":player_score_plus_death", ":death_count"),
(this_or_next|gt, ":player_score_plus_death", ":max_score_plus_death"),
(eq, ":player_score_plus_death", -30030),
(assign, ":max_score_plus_death", ":player_score_plus_death"),
(assign, ":max_score_plus_death_player_no", ":player_no"),
(try_end),
(try_begin),
(ge, ":max_score_plus_death_player_no", 0),
(store_add, ":slot_index", ":max_score_plus_death_player_no", multi_data_player_index_list_begin),
(troop_set_slot, "trp_multiplayer_data", ":slot_index", 0),
(try_begin),
(eq, ":my_player_no", ":max_score_plus_death_player_no"),
(create_mesh_overlay, reg0, "mesh_white_plane"),
(overlay_set_color, reg0, 0xFFFFFF),
(overlay_set_alpha, reg0, 0x35),
(store_add, ":sub_cur_x", ":cur_x", 0),
(position_set_x, pos1, ":sub_cur_x"),
(store_add, ":sub_cur_y", ":cur_y", 0),
(position_set_y, pos1, ":sub_cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 16500),
(position_set_y, pos1, 1000),
(overlay_set_size, reg0, pos1),
(try_end),
(try_begin),
(assign, ":font_color", 0xFFFFFF),
(player_get_agent_id, ":max_score_plus_death_agent_id", ":max_score_plus_death_player_no"),
(try_begin),
(this_or_next|lt, ":max_score_plus_death_agent_id", 0),
(neg|agent_is_alive, ":max_score_plus_death_agent_id"),
(assign, ":font_color", 0xFF0000),
(create_text_overlay, reg0, "str_dead", 0),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(store_add, ":sub_cur_x", ":cur_x", 130),
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(try_end),
(try_end),
(str_store_player_username, s1, ":max_score_plus_death_player_no"),
(create_text_overlay, reg0, s1, 0),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(position_set_x, pos1, ":cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(player_get_kill_count, reg0, ":max_score_plus_death_player_no"), #get_kill_count
(create_text_overlay, reg0, "str_reg0", tf_right_align),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(store_add, ":sub_cur_x", ":cur_x", 215), #200
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(player_get_death_count, reg0, ":max_score_plus_death_player_no"),
(create_text_overlay, reg0, "str_reg0", tf_right_align),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(store_add, ":sub_cur_x", ":cur_x", 265), #250
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(player_get_ping, reg0, ":max_score_plus_death_player_no"),
(create_text_overlay, reg0, "str_reg0", tf_right_align),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(store_add, ":sub_cur_x", ":cur_x", 315), #300
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", 20),
(else_try),
(try_begin),
(try_begin),
(eq, ":i_team", 0),
(assign, ":number_of_bots_in_cur_team", "$g_multiplayer_num_bots_team_1"),
(else_try),
(assign, ":number_of_bots_in_cur_team", "$g_multiplayer_num_bots_team_2"),
(try_end),
(team_get_bot_kill_count, reg0, ":i_team"),
(team_get_bot_death_count, reg1, ":i_team"),
(try_begin),
(this_or_next|neq, reg0, 0),
(this_or_next|neq, reg1, 0),
(neq, ":number_of_bots_in_cur_team", 0),
(assign, ":write_bot_informations_of_team", 1),
(else_try),
(assign, ":write_bot_informations_of_team", 0),
(try_end),
(eq, ":write_bot_informations_of_team", 1),
(assign, ":number_of_alive_bots", 0),
(try_for_agents, ":cur_agent"),
(agent_is_non_player, ":cur_agent"),
(agent_is_alive, ":cur_agent"),
(agent_get_team, ":cur_agent_team", ":cur_agent"),
(eq, ":cur_agent_team", ":i_team"),
(val_add, ":number_of_alive_bots", 1),
(try_end),
(store_sub, ":number_of_dead_bots", ":number_of_bots_in_cur_team", ":number_of_alive_bots"),
(try_begin),
(eq, ":number_of_alive_bots", 0),
(assign, ":font_color", 0xFF0000),
(else_try),
(assign, ":font_color", 0xD0D0D0),
(try_end),
(try_begin),
(gt, ":number_of_dead_bots", 0),
(try_begin),
(eq, ":number_of_bots_in_cur_team", 1),
(create_text_overlay, reg0, "str_dead", 0),
(store_add, ":sub_cur_x", ":cur_x", 130),
(else_try),
(assign, reg0, ":number_of_dead_bots"),
(create_text_overlay, reg0, "str_reg0_dead", 0),
(store_add, ":sub_cur_x", ":cur_x", 123),
(try_end),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(try_end),
(try_begin),
(gt, ":number_of_bots_in_cur_team", 1),
(assign, reg0, ":number_of_bots_in_cur_team"),
(create_text_overlay, reg0, "str_bots_reg0_agents", 0),
(else_try),
(create_text_overlay, reg0, "str_bot_1_agent", 0),
(try_end),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(position_set_x, pos1, ":cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(team_get_bot_kill_count, reg0, ":i_team"),
(create_text_overlay, reg0, "str_reg0", tf_right_align),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(store_add, ":sub_cur_x", ":cur_x", 215), #200
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(team_get_bot_death_count, reg0, ":i_team"),
(create_text_overlay, reg0, "str_reg0", tf_right_align),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(store_add, ":sub_cur_x", ":cur_x", 265), #250
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", 20),
(try_end),
(assign, ":end_cond", 0), #all players are written for this team, break
(try_end),
(try_end),
(try_begin),
(eq, ":i_team", 0),
(assign, ":old_cur_y", ":cur_y"),
(try_end),
(try_end),
(try_begin),
(le, ":old_cur_y", ":cur_y"),
(assign, ":cur_y", ":old_cur_y"),
(try_end),
(assign, ":cur_x", 42),
#white line between playing players and spectators
(create_mesh_overlay, reg0, "mesh_white_plane"),
(overlay_set_color, reg0, 0xFFFFFF),
(overlay_set_alpha, reg0, 0xD0),
(store_add, ":sub_cur_x", ":cur_x", 0),
(position_set_x, pos1, ":sub_cur_x"),
(store_add, ":sub_cur_y", ":cur_y", 10),
(position_set_y, pos1, ":sub_cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 36000),
(position_set_y, pos1, 50),
(overlay_set_size, reg0, pos1),
(try_begin),
(gt, ":spectator_rows", 0),
(assign, ":cur_x", 280),
(val_sub, ":cur_y", 50),
#"spectators" text
(create_text_overlay, reg0, "str_spectators", 0),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, ":cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1000),
(position_set_y, pos1, 1000),
(overlay_set_size, reg0, pos1),
(create_text_overlay, reg0, "str_ping", tf_right_align),
(overlay_set_color, reg0, 0xFFFFFF),
(store_add, ":sub_cur_x", ":cur_x", 215), #200
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
#white line for spectators list
(create_mesh_overlay, reg0, "mesh_white_plane"),
(overlay_set_color, reg0, 0xFFFFFF),
(overlay_set_alpha, reg0, 0xD0),
(store_add, ":sub_cur_x", ":cur_x", 0),
(position_set_x, pos1, ":sub_cur_x"),
(store_add, ":sub_cur_y", ":cur_y", -10),
(position_set_y, pos1, ":sub_cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 12000),
(position_set_y, pos1, 50),
(overlay_set_size, reg0, pos1),
(val_sub, ":cur_y", 30),
(assign, ":font_color", 0xC0C0C0),
(store_add, ":end_cond", ":num_players", 1),
(try_for_range, ":player_no", 0, ":end_cond"),
(store_add, ":slot_index", ":player_no", multi_data_player_index_list_begin),
(troop_slot_eq, "trp_multiplayer_data", ":slot_index", 1),
(player_get_team_no, ":player_team", ":player_no"),
(eq, ":player_team", multi_team_spectator), #to not to allow dedicated server to pass below, dedicated servers have -1 for team_no not 2(multi_team_spectator).
(troop_set_slot, "trp_multiplayer_data", ":slot_index", 1),
(try_begin),
(eq, ":my_player_no", ":player_no"),
(create_mesh_overlay, reg0, "mesh_white_plane"),
(overlay_set_color, reg0, 0xFFFFFF),
(overlay_set_alpha, reg0, 0x35),
(store_add, ":sub_cur_x", ":cur_x", 0),
(position_set_x, pos1, ":sub_cur_x"),
(store_add, ":sub_cur_y", ":cur_y", 0),
(position_set_y, pos1, ":sub_cur_y"),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 12000),
(position_set_y, pos1, 1000),
(overlay_set_size, reg0, pos1),
(try_end),
(str_store_player_username, s1, ":player_no"),
(create_text_overlay, reg0, s1, 0),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(position_set_x, pos1, ":cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(player_get_ping, reg0, ":player_no"),
(create_text_overlay, reg0, "str_reg0", tf_right_align),
(overlay_set_color, reg0, ":font_color"),
(position_set_x, pos1, 750),
(position_set_y, pos1, 750),
(overlay_set_size, reg0, pos1),
(store_add, ":sub_cur_x", ":cur_x", 215), #200
(position_set_x, pos1, ":sub_cur_x"),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", 20),
(try_end),
(try_end),
(omit_key_once, key_mouse_scroll_up),
(omit_key_once, key_mouse_scroll_down),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_run,
[(store_trigger_param_1, ":cur_time"),
(try_begin),
(this_or_next|key_clicked, key_mouse_scroll_up),
(key_clicked, key_mouse_scroll_down),
(omit_key_once, key_mouse_scroll_up),
(omit_key_once, key_mouse_scroll_down),
(try_end),
(try_begin),
(eq, "$g_multiplayer_stats_chart_opened_manually", 1),
(neg|game_key_is_down, gk_leave),
(assign, "$g_multiplayer_stats_chart_opened_manually", 0),
(clear_omitted_keys),
(presentation_set_duration, 0),
(try_end),
(try_begin),
(store_mul, ":update_period_time_limit", "$g_stats_chart_update_period", 1000),
(gt, ":cur_time", ":update_period_time_limit"),
(clear_omitted_keys),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_stats_chart"),
(try_end),
]),
]),
("multiplayer_escape_menu", prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[(set_fixed_point_multiplier, 1000),
(create_mesh_overlay, reg0, "mesh_mp_ingame_menu"),
(position_set_x, pos1, 250),
(position_set_y, pos1, 80),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1000),
(position_set_y, pos1, 1000),
(overlay_set_size, reg0, pos1),
(str_clear, s0),
(create_text_overlay, "$g_presentation_obj_escape_menu_container", s0, tf_scrollable_style_2),
(position_set_x, pos1, 285),
(position_set_y, pos1, 75),
(overlay_set_position, "$g_presentation_obj_escape_menu_container", pos1),
(position_set_x, pos1, 405),
(position_set_y, pos1, 550),
(overlay_set_area_size, "$g_presentation_obj_escape_menu_container", pos1),
(set_container_overlay, "$g_presentation_obj_escape_menu_container"),
(assign, ":cur_y", 500),
(create_text_overlay, reg0, "str_choose_an_option", 0),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, 0),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", escape_menu_item_height),
# (create_button_overlay, "$g_presentation_obj_escape_menu_1", "str_choose_faction", 0),
# (overlay_set_color, "$g_presentation_obj_escape_menu_1", 0xFFFFFF),
# (multiplayer_get_my_team, ":my_team"),
# (assign, "$g_presentation_obj_escape_menu_2", -1),
# (assign, "$g_presentation_obj_escape_menu_3", -1),
# (assign, "$g_presentation_obj_escape_menu_6", -1),
# (assign, "$g_presentation_obj_escape_menu_7", -1),
# (assign, "$g_presentation_obj_escape_menu_8", -1),
# (assign, "$g_presentation_obj_escape_menu_9", -1),
# (assign, "$g_presentation_obj_escape_menu_10", -1),
# (assign, "$g_presentation_obj_escape_menu_11", -1),
# (assign, "$g_presentation_obj_escape_menu_12", -1),
# (assign, "$g_presentation_obj_escape_menu_13", -1),
# (try_begin),
# (lt, ":my_team", multi_team_spectator),
# (create_button_overlay, "$g_presentation_obj_escape_menu_2", "str_choose_troop", 0),
# (overlay_set_color, "$g_presentation_obj_escape_menu_2", 0xFFFFFF),
# (multiplayer_get_my_troop, ":my_troop"),
# (try_begin),
# (ge, ":my_troop", 0),
# (create_button_overlay, "$g_presentation_obj_escape_menu_3", "str_choose_items", 0),
# (overlay_set_color, "$g_presentation_obj_escape_menu_3", 0xFFFFFF),
# (try_end),
# (try_end),
(create_button_overlay, "$g_presentation_obj_escape_menu_4", "str_options", 0),
(overlay_set_color, "$g_presentation_obj_escape_menu_4", 0xFFFFFF),
(create_button_overlay, "$g_presentation_obj_escape_menu_5", "str_redefine_keys", 0),
(overlay_set_color, "$g_presentation_obj_escape_menu_5", 0xFFFFFF),
(create_button_overlay, "$g_presentation_obj_escape_menu_13", "@Show game rules", 0),
(overlay_set_color, "$g_presentation_obj_escape_menu_13", 0xFFFFFF),
(multiplayer_get_my_player, ":my_player_no"),
(try_begin),
(this_or_next|eq, "$g_multiplayer_maps_voteable", 1),
(this_or_next|eq, "$g_multiplayer_factions_voteable", 1),
(this_or_next|gt, "$g_multiplayer_num_bots_voteable", 0),
(this_or_next|eq, "$g_multiplayer_kick_voteable", 1),
(eq, "$g_multiplayer_ban_voteable", 1),
(create_button_overlay, "$g_presentation_obj_escape_menu_6", "str_submit_a_poll", 0),
(overlay_set_color, "$g_presentation_obj_escape_menu_6", 0xFFFFFF),
(assign, "$g_presentation_obj_escape_menu_6_available", 1),
(try_begin),
(ge, ":my_player_no", 0),
(player_get_slot, ":last_poll_time", ":my_player_no", slot_player_poll_disabled_until_time),
(store_mission_timer_a, ":mission_timer"),
(lt, ":mission_timer", ":last_poll_time"),
(overlay_set_color, "$g_presentation_obj_escape_menu_6", 0x888888),
(overlay_set_hilight_color, "$g_presentation_obj_escape_menu_6", 0x888888),
(assign, "$g_presentation_obj_escape_menu_6_available", 0),
(try_end),
(try_end),
(try_begin),
(ge, ":my_player_no", 0),
(player_is_admin, ":my_player_no"),
(create_button_overlay, "$g_presentation_obj_escape_menu_7", "str_administrator_panel", 0),
(overlay_set_color, "$g_presentation_obj_escape_menu_7", 0xFFFFFF),
(create_button_overlay, "$g_presentation_obj_escape_menu_8", "str_kick_player", 0),
(overlay_set_color, "$g_presentation_obj_escape_menu_8", 0xFFFFFF),
(create_button_overlay, "$g_presentation_obj_escape_menu_9", "str_ban_player", 0),
(overlay_set_color, "$g_presentation_obj_escape_menu_9", 0xFFFFFF),
(try_end),
(create_button_overlay, "$g_presentation_obj_escape_menu_11", "str_mute_player", 0),
(overlay_set_color, "$g_presentation_obj_escape_menu_11", 0xFFFFFF),
(try_begin),
(assign, "$g_presentation_obj_escape_menu_12", -1),
(assign, ":any_muted", 0),
(get_max_players, ":num_players"),
(try_for_range, ":player_no", 0, ":num_players"),
(player_is_active, ":player_no"),
(player_get_is_muted, ":is_muted", ":player_no"),
(eq, ":is_muted", 1),
(assign, ":any_muted", 1),
(try_end),
(eq, ":any_muted", 1),
(create_button_overlay, "$g_presentation_obj_escape_menu_12", "str_unmute_player", 0),
(overlay_set_color, "$g_presentation_obj_escape_menu_12", 0xFFFFFF),
(try_end),
(create_button_overlay, "$g_presentation_obj_escape_menu_10", "str_quit", 0),
(overlay_set_color, "$g_presentation_obj_escape_menu_10", 0xFFFFFF),
(position_set_x, pos1, 130),
(position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_escape_menu_1", pos1),
# (try_begin),
# (ge, "$g_presentation_obj_escape_menu_2", 0),
# (val_sub, ":cur_y", escape_menu_item_height),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_escape_menu_2", pos1),
# (try_end),
# (try_begin),
# (ge, "$g_presentation_obj_escape_menu_3", 0),
# (val_sub, ":cur_y", escape_menu_item_height),
# (position_set_y, pos1, ":cur_y"),
# (overlay_set_position, "$g_presentation_obj_escape_menu_3", pos1),
# (try_end),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_escape_menu_4", pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_escape_menu_5", pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_escape_menu_13", pos1),
(try_begin),
(ge, "$g_presentation_obj_escape_menu_6", 0),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_escape_menu_6", pos1),
(try_end),
(try_begin),
(ge, "$g_presentation_obj_escape_menu_7", 0),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_escape_menu_7", pos1),
(try_end),
(try_begin),
(ge, "$g_presentation_obj_escape_menu_8", 0),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_escape_menu_8", pos1),
(try_end),
(try_begin),
(ge, "$g_presentation_obj_escape_menu_9", 0),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_escape_menu_9", pos1),
(try_end),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_escape_menu_11", pos1),
(try_begin),
(ge, "$g_presentation_obj_escape_menu_12", 0),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_escape_menu_12", pos1),
(try_end),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_escape_menu_10", pos1),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_event_state_change,
[(store_trigger_param_1, ":object"),
(try_begin),
# (eq, ":object", "$g_presentation_obj_escape_menu_1"),
# (presentation_set_duration, 0),
# (start_presentation, "prsnt_multiplayer_team_select"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_escape_menu_2"),
# (presentation_set_duration, 0),
# (start_presentation, "prsnt_multiplayer_troop_select"),
# (else_try),
# (eq, ":object", "$g_presentation_obj_escape_menu_3"),
# (presentation_set_duration, 0),
# (assign, "$g_presentation_state", 0),
# (start_presentation, "prsnt_multiplayer_item_select"),
# (else_try),
(eq, ":object", "$g_presentation_obj_escape_menu_4"),
(presentation_set_duration, 0),
(change_screen_options),
(else_try),
(eq, ":object", "$g_presentation_obj_escape_menu_5"),
(presentation_set_duration, 0),
(change_screen_controls),
(else_try),
(eq, ":object", "$g_presentation_obj_escape_menu_6"),
(eq, "$g_presentation_obj_escape_menu_6_available", 1),
(presentation_set_duration, 0),
(start_presentation, "prsnt_multiplayer_poll_menu"),
(else_try),
(eq, ":object", "$g_presentation_obj_escape_menu_7"),
(presentation_set_duration, 0),
(multiplayer_send_message_to_server, multiplayer_event_open_admin_panel),
(else_try),
(eq, ":object", "$g_presentation_obj_escape_menu_8"),
(presentation_set_duration, 0),
(assign, "$g_multiplayer_players_list_action_type", 3), #admin kick
(start_presentation, "prsnt_multiplayer_show_players_list"),
(else_try),
(eq, ":object", "$g_presentation_obj_escape_menu_9"),
(presentation_set_duration, 0),
(assign, "$g_multiplayer_players_list_action_type", 4), #admin ban
(start_presentation, "prsnt_multiplayer_show_players_list"),
(else_try),
(eq, ":object", "$g_presentation_obj_escape_menu_10"),
(presentation_set_duration, 0),
(finish_mission, 0),
(else_try),
(eq, ":object", "$g_presentation_obj_escape_menu_11"),
(presentation_set_duration, 0),
(assign, "$g_multiplayer_players_list_action_type", 5), #mute player
(start_presentation, "prsnt_multiplayer_show_players_list"),
(else_try),
(eq, ":object", "$g_presentation_obj_escape_menu_12"),
(presentation_set_duration, 0),
(assign, "$g_multiplayer_players_list_action_type", 6), #unmute player
(start_presentation, "prsnt_multiplayer_show_players_list"),
(else_try),
(eq, ":object", "$g_presentation_obj_escape_menu_13"),
(presentation_set_duration, 0),
(multiplayer_send_message_to_server, multiplayer_event_open_game_rules),
(try_end),
]),
(ti_on_presentation_run,
[(store_trigger_param_1, ":cur_time"),
(try_begin),
(this_or_next|key_clicked, key_escape),
(key_clicked, key_xbox_start),
(gt, ":cur_time", 200),
(presentation_set_duration, 0),
(try_end),
]),
]),
("multiplayer_poll_menu", prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[(set_fixed_point_multiplier, 1000),
(create_mesh_overlay, reg0, "mesh_mp_ingame_menu"),
(position_set_x, pos1, 250),
(position_set_y, pos1, 80),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1000),
(position_set_y, pos1, 1000),
(overlay_set_size, reg0, pos1),
(str_clear, s0),
(create_text_overlay, "$g_presentation_obj_poll_menu_container", s0, tf_scrollable_style_2),
(position_set_x, pos1, 285),
(position_set_y, pos1, 125),
(overlay_set_position, "$g_presentation_obj_poll_menu_container", pos1),
(position_set_x, pos1, 405),
(position_set_y, pos1, 500),
(overlay_set_area_size, "$g_presentation_obj_poll_menu_container", pos1),
(set_container_overlay, "$g_presentation_obj_poll_menu_container"),
(assign, "$g_presentation_obj_poll_menu_1", -1),
(assign, "$g_presentation_obj_poll_menu_4", -1),
(assign, "$g_presentation_obj_poll_menu_5", -1),
(assign, ":cur_y", 450),
(create_text_overlay, reg0, "str_choose_a_poll_type", 0),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, 0),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(position_set_x, pos1, 60),
(try_begin),
(eq, "$g_multiplayer_maps_voteable", 1),
(create_button_overlay, "$g_presentation_obj_poll_menu_1", "str_poll_for_changing_the_map", 0),
(overlay_set_color, "$g_presentation_obj_poll_menu_1", 0xFFFFFF),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_poll_menu_1", pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(try_end),
(try_begin),
(eq, "$g_multiplayer_factions_voteable", 1),
(create_button_overlay, "$g_presentation_obj_poll_menu_4", "str_poll_for_changing_the_map_and_factions", 0),
(overlay_set_color, "$g_presentation_obj_poll_menu_4", 0xFFFFFF),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_poll_menu_4", pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(try_end),
(try_begin),
(gt, "$g_multiplayer_num_bots_voteable", 0),
(create_button_overlay, "$g_presentation_obj_poll_menu_5", "str_poll_for_changing_number_of_bots", 0),
(overlay_set_color, "$g_presentation_obj_poll_menu_5", 0xFFFFFF),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_poll_menu_5", pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(try_end),
(try_begin),
(eq, "$g_multiplayer_kick_voteable", 1),
(create_button_overlay, "$g_presentation_obj_poll_menu_2", "str_poll_for_kicking_a_player", 0),
(overlay_set_color, "$g_presentation_obj_poll_menu_2", 0xFFFFFF),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_poll_menu_2", pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(try_end),
(try_begin),
(eq, "$g_multiplayer_ban_voteable", 1),
(create_button_overlay, "$g_presentation_obj_poll_menu_3", "str_poll_for_banning_a_player", 0),
(overlay_set_color, "$g_presentation_obj_poll_menu_3", 0xFFFFFF),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, "$g_presentation_obj_poll_menu_3", pos1),
(try_end),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_event_state_change,
[(store_trigger_param_1, ":object"),
(try_begin),
(eq, ":object", "$g_presentation_obj_poll_menu_1"),
(presentation_set_duration, 0),
(assign, "$g_multiplayer_maps_list_action_type", 1), #poll map
(start_presentation, "prsnt_multiplayer_show_maps_list"),
(else_try),
(eq, ":object", "$g_presentation_obj_poll_menu_2"),
(presentation_set_duration, 0),
(assign, "$g_multiplayer_players_list_action_type", 1), #poll kick
(start_presentation, "prsnt_multiplayer_show_players_list"),
(else_try),
(eq, ":object", "$g_presentation_obj_poll_menu_3"),
(presentation_set_duration, 0),
(assign, "$g_multiplayer_players_list_action_type", 2), #poll ban
(start_presentation, "prsnt_multiplayer_show_players_list"),
(else_try),
(eq, ":object", "$g_presentation_obj_poll_menu_4"),
(presentation_set_duration, 0),
(assign, "$g_multiplayer_maps_list_action_type", 2), #poll map and factions
(start_presentation, "prsnt_multiplayer_show_maps_list"),
(else_try),
(eq, ":object", "$g_presentation_obj_poll_menu_5"),
(presentation_set_duration, 0),
(assign, "$g_multiplayer_number_of_bots_list_action_type", 1), #for team 1
(start_presentation, "prsnt_multiplayer_show_number_of_bots_list"),
(try_end),
]),
(ti_on_presentation_run,
[(store_trigger_param_1, ":cur_time"),
(try_begin),
(this_or_next|key_clicked, key_escape),
(key_clicked, key_xbox_start),
(gt, ":cur_time", 200),
(presentation_set_duration, 0),
(try_end),
]),
]),
("multiplayer_show_players_list", prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[(set_fixed_point_multiplier, 1000),
(create_mesh_overlay, reg0, "mesh_mp_ingame_menu"),
(position_set_x, pos1, 250),
(position_set_y, pos1, 80),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1000),
(position_set_y, pos1, 1000),
(overlay_set_size, reg0, pos1),
(str_clear, s0),
(create_text_overlay, "$g_presentation_obj_show_players_1", s0, tf_scrollable_style_2),
(position_set_x, pos1, 285),
(position_set_y, pos1, 125),
(overlay_set_position, "$g_presentation_obj_show_players_1", pos1),
(position_set_x, pos1, 405),
(position_set_y, pos1, 500),
(overlay_set_area_size, "$g_presentation_obj_show_players_1", pos1),
(set_container_overlay, "$g_presentation_obj_show_players_1"),
#(assign, ":cur_y", 450),
(multiplayer_get_my_player, ":my_player_no"),
(assign, ":cur_y", 10),
(get_max_players, ":num_players"),
(try_for_range, ":player_no", 1, ":num_players"), #0 is server no need to write it
(player_is_active, ":player_no"),
(assign, ":continue", 0),
(try_begin),
(neq, "$g_multiplayer_players_list_action_type", 5),
(neq, "$g_multiplayer_players_list_action_type", 6),
(assign, ":continue", 1),
(else_try),
(eq, "$g_multiplayer_players_list_action_type", 5),
(neq, ":player_no", ":my_player_no"),
(player_get_is_muted, ":is_muted", ":player_no"),
(eq, ":is_muted", 0),
(assign, ":continue", 1),
(else_try),
(eq, "$g_multiplayer_players_list_action_type", 6),
(neq, ":player_no", ":my_player_no"),
(player_get_is_muted, ":is_muted", ":player_no"),
(eq, ":is_muted", 1),
(assign, ":continue", 1),
(try_end),
(eq, ":continue", 1),
(val_add, ":cur_y", escape_menu_item_height),
(try_end),
(create_text_overlay, reg0, "str_choose_a_player", 0),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, 0),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(get_max_players, ":num_players"),
(try_for_range, ":player_no", 1, ":num_players"), #0 is server no need to write it
(player_is_active, ":player_no"),
(player_set_slot, ":player_no", slot_player_button_index, -1),
(assign, ":continue", 0),
(try_begin),
(neq, "$g_multiplayer_players_list_action_type", 5),
(neq, "$g_multiplayer_players_list_action_type", 6),
(assign, ":continue", 1),
(else_try),
(eq, "$g_multiplayer_players_list_action_type", 5),
(neq, ":player_no", ":my_player_no"),
(player_get_is_muted, ":is_muted", ":player_no"),
(eq, ":is_muted", 0),
(assign, ":continue", 1),
(else_try),
(eq, "$g_multiplayer_players_list_action_type", 6),
(neq, ":player_no", ":my_player_no"),
(player_get_is_muted, ":is_muted", ":player_no"),
(eq, ":is_muted", 1),
(assign, ":continue", 1),
(try_end),
(eq, ":continue", 1),
(str_store_player_username, s0, ":player_no"),
(create_button_overlay, ":overlay_id", s0, 0),
(overlay_set_color, ":overlay_id", 0xFFFFFF),
(position_set_x, pos1, 130),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, ":overlay_id", pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(player_set_slot, ":player_no", slot_player_button_index, ":overlay_id"),
(try_end),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_event_state_change,
[(store_trigger_param_1, ":object"),
(get_max_players, ":num_players"),
(try_for_range, ":player_no", 1, ":num_players"), #0 is server no need to write it
(player_is_active, ":player_no"),
(player_slot_eq, ":player_no", slot_player_button_index, ":object"),
(try_begin),
(is_between, "$g_multiplayer_players_list_action_type", 1, 3), #poll kick or poll ban
(try_begin),
(multiplayer_get_my_player, ":my_player_no"),
(ge, ":my_player_no", 0),
(multiplayer_send_2_int_to_server, multiplayer_event_start_new_poll, "$g_multiplayer_players_list_action_type", ":player_no"),
(store_mission_timer_a, ":mission_timer"),
(val_add, ":mission_timer", multiplayer_poll_disable_period),
(player_set_slot, ":my_player_no", slot_player_poll_disabled_until_time, ":mission_timer"),
(try_end),
(else_try),
(eq, "$g_multiplayer_players_list_action_type", 3), #admin kick
(multiplayer_send_int_to_server, multiplayer_event_admin_kick_player, ":player_no"),
(else_try),
(eq, "$g_multiplayer_players_list_action_type", 4), #admin ban
(multiplayer_send_int_to_server, multiplayer_event_admin_ban_player, ":player_no"),
(else_try),
(eq, "$g_multiplayer_players_list_action_type", 5), #mute player
(player_set_is_muted, ":player_no", 1),
(else_try),
(eq, "$g_multiplayer_players_list_action_type", 6), #unmute player
(player_set_is_muted, ":player_no", 0),
(try_end),
(assign, ":num_players", 0), #break
(presentation_set_duration, 0),
(try_end),
]),
(ti_on_presentation_run,
[(store_trigger_param_1, ":cur_time"),
(try_begin),
(this_or_next|key_clicked, key_escape),
(key_clicked, key_xbox_start),
(gt, ":cur_time", 200),
(presentation_set_duration, 0),
(try_end),
]),
]),
("multiplayer_show_maps_list", prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[(set_fixed_point_multiplier, 1000),
(create_mesh_overlay, reg0, "mesh_mp_ingame_menu"),
(position_set_x, pos1, 250),
(position_set_y, pos1, 80),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1000),
(position_set_y, pos1, 1000),
(overlay_set_size, reg0, pos1),
(str_clear, s0),
(create_text_overlay, "$g_presentation_obj_show_maps_list_menu_container", s0, tf_scrollable_style_2),
(position_set_x, pos1, 285),
(position_set_y, pos1, 125),
(overlay_set_position, "$g_presentation_obj_show_maps_list_menu_container", pos1),
(position_set_x, pos1, 405),
(position_set_y, pos1, 500),
(overlay_set_area_size, "$g_presentation_obj_show_maps_list_menu_container", pos1),
(set_container_overlay, "$g_presentation_obj_show_maps_list_menu_container"),
(call_script, "script_multiplayer_fill_map_game_types", "$g_multiplayer_game_type"),
(assign, ":num_maps", reg0),
(store_mul, ":cur_y", ":num_maps", escape_menu_item_height),
(val_add, ":cur_y", 10),
(create_text_overlay, reg0, "str_choose_a_map", 0),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, 0),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(assign, ":overlay_id", -1),
(try_for_range, ":i_map", 0, ":num_maps"),
(store_add, ":map_slot", ":i_map", multi_data_maps_for_game_type_begin),
(troop_get_slot, ":map_no", "trp_multiplayer_data", ":map_slot"),
(store_sub, ":string_index", ":map_no", multiplayer_scenes_begin),
(val_add, ":string_index", multiplayer_scene_names_begin),
(str_store_string, s0, ":string_index"),
(create_button_overlay, ":overlay_id", s0, 0),
(overlay_set_color, ":overlay_id", 0xFFFFFF),
(position_set_x, pos1, 100),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, ":overlay_id", pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(try_end),
(store_add, "$g_show_maps_list_button_list_end_index", ":overlay_id", 1),
(store_sub, "$g_show_maps_list_button_list_first_index", "$g_show_maps_list_button_list_end_index", ":num_maps"),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_event_state_change,
[(store_trigger_param_1, ":object"),
(try_for_range, ":i_button", "$g_show_maps_list_button_list_first_index", "$g_show_maps_list_button_list_end_index"),
(eq, ":object", ":i_button"),
(call_script, "script_multiplayer_fill_map_game_types", "$g_multiplayer_game_type"),
(store_sub, ":map_slot", ":object", "$g_show_maps_list_button_list_first_index"),
(val_add, ":map_slot", multi_data_maps_for_game_type_begin),
(troop_get_slot, ":scene_id", "trp_multiplayer_data", ":map_slot"),
(presentation_set_duration, 0),
(try_begin),
(eq, "$g_multiplayer_maps_list_action_type", 1), #vote for map
(try_begin),
(multiplayer_get_my_player, ":my_player_no"),
(ge, ":my_player_no", 0),
(multiplayer_send_2_int_to_server, multiplayer_event_start_new_poll, 0, ":scene_id"),
(store_mission_timer_a, ":mission_timer"),
(val_add, ":mission_timer", multiplayer_poll_disable_period),
(player_set_slot, ":my_player_no", slot_player_poll_disabled_until_time, ":mission_timer"),
(try_end),
(else_try), #vote for map and factions
(assign, "$g_multiplayer_factions_list_action_type", 1), #for team 1
(assign, "$g_multiplayer_poll_for_map_and_faction_data_map", ":scene_id"),
(start_presentation, "prsnt_multiplayer_show_factions_list"),
(try_end),
(assign, "$g_show_maps_list_button_list_end_index", 0), #break;
(try_end),
]),
(ti_on_presentation_run,
[(store_trigger_param_1, ":cur_time"),
(try_begin),
(this_or_next|key_clicked, key_escape),
(key_clicked, key_xbox_start),
(gt, ":cur_time", 200),
(presentation_set_duration, 0),
(try_end),
]),
]),
("multiplayer_show_factions_list", prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[(set_fixed_point_multiplier, 1000),
(create_mesh_overlay, reg0, "mesh_mp_ingame_menu"),
(position_set_x, pos1, 250),
(position_set_y, pos1, 80),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1000),
(position_set_y, pos1, 1000),
(overlay_set_size, reg0, pos1),
(str_clear, s0),
(create_text_overlay, "$g_presentation_obj_show_factions_list_menu_container", s0, tf_scrollable_style_2),
(position_set_x, pos1, 285),
(position_set_y, pos1, 125),
(overlay_set_position, "$g_presentation_obj_show_factions_list_menu_container", pos1),
(position_set_x, pos1, 405),
(position_set_y, pos1, 500),
(overlay_set_area_size, "$g_presentation_obj_show_factions_list_menu_container", pos1),
(set_container_overlay, "$g_presentation_obj_show_factions_list_menu_container"),
(store_sub, ":num_factions", multiplayer_factions_end, multiplayer_factions_begin),
(try_begin),
(eq, "$g_multiplayer_factions_list_action_type", 2),
(val_sub, ":num_factions", 1),
(try_end),
(store_mul, ":cur_y", ":num_factions", escape_menu_item_height),
(val_add, ":cur_y", 10),
(assign, reg0, "$g_multiplayer_factions_list_action_type"),
(create_text_overlay, reg0, "str_choose_a_faction_for_team_reg0", 0),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, 0),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(assign, ":overlay_id", -1),
(try_for_range, ":i_faction", multiplayer_factions_begin, multiplayer_factions_end),
(this_or_next|eq, "$g_multiplayer_factions_list_action_type", 1),
(neq, "$g_multiplayer_poll_for_map_and_faction_data_faction_1", ":i_faction"),
(str_store_faction_name, s0, ":i_faction"),
(create_button_overlay, ":overlay_id", s0, 0),
(overlay_set_color, ":overlay_id", 0xFFFFFF),
(position_set_x, pos1, 100),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, ":overlay_id", pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(try_end),
(store_add, "$g_show_factions_list_button_list_end_index", ":overlay_id", 1),
(store_sub, "$g_show_factions_list_button_list_first_index", "$g_show_factions_list_button_list_end_index", ":num_factions"),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_event_state_change,
[(store_trigger_param_1, ":object"),
(try_for_range, ":i_button", "$g_show_factions_list_button_list_first_index", "$g_show_factions_list_button_list_end_index"),
(eq, ":object", ":i_button"),
(store_sub, ":faction_no", ":object", "$g_show_factions_list_button_list_first_index"),
(val_add, ":faction_no", multiplayer_factions_begin),
(presentation_set_duration, 0),
(try_begin),
(eq, "$g_multiplayer_factions_list_action_type", 2), #vote for second team
(try_begin),
(ge, ":faction_no", "$g_multiplayer_poll_for_map_and_faction_data_faction_1"),
(val_add, ":faction_no", 1),
(try_end),
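         #When polling for team 2, the faction already chosen for team 1 was
         #omitted from the button list above, so button indices at or past that
         #faction are shifted up by one here to recover the true faction id.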
(try_begin),
(multiplayer_get_my_player, ":my_player_no"),
(ge, ":my_player_no", 0),
(multiplayer_send_4_int_to_server, multiplayer_event_start_new_poll, 3, "$g_multiplayer_poll_for_map_and_faction_data_map", "$g_multiplayer_poll_for_map_and_faction_data_faction_1", ":faction_no"),
(store_mission_timer_a, ":mission_timer"),
(val_add, ":mission_timer", multiplayer_poll_disable_period),
(player_set_slot, ":my_player_no", slot_player_poll_disabled_until_time, ":mission_timer"),
(try_end),
(else_try), #vote for first team
(assign, "$g_multiplayer_factions_list_action_type", 2), #for team 2
(assign, "$g_multiplayer_poll_for_map_and_faction_data_faction_1", ":faction_no"),
(start_presentation, "prsnt_multiplayer_show_factions_list"),
(try_end),
(assign, "$g_show_factions_list_button_list_end_index", 0), #break;
(try_end),
]),
(ti_on_presentation_run,
[(store_trigger_param_1, ":cur_time"),
(try_begin),
(this_or_next|key_clicked, key_escape),
(key_clicked, key_xbox_start),
(gt, ":cur_time", 200),
(presentation_set_duration, 0),
(try_end),
]),
]),
("multiplayer_show_number_of_bots_list", prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[(set_fixed_point_multiplier, 1000),
(create_mesh_overlay, reg0, "mesh_mp_ingame_menu"),
(position_set_x, pos1, 250),
(position_set_y, pos1, 80),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1000),
(position_set_y, pos1, 1000),
(overlay_set_size, reg0, pos1),
(str_clear, s0),
(create_text_overlay, "$g_presentation_obj_show_number_of_bots_list_menu_container", s0, tf_scrollable_style_2),
(position_set_x, pos1, 285),
(position_set_y, pos1, 125),
(overlay_set_position, "$g_presentation_obj_show_number_of_bots_list_menu_container", pos1),
(position_set_x, pos1, 405),
(position_set_y, pos1, 500),
(overlay_set_area_size, "$g_presentation_obj_show_number_of_bots_list_menu_container", pos1),
(set_container_overlay, "$g_presentation_obj_show_number_of_bots_list_menu_container"),
(assign, ":num_options", 0),
(store_add, ":end_cond", "$g_multiplayer_num_bots_voteable", 1),
(try_for_range, ":i_number", 0, ":end_cond"),
(assign, ":i_number_mod_5", ":i_number"),
(val_mod, ":i_number_mod_5", 5),
(this_or_next|lt, ":i_number", 10),
(eq, ":i_number_mod_5", 0),
(val_add, ":num_options", 1),
(try_end),
(store_mul, ":cur_y", ":num_options", escape_menu_item_height),
(val_add, ":cur_y", 10),
(assign, reg0, "$g_multiplayer_number_of_bots_list_action_type"),
(create_text_overlay, reg0, "str_choose_number_of_bots_for_team_reg0", 0),
(overlay_set_color, reg0, 0xFFFFFF),
(position_set_x, pos1, 0),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, reg0, pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(assign, ":overlay_id", -1),
(try_for_range, ":i_number", 0, ":end_cond"),
(assign, ":i_number_mod_5", ":i_number"),
(val_mod, ":i_number_mod_5", 5),
(this_or_next|lt, ":i_number", 10),
(eq, ":i_number_mod_5", 0),
(assign, reg0, ":i_number"),
(str_store_string, s0, "str_reg0"),
(create_button_overlay, ":overlay_id", s0, 0),
(overlay_set_color, ":overlay_id", 0xFFFFFF),
(position_set_x, pos1, 100),
(position_set_y, pos1, ":cur_y"),
(overlay_set_position, ":overlay_id", pos1),
(val_sub, ":cur_y", escape_menu_item_height),
(try_end),
(store_add, "$g_show_number_of_bots_list_button_list_end_index", ":overlay_id", 1),
(store_sub, "$g_show_number_of_bots_list_button_list_first_index", "$g_show_number_of_bots_list_button_list_end_index", ":num_options"),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_event_state_change,
[(store_trigger_param_1, ":object"),
(try_for_range, ":i_button", "$g_show_number_of_bots_list_button_list_first_index", "$g_show_number_of_bots_list_button_list_end_index"),
(eq, ":object", ":i_button"),
(store_sub, ":value_index", ":object", "$g_show_number_of_bots_list_button_list_first_index"),
(try_begin),
(lt, ":value_index", 10),
(assign, ":used_value", ":value_index"),
(else_try),
(store_sub, ":used_value", ":value_index", 8),
(val_mul, ":used_value", 5),
(try_end),
(presentation_set_duration, 0),
(try_begin),
(eq, "$g_multiplayer_number_of_bots_list_action_type", 2), #vote for second team
(try_begin),
(multiplayer_get_my_player, ":my_player_no"),
(ge, ":my_player_no", 0),
(multiplayer_send_3_int_to_server, multiplayer_event_start_new_poll, 4, "$g_multiplayer_poll_number_of_bots_team_1", ":used_value"),
(store_mission_timer_a, ":mission_timer"),
(val_add, ":mission_timer", multiplayer_poll_disable_period),
(player_set_slot, ":my_player_no", slot_player_poll_disabled_until_time, ":mission_timer"),
(try_end),
(else_try), #vote for first team
(assign, "$g_multiplayer_number_of_bots_list_action_type", 2), #for team 2
(assign, "$g_multiplayer_poll_number_of_bots_team_1", ":used_value"),
(start_presentation, "prsnt_multiplayer_show_number_of_bots_list"),
(try_end),
(assign, "$g_show_number_of_bots_list_button_list_end_index", 0), #break;
(try_end),
]),
(ti_on_presentation_run,
[(store_trigger_param_1, ":cur_time"),
(try_begin),
(this_or_next|key_clicked, key_escape),
(key_clicked, key_xbox_start),
(gt, ":cur_time", 200),
(presentation_set_duration, 0),
(try_end),
]),
]),
("multiplayer_poll", prsntf_read_only|prsntf_manual_end_only, 0, [
(ti_on_presentation_load,
[(set_fixed_point_multiplier, 1000),
(create_mesh_overlay, reg0, "mesh_white_plane"),
(overlay_set_color, reg0, 0x000000),
(overlay_set_alpha, reg0, 0x44),
(position_set_x, pos1, 50),
(position_set_y, pos1, 50),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 37500),
(position_set_y, pos1, 4500),
(overlay_set_size, reg0, pos1),
(try_begin),
(eq, "$g_multiplayer_poll_to_show", 0),
(store_sub, ":string_index", "$g_multiplayer_poll_value_to_show", multiplayer_scenes_begin),
(val_add, ":string_index", multiplayer_scene_names_begin),
(str_store_string, s0, ":string_index"),
(create_text_overlay, reg0, "str_poll_change_map", tf_center_justify),
(else_try),
(eq, "$g_multiplayer_poll_to_show", 1),
(str_store_player_username, s0, "$g_multiplayer_poll_value_to_show"),
(create_text_overlay, reg0, "str_poll_kick_player", tf_center_justify),
(else_try),
(eq, "$g_multiplayer_poll_to_show", 2),
(str_store_player_username, s0, "$g_multiplayer_poll_value_to_show"),
(create_text_overlay, reg0, "str_poll_ban_player", tf_center_justify),
(else_try),
(eq, "$g_multiplayer_poll_to_show", 3),
(store_sub, ":string_index", "$g_multiplayer_poll_value_to_show", multiplayer_scenes_begin),
(val_add, ":string_index", multiplayer_scene_names_begin),
(str_store_string, s0, ":string_index"),
(str_store_faction_name, s1, "$g_multiplayer_poll_value_2_to_show"),
(str_store_faction_name, s2, "$g_multiplayer_poll_value_3_to_show"),
(create_text_overlay, reg0, "str_poll_change_map_with_faction", tf_center_justify|tf_scrollable_style_2),
(else_try),
(assign, reg0, "$g_multiplayer_poll_value_to_show"),
(assign, reg1, "$g_multiplayer_poll_value_2_to_show"),
(str_store_faction_name, s0, "$g_multiplayer_team_1_faction"),
(str_store_faction_name, s1, "$g_multiplayer_team_2_faction"),
(create_text_overlay, reg0, "str_poll_change_number_of_bots", tf_center_justify|tf_scrollable_style_2),
(try_end),
(overlay_set_color, reg0, 0xFFFFFF),
(try_begin),
(neq, "$g_multiplayer_poll_to_show", 3),
(neq, "$g_multiplayer_poll_to_show", 4),
(position_set_x, pos1, 400),
(position_set_y, pos1, 100),
(overlay_set_position, reg0, pos1),
(else_try),
(position_set_x, pos1, 50),
(position_set_y, pos1, 70),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 750),
(position_set_y, pos1, 60),
(overlay_set_area_size, reg0, pos1),
(try_end),
(store_mission_timer_a, ":mission_timer"),
(store_sub, "$g_multiplayer_poll_last_written_seconds_left", "$g_multiplayer_poll_client_end_time", ":mission_timer"),
(assign, reg0, "$g_multiplayer_poll_last_written_seconds_left"),
(create_text_overlay, "$g_presentation_obj_poll_1", "str_poll_time_left", tf_right_align|tf_single_line),
(overlay_set_color, "$g_presentation_obj_poll_1", 0xFFFFFF),
(position_set_x, pos1, 790),
(position_set_y, pos1, 60),
(overlay_set_position, "$g_presentation_obj_poll_1", pos1),
(omit_key_once, key_1),
(omit_key_once, key_2),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_run,
[(store_trigger_param_1, ":cur_time"),
(try_begin),
(this_or_next|key_clicked, key_escape),
(this_or_next|key_clicked, key_xbox_start),
(key_clicked, key_2),
(gt, ":cur_time", 500),
(multiplayer_send_int_to_server, multiplayer_event_answer_to_poll, 0),
(clear_omitted_keys),
(presentation_set_duration, 0),
(else_try),
(key_clicked, key_1),
(gt, ":cur_time", 500),
(multiplayer_send_int_to_server, multiplayer_event_answer_to_poll, 1),
(clear_omitted_keys),
(presentation_set_duration, 0),
(try_end),
(store_mission_timer_a, ":mission_timer"),
(store_sub, ":time_left", "$g_multiplayer_poll_client_end_time", ":mission_timer"),
(try_begin),
(neq, ":time_left", "$g_multiplayer_poll_last_written_seconds_left"),
(try_begin),
(lt, ":time_left", 0),
(clear_omitted_keys),
(presentation_set_duration, 0),
(else_try),
(assign, "$g_multiplayer_poll_last_written_seconds_left", ":time_left"),
(assign, reg0, "$g_multiplayer_poll_last_written_seconds_left"),
(overlay_set_text, "$g_presentation_obj_poll_1", "str_poll_time_left"),
(try_end),
(try_end),
]),
]),
("multiplayer_duel_start_counter", prsntf_read_only|prsntf_manual_end_only, 0, [
(ti_on_presentation_load, [
(set_fixed_point_multiplier, 1000),
(assign, "$g_multiplayer_duel_start_counter_overlay", -1),
(assign, "$g_multiplayer_last_duel_start_counter_value", -1),
(str_clear, s0),
(create_text_overlay, "$g_multiplayer_duel_start_counter_overlay", s0, tf_center_justify|tf_with_outline),
(overlay_set_color, "$g_multiplayer_duel_start_counter_overlay", 0xFFFFFF),
(position_set_x, pos1, 500),
(position_set_y, pos1, 600),
(overlay_set_position, "$g_multiplayer_duel_start_counter_overlay", pos1),
(position_set_x, pos1, 2000),
(position_set_y, pos1, 2000),
(overlay_set_size, "$g_multiplayer_duel_start_counter_overlay", pos1),
(presentation_set_duration, 999999),
]),
(ti_on_presentation_run, [
(ge, "$g_multiplayer_duel_start_counter_overlay", 0),
(store_mission_timer_a, ":current_time"),
(store_sub, ":seconds_past_in_duel_start", ":current_time", "$g_multiplayer_duel_start_time"),
(store_sub, ":seconds_left_in_duel_start", 3, ":seconds_past_in_duel_start"),
(try_begin),
(le, ":seconds_left_in_duel_start", 0),
(presentation_set_duration, 0),
(else_try),
(neq, "$g_multiplayer_last_duel_start_counter_value", ":seconds_left_in_duel_start"),
(assign, "$g_multiplayer_last_duel_start_counter_value", ":seconds_left_in_duel_start"),
(assign, reg0, ":seconds_left_in_duel_start"),
(str_store_string, s0, "str_duel_starts_in_reg0_seconds"),
(overlay_set_text, "$g_multiplayer_duel_start_counter_overlay", s0),
(try_end),
]),
]),
("game_before_quit", 0, mesh_load_window,
[
(ti_on_presentation_load,
[
(try_begin),
(is_trial_version),
(set_fixed_point_multiplier, 1000),
(create_mesh_overlay, reg0, "mesh_quit_adv"),
(position_set_x, pos1, -1),
(position_set_y, pos1, -1),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1002),
(position_set_y, pos1, 1002),
(overlay_set_size, reg0, pos1),
(assign, "$g_game_before_quit_state", 0),
(presentation_set_duration, 999999),
(try_end),
]),
(ti_on_presentation_run,
[
(store_trigger_param_1, ":cur_time"),
(gt, ":cur_time", 500),
(try_begin),
(this_or_next|key_clicked, key_space),
(this_or_next|key_clicked, key_enter),
(this_or_next|key_clicked, key_escape),
(this_or_next|key_clicked, key_back_space),
(this_or_next|key_clicked, key_left_mouse_button),
(this_or_next|key_clicked, key_right_mouse_button),
(this_or_next|key_clicked, key_xbox_ltrigger),
(key_clicked, key_xbox_rtrigger),
(try_begin),
(eq, "$g_game_before_quit_state", 0),
(val_add, "$g_game_before_quit_state", 1),
(create_mesh_overlay, reg0, "mesh_quit_adv_b"),
(position_set_x, pos1, -1),
(position_set_y, pos1, -1),
(overlay_set_position, reg0, pos1),
(position_set_x, pos1, 1002),
(position_set_y, pos1, 1002),
(overlay_set_size, reg0, pos1),
(else_try),
(presentation_set_duration, 0),
(try_end),
(try_end),
]),
]),
]
import torch
#import lib.ransac_voting_gpu_layer.ransac_voting as ransac_voting
import ransac_voting_gpu_layer.ransac_voting as ransac_voting
import numpy as np
def log_msg(msg):
# with open('ransac.log', 'a') as f:
# f.write(msg+'\n')
pass
def ransac_voting_layer(mask, vertex, class_num, round_hyp_num, inlier_thresh=0.999, confidence=0.99, max_iter=20,
min_num=5,max_num=30000):
    '''
    :param mask: [b,h,w] integer class mask, 0 is background
    :param vertex: [b,h,w,vn,2] unit vectors voting for the vn keypoints
    :param class_num: number of classes, background included
    :param round_hyp_num: hypotheses generated per RANSAC round
    :param inlier_thresh: cosine-similarity threshold for counting a pixel as an inlier
    :param confidence: stop once the RANSAC success probability exceeds this value
    :return: [b,class_num-1,vn,2] winning keypoint locations
    '''
log_msg('ransac begin')
b,h,w,vn,_=vertex.shape
batch_win_pts=[]
for bi in range(b):
class_win_pts = []
hyp_num=0
for k in range(class_num-1):
cur_mask=mask[bi]==k+1
foreground=torch.sum(cur_mask)
log_msg('get sum')
# if too few points, just skip it
if foreground<min_num:
all_win_pts=torch.zeros([vn,2],dtype=torch.float32,device=mask.device)
class_win_pts.append(torch.unsqueeze(all_win_pts,0)) # [1,vn,2]
continue
# if too many inliers, we randomly down sample it
if foreground>max_num:
selection=torch.zeros(cur_mask.shape,dtype=torch.float32,device=mask.device).uniform_(0,1)
selected_mask=(selection<(max_num/foreground.float()))
cur_mask*=selected_mask
log_msg('test done')
coords=torch.nonzero(cur_mask).float() # [tn,2]
coords=coords[:,[1,0]]
log_msg('nonzero')
direct=vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask,2),3)) # [tn,vn,2]
direct=direct.view([coords.shape[0],vn,2])
log_msg('mask select')
tn=coords.shape[0]
idxs=torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
log_msg('random sample')
all_win_ratio=torch.zeros([vn],dtype=torch.float32,device=mask.device)
all_win_pts=torch.zeros([vn,2],dtype=torch.float32,device=mask.device)
log_msg('zeros')
cur_iter=0
while True:
# generate hypothesis
cur_hyp_pts=ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,vn,2]
log_msg('generate_hypothesis')
# voting for hypothesis
cur_inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, cur_hyp_pts, cur_inlier, inlier_thresh) # [hn,vn,tn]
log_msg('voting_for_hypothesis')
# find max
cur_inlier_counts=torch.sum(cur_inlier,2) # [hn,vn]
cur_win_counts,cur_win_idx=torch.max(cur_inlier_counts,0) # [vn]
cur_win_pts=cur_hyp_pts[cur_win_idx, torch.arange(vn)]
cur_win_ratio=cur_win_counts.float()/tn
log_msg('find max')
larger_mask=all_win_ratio<cur_win_ratio
all_win_pts[larger_mask,:]=cur_win_pts[larger_mask,:]
all_win_ratio[larger_mask]=cur_win_ratio[larger_mask]
log_msg('mask larger')
hyp_num+=round_hyp_num
cur_iter+=1
cur_min_ratio=torch.min(all_win_ratio)
# print('cur_min_ratio {} cur_confidence {}'.format(cur_min_ratio,(1-(1-cur_min_ratio**2)**hyp_num)))
log_msg('check condition')
if (1-(1-cur_min_ratio**2)**hyp_num)>confidence or cur_iter>max_iter:
break
class_win_pts.append(torch.unsqueeze(all_win_pts,0)) # [1,vn,2]
batch_win_pts.append(torch.unsqueeze(torch.cat(class_win_pts,0),0)) # [1,cn,vn,2]
log_msg('class append')
batch_win_pts=torch.cat(batch_win_pts,0)
log_msg('batch append')
return batch_win_pts
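
# The while-loops in this file stop once 1-(1-r**2)**hyp_num exceeds `confidence`,
# the standard adaptive-RANSAC stopping criterion (r is the worst per-keypoint
# inlier ratio; each hypothesis samples 2 pixels). A minimal stand-alone sketch
# of the same criterion, solved for the required hypothesis count. The helper
# name is ours, for illustration only; it is not used by the functions above.
def _min_hyp_num_for_confidence(inlier_ratio, confidence=0.99, sample_size=2):
    import math
    # probability that one hypothesis is drawn from inliers only
    p_good = inlier_ratio ** sample_size
    if p_good <= 0.0:
        return float('inf')
    if p_good >= 1.0:
        return 1
    # smallest n with 1 - (1 - p_good)**n > confidence
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_good))
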
def ransac_voting_layer_v2(mask, vertex, class_num, round_hyp_num, inlier_thresh=0.999, confidence=0.99, max_iter=20,
min_num=5,max_num=30000,refine_iter_num=1):
    '''
    Same as ransac_voting_layer, plus refine_iter_num rounds of least-squares
    refinement of the winners over their final inlier sets.
    :param mask: [b,h,w] integer class mask, 0 is background
    :param vertex: [b,h,w,vn,2] unit vectors voting for the vn keypoints
    :param class_num: number of classes, background included
    :param round_hyp_num: hypotheses generated per RANSAC round
    :param inlier_thresh: cosine-similarity threshold for counting a pixel as an inlier
    :return: [b,cn,vn,2]
    '''
log_msg('ransac begin')
b,h,w,vn,_=vertex.shape
batch_win_pts=[]
for bi in range(b):
class_win_pts = []
hyp_num=0
for k in range(class_num-1):
cur_mask=mask[bi]==k+1
foreground=torch.sum(cur_mask)
log_msg('get sum')
# if too few points, just skip it
if foreground<min_num:
all_win_pts=torch.zeros([vn,2],dtype=torch.float32,device=mask.device)
class_win_pts.append(torch.unsqueeze(all_win_pts,0)) # [1,vn,2]
continue
# if too many inliers, we randomly down sample it
if foreground>max_num:
selection=torch.zeros(cur_mask.shape,dtype=torch.float32,device=mask.device).uniform_(0,1)
selected_mask=(selection<(max_num/foreground.float()))
cur_mask*=selected_mask
log_msg('test done')
coords=torch.nonzero(cur_mask).float() # [tn,2]
coords=coords[:,[1,0]]
log_msg('nonzero')
direct=vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask,2),3)) # [tn,vn,2]
direct=direct.view([coords.shape[0],vn,2])
log_msg('mask select')
tn=coords.shape[0]
idxs=torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
log_msg('random sample')
all_win_ratio=torch.zeros([vn],dtype=torch.float32,device=mask.device)
all_win_pts=torch.zeros([vn,2],dtype=torch.float32,device=mask.device)
log_msg('zeros')
cur_iter=0
while True:
# generate hypothesis
cur_hyp_pts=ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,vn,2]
log_msg('generate_hypothesis')
# voting for hypothesis
cur_inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, cur_hyp_pts, cur_inlier, inlier_thresh) # [hn,vn,tn]
log_msg('voting_for_hypothesis')
# find max
cur_inlier_counts=torch.sum(cur_inlier,2) # [hn,vn]
cur_win_counts,cur_win_idx=torch.max(cur_inlier_counts,0) # [vn]
cur_win_pts=cur_hyp_pts[cur_win_idx, torch.arange(vn)]
cur_win_ratio=cur_win_counts.float()/tn
log_msg('find max')
larger_mask=all_win_ratio<cur_win_ratio
all_win_pts[larger_mask,:]=cur_win_pts[larger_mask,:]
all_win_ratio[larger_mask]=cur_win_ratio[larger_mask]
log_msg('mask larger')
hyp_num+=round_hyp_num
cur_iter+=1
cur_min_ratio=torch.min(all_win_ratio)
log_msg('check condition')
if (1-(1-cur_min_ratio**2)**hyp_num)>confidence or cur_iter>max_iter:
break
normal=torch.zeros_like(direct)
normal[:,:,0]=direct[:,:,1]
normal[:,:,1]=-direct[:,:,0]
# compute mean intersection again
for k in range(refine_iter_num):
all_inlier = torch.zeros([1, vn, tn], dtype=torch.uint8, device=mask.device)
all_win_pts=torch.unsqueeze(all_win_pts,0) # [1,vn,2]
ransac_voting.voting_for_hypothesis(direct, coords, all_win_pts, all_inlier, inlier_thresh) # [1,vn,tn]
log_msg('refine voting')
refine_pts=[]
for vi in range(vn):
cur_coords=coords[all_inlier[0,vi]] # in,2
if cur_coords.shape[0]==0:
refine_pts.append(torch.zeros([1,2]).cuda())
continue
cur_normal=normal[:,vi,:][all_inlier[0,vi]] # in,2
A=cur_normal # [cn,2]
                    b_vec=torch.sum(cur_normal*cur_coords,1) # [in]; renamed to avoid shadowing the batch size b
                    refine_pt=torch.matmul(torch.pinverse(A),b_vec) # [2]
refine_pts.append(torch.unsqueeze(refine_pt,0))
                    log_msg('inverse')
refine_pts=torch.cat(refine_pts,0)
all_win_pts=refine_pts
class_win_pts.append(torch.unsqueeze(all_win_pts,0)) # [1,vn,2]
batch_win_pts.append(torch.unsqueeze(torch.cat(class_win_pts,0),0)) # [1,cn,vn,2]
log_msg('class append')
batch_win_pts=torch.cat(batch_win_pts,0)
log_msg('batch append')
return batch_win_pts
def ransac_voting_hypothesis(mask, vertex, round_hyp_num, inlier_thresh=0.999, min_num=5, max_num=30000):
b, h, w, vn, _ = vertex.shape
all_hyp_pts,all_inlier_counts=[],[]
for bi in range(b):
k=0
cur_mask = mask[bi] == k + 1
foreground = torch.sum(cur_mask)
# if too few points, just skip it
if foreground < min_num:
cur_hyp_pts = torch.zeros([1, round_hyp_num, vn, 2], dtype=torch.float32, device=mask.device)
            all_hyp_pts.append(cur_hyp_pts) # [1,hn,vn,2]
cur_inlier_counts = torch.ones([1, round_hyp_num, vn], dtype=torch.int64, device=mask.device).long()
all_inlier_counts.append(cur_inlier_counts)
continue
# if too many inliers, we randomly down sample it
if foreground > max_num:
selection = torch.zeros(cur_mask.shape, dtype=torch.float32, device=mask.device).uniform_(0, 1)
selected_mask = (selection < (max_num / foreground.float()))
cur_mask *= selected_mask
coords = torch.nonzero(cur_mask).float() # [tn,2]
coords = coords[:, [1, 0]]
direct = vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask, 2), 3)) # [tn,vn,2]
direct = direct.view([coords.shape[0], vn, 2])
tn = coords.shape[0]
        idxs = torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
# generate hypothesis
cur_hyp_pts = ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,vn,2]
# voting for hypothesis
cur_inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, cur_hyp_pts, cur_inlier, inlier_thresh) # [hn,vn,tn]
cur_inlier_counts = torch.sum(cur_inlier, 2) # [hn,vn]
all_hyp_pts.append(torch.unsqueeze(cur_hyp_pts,0))
all_inlier_counts.append(torch.unsqueeze(cur_inlier_counts,0))
all_inlier_counts=torch.cat(all_inlier_counts, 0)
return torch.cat(all_hyp_pts,0), all_inlier_counts # [b,hn,vn,2] [b,hn,vn]
def estimate_voting_distribution(mask, vertex, round_hyp_num=256, min_hyp_num=4096, topk=128,
inlier_thresh=0.99, min_num=5, max_num=30000):
b, h, w, vn, _ = vertex.shape
all_hyp_pts,all_inlier_ratio=[],[]
for bi in range(b):
k=0
cur_mask = mask[bi] == k + 1
foreground = torch.sum(cur_mask)
# if too few points, just skip it
if foreground < min_num:
cur_hyp_pts = torch.zeros([1, round_hyp_num, vn, 2], dtype=torch.float32, device=mask.device).float()
            all_hyp_pts.append(cur_hyp_pts) # [1,hn,vn,2]
cur_inlier_ratio = torch.ones([1, round_hyp_num, vn], dtype=torch.int64, device=mask.device).float()
all_inlier_ratio.append(cur_inlier_ratio)
continue
# if too many inliers, we randomly down sample it
if foreground > max_num:
selection = torch.zeros(cur_mask.shape, dtype=torch.float32, device=mask.device).uniform_(0, 1)
selected_mask = (selection < (max_num / foreground.float()))
cur_mask *= selected_mask
foreground = torch.sum(cur_mask)
coords = torch.nonzero(cur_mask).float() # [tn,2]
coords = coords[:, [1, 0]]
direct = vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask, 2), 3)) # [tn,vn,2]
direct = direct.view([coords.shape[0], vn, 2])
tn = coords.shape[0]
round_num=np.ceil(min_hyp_num/round_hyp_num)
cur_hyp_pts=[]
cur_inlier_ratio=[]
for round_idx in range(int(round_num)):
idxs = torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
# generate hypothesis
hyp_pts = ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,vn,2]
# voting for hypothesis
inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, hyp_pts, inlier, inlier_thresh) # [hn,vn,tn]
inlier_ratio = torch.sum(inlier, 2) # [hn,vn]
inlier_ratio=inlier_ratio.float()/foreground.float() # ratio
cur_hyp_pts.append(hyp_pts)
cur_inlier_ratio.append(inlier_ratio)
cur_hyp_pts=torch.cat(cur_hyp_pts,0)
cur_inlier_ratio=torch.cat(cur_inlier_ratio,0)
all_hyp_pts.append(torch.unsqueeze(cur_hyp_pts,0))
all_inlier_ratio.append(torch.unsqueeze(cur_inlier_ratio,0))
all_hyp_pts=torch.cat(all_hyp_pts, 0) # b,hn,vn,2
all_inlier_ratio=torch.cat(all_inlier_ratio, 0) # b,hn,vn
all_hyp_pts=all_hyp_pts.permute(0,2,1,3) # b,vn,hn,2
all_inlier_ratio=all_inlier_ratio.permute(0,2,1) # b,vn,hn
values, indexes=torch.topk(all_inlier_ratio,topk,dim=2,sorted=False)
all_inlier_ratio=torch.zeros_like(all_inlier_ratio).scatter_(2, indexes, values)
weighted_pts=torch.unsqueeze(all_inlier_ratio,3)*all_hyp_pts
mean=torch.sum(weighted_pts,2)/torch.unsqueeze(torch.sum(all_inlier_ratio,2),2) # b,vn,2
diff_pts=all_hyp_pts-torch.unsqueeze(mean,2) # b,vn,hn,2
weighted_diff_pts = diff_pts * torch.unsqueeze(all_inlier_ratio, 3)
cov=torch.matmul(diff_pts.transpose(2,3), weighted_diff_pts) # b,vn,2,2
cov/=torch.unsqueeze(torch.unsqueeze(torch.sum(all_inlier_ratio,2),2),3) # b,vn,2,2
return mean,cov
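
# The tensor algebra above (weighted_pts / diff_pts / cov) is a weighted mean
# and weighted covariance over the hypothesis points, with the top-k inlier
# ratios as weights. A plain-Python sketch for a single keypoint; the helper
# name is ours, for illustration only, and it is not used by the code above.
def _weighted_mean_cov_2d(points, weights):
    # points: list of (x, y); weights: list of non-negative floats
    w_sum = sum(weights)
    mx = sum(w * x for w, (x, _) in zip(weights, points)) / w_sum
    my = sum(w * y for w, (_, y) in zip(weights, points)) / w_sum
    cxx = sum(w * (x - mx) ** 2 for w, (x, _) in zip(weights, points)) / w_sum
    cyy = sum(w * (y - my) ** 2 for w, (_, y) in zip(weights, points)) / w_sum
    cxy = sum(w * (x - mx) * (y - my) for w, (x, y) in zip(weights, points)) / w_sum
    return (mx, my), ((cxx, cxy), (cxy, cyy))
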
def estimate_voting_distribution_with_mean(mask, vertex, mean, round_hyp_num=256, min_hyp_num=4096, topk=128,
inlier_thresh=0.99, min_num=5, max_num=30000, output_hyp=False):
b, h, w, vn, _ = vertex.shape
all_hyp_pts,all_inlier_ratio=[],[]
for bi in range(b):
k=0
cur_mask = mask[bi] == k + 1
foreground = torch.sum(cur_mask)
# if too few points, just skip it
if foreground < min_num:
cur_hyp_pts = torch.zeros([1, min_hyp_num, vn, 2], dtype=torch.float32, device=mask.device).float()
            all_hyp_pts.append(cur_hyp_pts) # [1,hn,vn,2]
cur_inlier_ratio = torch.ones([1, min_hyp_num, vn], dtype=torch.int64, device=mask.device).float()
all_inlier_ratio.append(cur_inlier_ratio)
continue
# if too many inliers, we randomly down sample it
if foreground > max_num:
selection = torch.zeros(cur_mask.shape, dtype=torch.float32, device=mask.device).uniform_(0, 1)
selected_mask = (selection < (max_num / foreground.float()))
cur_mask *= selected_mask
foreground = torch.sum(cur_mask)
coords = torch.nonzero(cur_mask).float() # [tn,2]
coords = coords[:, [1, 0]]
direct = vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask, 2), 3)) # [tn,vn,2]
direct = direct.view([coords.shape[0], vn, 2])
tn = coords.shape[0]
round_num=np.ceil(min_hyp_num/round_hyp_num)
cur_hyp_pts=[]
cur_inlier_ratio=[]
for round_idx in range(int(round_num)):
idxs = torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
# generate hypothesis
hyp_pts = ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,vn,2]
# voting for hypothesis
inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, hyp_pts, inlier, inlier_thresh) # [hn,vn,tn]
inlier_ratio = torch.sum(inlier, 2) # [hn,vn]
inlier_ratio=inlier_ratio.float()/foreground.float() # ratio
cur_hyp_pts.append(hyp_pts)
cur_inlier_ratio.append(inlier_ratio)
cur_hyp_pts=torch.cat(cur_hyp_pts,0)
cur_inlier_ratio=torch.cat(cur_inlier_ratio,0)
all_hyp_pts.append(torch.unsqueeze(cur_hyp_pts,0))
all_inlier_ratio.append(torch.unsqueeze(cur_inlier_ratio,0))
all_hyp_pts=torch.cat(all_hyp_pts, 0) # b,hn,vn,2
all_inlier_ratio=torch.cat(all_inlier_ratio, 0) # b,hn,vn
# raw_hyp_pts=all_hyp_pts.permute(0,2,1,3).clone()
# raw_hyp_ratio=all_inlier_ratio.permute(0,2,1).clone()
all_hyp_pts=all_hyp_pts.permute(0,2,1,3) # b,vn,hn,2
all_inlier_ratio=all_inlier_ratio.permute(0,2,1) # b,vn,hn
thresh=torch.max(all_inlier_ratio,2)[0]-0.1 # b,vn
all_inlier_ratio[all_inlier_ratio<torch.unsqueeze(thresh,2)]=0.0
diff_pts=all_hyp_pts-torch.unsqueeze(mean,2) # b,vn,hn,2
weighted_diff_pts = diff_pts * torch.unsqueeze(all_inlier_ratio, 3)
cov=torch.matmul(diff_pts.transpose(2,3), weighted_diff_pts) # b,vn,2,2
cov/=torch.unsqueeze(torch.unsqueeze(torch.sum(all_inlier_ratio,2),2),3)+1e-3 # b,vn,2,2
# if output_hyp:
# return mean,cov,all_hyp_pts,all_inlier_ratio,raw_hyp_pts,raw_hyp_ratio
return mean, cov
def ransac_voting_vanish_point_layer(mask, vertex, class_num, round_hyp_num, inlier_thresh=0.999,
                                     confidence=0.99, max_iter=20, min_num=5,max_num=30000,refine_iter_num=1):
b,h,w,vn,_=vertex.shape
batch_win_pts=[]
for bi in range(b):
class_win_pts = []
hyp_num=0
for k in range(class_num-1):
cur_mask=mask[bi]==k+1
foreground=torch.sum(cur_mask)
# if too few points, just skip it
if foreground<min_num:
all_win_pts=torch.zeros([vn,2],dtype=torch.float32,device=mask.device)
class_win_pts.append(torch.unsqueeze(all_win_pts,0)) # [1,vn,2]
continue
# if too many inliers, we randomly down sample it
if foreground>max_num:
selection=torch.zeros(cur_mask.shape,dtype=torch.float32,device=mask.device).uniform_(0,1)
selected_mask=(selection<(max_num/foreground.float()))
cur_mask*=selected_mask
coords=torch.nonzero(cur_mask).float() # [tn,2]
coords=coords[:,[1,0]]
direct=vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask,2),3)) # [tn,vn,2]
direct=direct.view([coords.shape[0],vn,2])
tn=coords.shape[0]
idxs=torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
all_win_ratio=torch.zeros([vn],dtype=torch.float32,device=mask.device)
all_win_pts=torch.zeros([vn,3],dtype=torch.float32,device=mask.device)
cur_iter=0
while True:
# generate hypothesis
cur_hyp_pts=ransac_voting.generate_hypothesis_vanishing_point(direct, coords, idxs) # [hn,vn,3]
# voting for hypothesis
cur_inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis_vanishing_point(direct, coords, cur_hyp_pts,
cur_inlier, inlier_thresh) # [hn,vn,tn]
cur_hyp_pts/=torch.norm(cur_hyp_pts,2,2,keepdim=True)
# find max
cur_inlier_counts=torch.sum(cur_inlier,2) # [hn,vn]
cur_win_counts,cur_win_idx=torch.max(cur_inlier_counts,0) # [vn]
cur_win_pts=cur_hyp_pts[cur_win_idx, torch.arange(vn)]
cur_win_ratio=cur_win_counts.float()/tn
larger_mask=all_win_ratio<cur_win_ratio
all_win_pts[larger_mask,:]=cur_win_pts[larger_mask,:]
all_win_ratio[larger_mask]=cur_win_ratio[larger_mask]
hyp_num+=round_hyp_num
cur_iter+=1
cur_min_ratio=torch.min(all_win_ratio)
if (1-(1-cur_min_ratio**2)**hyp_num)>confidence or cur_iter>max_iter:
break
normal = torch.zeros_like(direct)
normal[:, :, 0] = direct[:, :, 1]
normal[:, :, 1] = -direct[:, :, 0]
# compute mean intersection again
for k in range(refine_iter_num):
all_inlier = torch.zeros([1, vn, tn], dtype=torch.uint8, device=mask.device)
all_win_pts = torch.unsqueeze(all_win_pts, 0) # [1,vn,3]
ransac_voting.voting_for_hypothesis_vanishing_point(direct, coords, all_win_pts, all_inlier,
inlier_thresh) # [1,vn,tn]
refine_pts = []
for vi in range(vn):
cur_coords = coords[all_inlier[0, vi]] # in,2
cur_normal = normal[:, vi, :][all_inlier[0, vi]] # in,2
H=torch.cat([-cur_normal,torch.unsqueeze(torch.sum(cur_normal*cur_coords,1),1)],1) # in,3
U, S, V=torch.svd(H,some=True) # 3,3
refine_pt=V[:,2:].transpose(0,1)
# correct direction
if (refine_pt[0,0]-refine_pt[0,2]*cur_coords[0,0])*(-cur_normal[0,1])<0:
refine_pt=-refine_pt
refine_pts.append(refine_pt)
refine_pts = torch.cat(refine_pts, 0)
all_win_pts = refine_pts
class_win_pts.append(torch.unsqueeze(all_win_pts, 0)) # [1,vn,2]
batch_win_pts.append(torch.unsqueeze(torch.cat(class_win_pts, 0), 0)) # [1,cn,vn,2]
log_msg('class append')
batch_win_pts = torch.cat(batch_win_pts, 0)
log_msg('batch append')
return batch_win_pts
def b_inv(b_mat):
'''
code from
https://stackoverflow.com/questions/46595157/how-to-apply-the-torch-inverse-function-of-pytorch-to-every-sample-in-the-batc
:param b_mat:
:return:
'''
eye = b_mat.new_ones(b_mat.size(-1)).diag().expand_as(b_mat)
# b_inv, _ = torch.gesv(eye, b_mat) # Updating the PyTorch Version
    try:
        b_inv, _ = torch.solve(eye, b_mat)  # deprecated in recent PyTorch; torch.linalg.solve is the replacement
    except RuntimeError: # Singularity
        b_inv = torch.pinverse(b_mat)
return b_inv
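
# In this file b_inv is only applied to [vn,2,2] normal-equation matrices, so
# each solve is a 2x2 inverse. The closed form it computes per sample, as a
# plain-Python sketch; the helper name is ours, for illustration only, and
# unlike the pinverse fallback above it does not handle singular matrices.
def _inv_2x2(a, b, c, d):
    # inverse of [[a, b], [c, d]], returned row-major
    det = a * d - b * c
    return (d / det, -b / det, -c / det, a / det)
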
def ransac_voting_layer_v3(mask, vertex, round_hyp_num, inlier_thresh=0.999, confidence=0.99, max_iter=20,
min_num=5, max_num=30000):
    '''
    :param mask: [b,h,w] binary foreground mask
    :param vertex: [b,h,w,vn,2] unit vectors voting for the vn keypoints
    :param round_hyp_num: hypotheses generated per RANSAC round
    :param inlier_thresh: cosine-similarity threshold for counting a pixel as an inlier
    :return: [b,vn,2]
    '''
b, h, w, vn, _ = vertex.shape
batch_win_pts = []
for bi in range(b):
hyp_num = 0
#cur_mask = (mask[bi]).byte()
cur_mask = (mask[bi]).bool()
foreground_num = torch.sum(cur_mask)
# if too few points, just skip it
if foreground_num < min_num:
win_pts = torch.zeros([1, vn, 2], dtype=torch.float32, device=mask.device)
batch_win_pts.append(win_pts) # [1,vn,2]
continue
# if too many inliers, we randomly down sample it
if foreground_num > max_num:
selection = torch.zeros(cur_mask.shape, dtype=torch.float32, device=mask.device).uniform_(0, 1)
selected_mask = (selection < (max_num / foreground_num.float()))
cur_mask *= selected_mask
coords = torch.nonzero(cur_mask).float() # [tn,2]
coords = coords[:, [1, 0]]
direct = vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask, 2), 3)) # [tn,vn,2]
direct = direct.view([coords.shape[0], vn, 2])
tn = coords.shape[0]
idxs = torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
all_win_ratio = torch.zeros([vn], dtype=torch.float32, device=mask.device)
all_win_pts = torch.zeros([vn, 2], dtype=torch.float32, device=mask.device)
cur_iter = 0
while True:
# generate hypothesis
cur_hyp_pts = ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,vn,2]
# voting for hypothesis
cur_inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, cur_hyp_pts, cur_inlier, inlier_thresh) # [hn,vn,tn]
# find max
cur_inlier_counts = torch.sum(cur_inlier, 2) # [hn,vn]
cur_win_counts, cur_win_idx = torch.max(cur_inlier_counts, 0) # [vn]
cur_win_pts = cur_hyp_pts[cur_win_idx, torch.arange(vn)]
cur_win_ratio = cur_win_counts.float() / tn
# update best point
larger_mask = all_win_ratio < cur_win_ratio
all_win_pts[larger_mask, :] = cur_win_pts[larger_mask, :]
all_win_ratio[larger_mask] = cur_win_ratio[larger_mask]
# check confidence
hyp_num += round_hyp_num
cur_iter += 1
cur_min_ratio = torch.min(all_win_ratio)
if (1 - (1 - cur_min_ratio ** 2) ** hyp_num) > confidence or cur_iter > max_iter:
break
# compute mean intersection again
normal = torch.zeros_like(direct) # [tn,vn,2]
normal[:, :, 0] = direct[:, :, 1]
normal[:, :, 1] = -direct[:, :, 0]
all_inlier = torch.zeros([1, vn, tn], dtype=torch.uint8, device=mask.device)
all_win_pts = torch.unsqueeze(all_win_pts, 0) # [1,vn,2]
ransac_voting.voting_for_hypothesis(direct, coords, all_win_pts, all_inlier, inlier_thresh) # [1,vn,tn]
# coords [tn,2] normal [vn,tn,2]
all_inlier=torch.squeeze(all_inlier.float(),0) # [vn,tn]
normal=normal.permute(1,0,2) # [vn,tn,2]
normal=normal*torch.unsqueeze(all_inlier,2) # [vn,tn,2] outlier is all zero
b=torch.sum(normal*torch.unsqueeze(coords,0),2) # [vn,tn]
ATA=torch.matmul(normal.permute(0,2,1),normal) # [vn,2,2]
ATb=torch.sum(normal*torch.unsqueeze(b,2),1) # [vn,2]
all_win_pts=torch.matmul(b_inv(ATA),torch.unsqueeze(ATb,2)) # [vn,2,1]
batch_win_pts.append(all_win_pts[None,:,:,0])
try:
batch_win_pts=torch.cat(batch_win_pts)
except RuntimeError: # Empty list
batch_win_pts=torch.empty((0,vn,2), device=mask.device)
return batch_win_pts
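# The refinement step above solves, per keypoint, the normal-equation system
# ATA @ x = ATb built from each inlier pixel's line normal (a pixel at `coord`
# with direction `direct` constrains n . x = n . coord, where n is `direct`
# rotated 90 degrees). A minimal NumPy sketch of that solve, on hypothetical
# hand-picked data rather than the CUDA voting path:

```python
import numpy as np

# Two pixels voting for the same 2D point: one line through (0,0) with
# direction (1,1) (i.e. y = x), one through (4,0) with direction (-1,1)
# (i.e. y = 4 - x); they intersect at (2,2).
coords = np.array([[0.0, 0.0], [4.0, 0.0]])             # [tn,2]
directs = np.array([[1.0, 1.0], [-1.0, 1.0]])           # [tn,2]
normals = np.stack([directs[:, 1], -directs[:, 0]], 1)  # rotate 90 degrees

b = np.sum(normals * coords, axis=1)  # [tn], n . coord per pixel
ATA = normals.T @ normals             # [2,2]
ATb = normals.T @ b                   # [2]
pt = np.linalg.solve(ATA, ATb)        # least-squares intersection
```

# With more than two (noisy) lines the same solve gives the point minimizing
# the summed squared distances to all inlier lines, which is what the batched
# torch.matmul(b_inv(ATA), ATb) above computes per keypoint.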
def ransac_voting_center(mask, vertex, round_hyp_num, inlier_thresh=0.99, confidence=0.999, max_iter=20, min_num=100):
'''
:param mask: [b,h,w]
:param vertex: [b,h,w,2]
:param round_hyp_num:
:param inlier_thresh:
:return: batch_instance_mask [b,h,w] max_instance_num [b]
'''
b, h, w, _ = vertex.shape
vn=1
batch_instance_mask = []
batch_instance_num = []
for bi in range(b):
hyp_num = 0
cur_mask = (mask[bi]).byte()
foreground_num = torch.sum(cur_mask)
# if too few points, just skip it
if foreground_num < min_num:
instance_mask = torch.zeros([h,w], dtype=torch.float32, device=mask.device)
batch_instance_mask.append(instance_mask)
batch_instance_num.append(0)
continue
coords_int = torch.nonzero(cur_mask)
coords = coords_int.float() # [tn,2]
coords = coords[:, [1, 0]]
direct = vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask, 2), 3)) # [tn,2]
direct = direct.view([coords.shape[0], 1, 2])
tn = coords.shape[0]
idxs = torch.zeros([round_hyp_num, 1, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
all_win_ratio = torch.zeros([1], dtype=torch.float32, device=mask.device)
all_win_pts = torch.zeros([1, 2], dtype=torch.float32, device=mask.device)
cur_iter = 0
while True:
# generate hypothesis
cur_hyp_pts = ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,1,2]
# voting for hypothesis
cur_inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, cur_hyp_pts, cur_inlier, inlier_thresh) # [hn,1,tn]
# find max
cur_inlier_counts = torch.sum(cur_inlier, 2) # [hn,1]
cur_win_counts, cur_win_idx = torch.max(cur_inlier_counts, 0) # [1]
cur_win_pts = cur_hyp_pts[cur_win_idx, torch.arange(vn)]
cur_win_ratio = cur_win_counts.float() / tn
# update best point
larger_mask = all_win_ratio < cur_win_ratio
all_win_pts[larger_mask, :] = cur_win_pts[larger_mask, :]
all_win_ratio[larger_mask] = cur_win_ratio[larger_mask]
# check confidence
hyp_num += round_hyp_num
cur_iter += 1
cur_min_ratio = torch.min(all_win_ratio)
if (1 - (1 - cur_min_ratio ** 2) ** hyp_num) > confidence or cur_iter > max_iter:
break
# compute mean intersection again
all_inlier = torch.zeros([1, vn, tn], dtype=torch.uint8, device=mask.device)
all_win_pts = torch.unsqueeze(all_win_pts, 0)
ransac_voting.voting_for_hypothesis(direct, coords, all_win_pts, all_inlier, inlier_thresh)
all_inlier=torch.squeeze(all_inlier.float(),0) # [tn]
# NOTE: assembling batch_instance_mask from all_inlier is left unimplemented;
# only the too-few-points branch above appends to batch_instance_mask.
return batch_instance_mask
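# The stopping rule used by every loop here, (1 - (1 - r**2) ** N) > confidence,
# is the standard RANSAC bound: with inlier ratio r and 2-sample hypotheses,
# it is the probability that at least one of N hypotheses was all-inlier.
# A small, hypothetical helper (not part of this module) inverting it to get
# the hypothesis count needed:

```python
import math

def hypotheses_needed(inlier_ratio, confidence=0.99, sample_size=2):
    """Smallest N with 1 - (1 - r**s)**N > confidence (standard RANSAC bound)."""
    good = inlier_ratio ** sample_size  # P(one hypothesis is all-inlier)
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - good))

n = hypotheses_needed(0.5, confidence=0.99)  # ~17 hypotheses at 50% inliers
```

# This is why the loops also cap cur_iter at max_iter: at low inlier ratios
# the required N grows quickly and the bound alone would never terminate.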
def ransac_voting_layer_v4(mask, vertex, round_hyp_num, inlier_thresh=0.99, confidence=0.999, max_iter=20,
min_num=5, max_num=30000):
'''
:param mask: [b,h,w]
:param vertex: [b,h,w,vn,2]
:param round_hyp_num:
:param inlier_thresh:
:return: [b,vn,2]
'''
b, h, w, vn, _ = vertex.shape
batch_win_pts = []
batch_var = []
for bi in range(b):
hyp_num = 0
cur_mask = (mask[bi]).byte()
foreground_num = torch.sum(cur_mask)
# if too few points, just skip it
if foreground_num < min_num:
win_pts = torch.zeros([1, vn, 2], dtype=torch.float32, device=mask.device)
batch_win_pts.append(win_pts) # [1,vn,2]
batch_var.append(torch.ones([1,vn],dtype=torch.float32, device=mask.device))
continue
# if too many inliers, we randomly down sample it
if foreground_num > max_num:
selection = torch.zeros(cur_mask.shape, dtype=torch.float32, device=mask.device).uniform_(0, 1)
selected_mask = (selection < (max_num / foreground_num.float()))
cur_mask *= selected_mask
coords = torch.nonzero(cur_mask).float() # [tn,2]
coords = coords[:, [1, 0]]
direct = vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask, 2), 3)) # [tn,vn,2]
direct = direct.view([coords.shape[0], vn, 2])
tn = coords.shape[0]
idxs = torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
all_win_ratio = torch.zeros([vn], dtype=torch.float32, device=mask.device)
all_win_pts = torch.zeros([vn, 2], dtype=torch.float32, device=mask.device)
cur_iter = 0
while True:
# generate hypothesis
cur_hyp_pts = ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,vn,2]
# voting for hypothesis
cur_inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, cur_hyp_pts, cur_inlier, inlier_thresh) # [hn,vn,tn]
# find max
cur_inlier_counts = torch.sum(cur_inlier, 2) # [hn,vn]
cur_win_counts, cur_win_idx = torch.max(cur_inlier_counts, 0) # [vn]
cur_win_pts = cur_hyp_pts[cur_win_idx, torch.arange(vn)]
cur_win_ratio = cur_win_counts.float() / tn
# update best point
larger_mask = all_win_ratio < cur_win_ratio
all_win_pts[larger_mask, :] = cur_win_pts[larger_mask, :]
all_win_ratio[larger_mask] = cur_win_ratio[larger_mask]
# check confidence
hyp_num += round_hyp_num
cur_iter += 1
cur_min_ratio = torch.min(all_win_ratio)
if (1 - (1 - cur_min_ratio ** 2) ** hyp_num) > confidence or cur_iter > max_iter:
break
# compute mean intersection again
normal = torch.zeros_like(direct) # [tn,vn,2]
normal[:, :, 0] = direct[:, :, 1]
normal[:, :, 1] = -direct[:, :, 0]
all_inlier = torch.zeros([1, vn, tn], dtype=torch.uint8, device=mask.device)
all_win_pts = torch.unsqueeze(all_win_pts, 0) # [1,vn,2]
ransac_voting.voting_for_hypothesis(direct, coords, all_win_pts, all_inlier, inlier_thresh) # [1,vn,tn]
# coords [tn,2] normal [vn,tn,2]
all_inlier=torch.squeeze(all_inlier.float(),0) # [vn,tn]
normal=normal.permute(1,0,2) # [vn,tn,2]
normal=normal*torch.unsqueeze(all_inlier,2) # [vn,tn,2] outlier is all zero
b=torch.sum(normal*torch.unsqueeze(coords,0),2) # [vn,tn]
ATA=torch.matmul(normal.permute(0,2,1),normal) # [vn,2,2]
ATb=torch.sum(normal*torch.unsqueeze(b,2),1) # [vn,2]
all_win_pts=torch.matmul(b_inv(ATA),torch.unsqueeze(ATb,2)) # [vn,2,1]
residual=torch.matmul(normal,all_win_pts)[:,:,0]-b # [vn,tn]
var=torch.sum(residual**2,1)/torch.sum(all_inlier,1) # [vn]
batch_win_pts.append(all_win_pts[None,:,:,0])
batch_var.append(var[None,:])
batch_win_pts=torch.cat(batch_win_pts)
batch_var=torch.cat(batch_var)
return batch_win_pts, batch_var
def ransac_voting_layer_v5(mask, vertex, round_hyp_num, inlier_thresh=0.999, confidence=0.99, max_iter=20,
min_num=5, max_num=100):
'''
:param mask: [b,h,w]
:param vertex: [b,h,w,vn,2]
:param round_hyp_num:
:param inlier_thresh:
:return: [b,vn,2] [b,vn,2,2]
'''
b, h, w, vn, _ = vertex.shape
batch_win_pts, batch_confidence = [], []
for bi in range(b):
hyp_num = 0
cur_mask = (mask[bi]).byte()
foreground_num = torch.sum(cur_mask)
# if too few points, just skip it
if foreground_num < min_num:
win_pts = torch.zeros([1, vn, 2], dtype=torch.float32, device=mask.device)
pts_conf = torch.zeros([1, vn], dtype=torch.float32, device=mask.device)
batch_win_pts.append(win_pts) # [1,vn,2]
batch_confidence.append(pts_conf) # [1, vn]
continue
# if too many inliers, we randomly down sample it
if foreground_num > max_num:
selection = torch.zeros(cur_mask.shape, dtype=torch.float32, device=mask.device).uniform_(0, 1)
selected_mask = (selection < (max_num / foreground_num.float()))
cur_mask *= selected_mask
# print(torch.sum(cur_mask))
coords = torch.nonzero(cur_mask).float() # [tn,2]
coords = coords[:, [1, 0]]
direct = vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask, 2), 3)) # [tn,vn,2]
direct = direct.view([coords.shape[0], vn, 2])
tn = coords.shape[0]
idxs = torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
all_win_ratio = torch.zeros([vn], dtype=torch.float32, device=mask.device)
all_win_pts = torch.zeros([vn, 2], dtype=torch.float32, device=mask.device)
cur_iter = 0
while True:
# generate hypothesis
cur_hyp_pts = ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,vn,2]
# voting for hypothesis
cur_inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, cur_hyp_pts, cur_inlier, inlier_thresh) # [hn,vn,tn]
# find max
cur_inlier_counts = torch.sum(cur_inlier, 2) # [hn,vn]
cur_win_counts, cur_win_idx = torch.max(cur_inlier_counts, 0) # [vn]
cur_win_pts = cur_hyp_pts[cur_win_idx, torch.arange(vn)]
cur_win_ratio = cur_win_counts.float() / tn
# update best point
larger_mask = all_win_ratio < cur_win_ratio
all_win_pts[larger_mask, :] = cur_win_pts[larger_mask, :]
all_win_ratio[larger_mask] = cur_win_ratio[larger_mask]
# check confidence
hyp_num += round_hyp_num
cur_iter += 1
cur_min_ratio = torch.min(all_win_ratio)
if (1 - (1 - cur_min_ratio ** 2) ** hyp_num) > confidence or cur_iter > max_iter:
break
# compute mean intersection again
normal = torch.zeros_like(direct) # [tn,vn,2]
normal[:, :, 0] = direct[:, :, 1]
normal[:, :, 1] = -direct[:, :, 0]
all_inlier = torch.zeros([1, vn, tn], dtype=torch.uint8, device=mask.device)
all_win_pts = torch.unsqueeze(all_win_pts, 0) # [1,vn,2]
ransac_voting.voting_for_hypothesis(direct, coords, all_win_pts, all_inlier, inlier_thresh) # [1,vn,tn]
# coords [tn,2] normal [vn,tn,2]
all_inlier=torch.squeeze(all_inlier.float(),0) # [vn,tn]
normal=normal.permute(1,0,2) # [vn,tn,2]
normal=normal*torch.unsqueeze(all_inlier,2) # [vn,tn,2] outlier is all zero
b=torch.sum(normal*torch.unsqueeze(coords,0),2) # [vn,tn]
ATA=torch.matmul(normal.permute(0,2,1),normal) # [vn,2,2]
ATb=torch.sum(normal*torch.unsqueeze(b,2),1) # [vn,2]
all_win_pts=torch.matmul(b_inv(ATA),torch.unsqueeze(ATb,2)) # [vn,2,1]
all_inlier=torch.zeros([1, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, torch.unsqueeze(all_win_pts[:,:,0], 0), all_inlier, 0.999)
pts_conf=torch.sum(all_inlier.int(),2).float()/tn # [1,vn]
batch_win_pts.append(all_win_pts[None,:,:,0])
batch_confidence.append(pts_conf)
batch_win_pts=torch.cat(batch_win_pts)
batch_confidence=torch.cat(batch_confidence)
return batch_win_pts, batch_confidence
def ransac_voting_layer_v6(mask, vertex, round_hyp_num, inlier_thresh=0.999, confidence=0.99, max_iter=20,
min_num=5, max_num=100):
'''
:param mask: [b,h,w]
:param vertex: [b,h,w,vn,2]
:param round_hyp_num:
:param inlier_thresh:
:return: [b,vn,2] [b,vn,2,2]
'''
b, h, w, vn, _ = vertex.shape
batch_win_pts, batch_confidence = [], []
for bi in range(b):
hyp_num = 0
cur_mask = (mask[bi]).byte()
foreground_num = torch.sum(cur_mask)
# if too few points, just skip it
if foreground_num < min_num:
win_pts = torch.zeros([1, vn, 2], dtype=torch.float32, device=mask.device)
pts_conf = torch.zeros([1, vn], dtype=torch.float32, device=mask.device)
batch_win_pts.append(win_pts) # [1,vn,2]
batch_confidence.append(pts_conf) # [1, vn]
continue
# if too many inliers, we randomly down sample it
if foreground_num > max_num:
selection = torch.zeros(cur_mask.shape, dtype=torch.float32, device=mask.device).uniform_(0, 1)
selected_mask = (selection < (max_num / foreground_num.float()))
cur_mask *= selected_mask
coords = torch.nonzero(cur_mask).float() # [tn,2]
coords = coords[:, [1, 0]]
direct = vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask, 2), 3)) # [tn,vn,2]
direct = direct.view([coords.shape[0], vn, 2])
tn = coords.shape[0]
idxs = torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
all_win_ratio = torch.zeros([vn], dtype=torch.float32, device=mask.device)
all_win_pts = torch.zeros([vn, 2], dtype=torch.float32, device=mask.device)
cur_iter = 0
while True:
# generate hypothesis
cur_hyp_pts = ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,vn,2]
# voting for hypothesis
cur_inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, cur_hyp_pts, cur_inlier, inlier_thresh) # [hn,vn,tn]
# find max
cur_inlier_counts = torch.sum(cur_inlier, 2) # [hn,vn]
cur_win_counts, cur_win_idx = torch.max(cur_inlier_counts, 0) # [vn]
cur_win_pts = cur_hyp_pts[cur_win_idx, torch.arange(vn)]
cur_win_ratio = cur_win_counts.float() / tn
# update best point
larger_mask = all_win_ratio < cur_win_ratio
all_win_pts[larger_mask, :] = cur_win_pts[larger_mask, :]
all_win_ratio[larger_mask] = cur_win_ratio[larger_mask]
# check confidence
hyp_num += round_hyp_num
cur_iter += 1
cur_min_ratio = torch.min(all_win_ratio)
if (1 - (1 - cur_min_ratio ** 2) ** hyp_num) > confidence or cur_iter > max_iter:
break
# compute mean intersection again
normal = torch.zeros_like(direct) # [tn,vn,2]
normal[:, :, 0] = direct[:, :, 1]
normal[:, :, 1] = -direct[:, :, 0]
all_inlier = torch.zeros([1, vn, tn], dtype=torch.uint8, device=mask.device)
all_win_pts = torch.unsqueeze(all_win_pts, 0) # [1,vn,2]
ransac_voting.voting_for_hypothesis(direct, coords, all_win_pts, all_inlier, inlier_thresh) # [1,vn,tn]
# coords [tn,2] normal [vn,tn,2]
all_inlier=torch.squeeze(all_inlier.float(),0) # [vn,tn]
normal=normal.permute(1,0,2) # [vn,tn,2]
normal=normal*torch.unsqueeze(all_inlier,2) # [vn,tn,2] outlier is all zero
b=torch.sum(normal*torch.unsqueeze(coords,0),2) # [vn,tn]
ATA=torch.matmul(normal.permute(0,2,1),normal) # [vn,2,2]
ATb=torch.sum(normal*torch.unsqueeze(b,2),1) # [vn,2]
all_win_pts=torch.matmul(b_inv(ATA),torch.unsqueeze(ATb,2)) # [vn,2,1]
all_inlier=torch.zeros([1, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, torch.unsqueeze(all_win_pts[:,:,0], 0), all_inlier, 0.999)
pts_conf=torch.sum(all_inlier.int(),2).float()/tn # [1,vn]
batch_win_pts.append(all_win_pts[None,:,:,0])
batch_confidence.append(pts_conf)
batch_win_pts=torch.cat(batch_win_pts)
batch_confidence=torch.cat(batch_confidence)
return batch_win_pts, batch_confidence
def ransac_motion_voting(mask, vertex):
'''
:param mask: b,h,w
:param vertex: b,h,w,vn,2
:return:
'''
b, h, w, vn, _ = vertex.shape
pts=[]
for bi in range(b):
cur_mask=mask[bi].byte()
coords=torch.nonzero(cur_mask).float()
if coords.shape[0]<1:
pts.append(torch.zeros([1,vn,2],dtype=torch.float32,device=vertex.device))
continue
coords=coords[:,(1,0)]
cur_vert=vertex[bi]
cur_vert=cur_vert[cur_mask]+torch.unsqueeze(coords,1)
pt=torch.mean(cur_vert,0)
pts.append(torch.unsqueeze(pt,0))
return torch.cat(pts,0)
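# ransac_motion_voting skips RANSAC entirely: it shifts every foreground pixel
# by its predicted per-keypoint offset and averages the shifted positions.
# The same idea in NumPy on a hypothetical 2x2 toy mask (vn = 1 keypoint):

```python
import numpy as np

mask = np.array([[1, 0], [0, 1]], dtype=bool)  # [h,w], two foreground pixels
vertex = np.zeros((2, 2, 1, 2))                # [h,w,vn,2] predicted offsets
vertex[0, 0, 0] = [1.0, 1.0]                   # pixel (x=0,y=0) points to (1,1)
vertex[1, 1, 0] = [0.0, 0.0]                   # pixel (x=1,y=1) points to (1,1)

ys, xs = np.nonzero(mask)
coords = np.stack([xs, ys], 1).astype(float)   # [tn,2] in (x,y) order
votes = vertex[mask] + coords[:, None, :]      # [tn,vn,2] shifted positions
pt = votes.mean(axis=0)                        # [vn,2] mean vote
```

# Both pixels vote for the same location, so the mean lands exactly on it;
# with noisy offsets this degrades more gracefully than a single pixel's vote
# but, unlike the RANSAC variants above, has no outlier rejection.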
def generate_hypothesis(mask, vertex, round_hyp_num, inlier_thresh=0.999, confidence=0.99, max_iter=20,
min_num=5, max_num=30000):
'''
:param mask: [b,h,w]
:param vertex: [b,h,w,vn,2]
:param round_hyp_num:
:param inlier_thresh:
:return: [b,vn,2]
'''
b, h, w, vn, _ = vertex.shape
batch_hyp_pts = []
batch_hyp_counts = []
for bi in range(b):
hyp_num = 0
cur_mask = (mask[bi]).byte()
foreground_num = torch.sum(cur_mask)
# if too few points, just skip it
if foreground_num < min_num:
hyp_pts = torch.zeros([round_hyp_num, vn, 2], dtype=torch.float32, device=mask.device)
batch_hyp_pts.append(hyp_pts) # [hn,vn,2]
batch_hyp_counts.append(torch.zeros([round_hyp_num, vn], dtype=torch.int64, device=mask.device))
continue
# if too many inliers, we randomly down sample it
if foreground_num > max_num:
selection = torch.zeros(cur_mask.shape, dtype=torch.float32, device=mask.device).uniform_(0, 1)
selected_mask = (selection < (max_num / foreground_num.float()))
cur_mask *= selected_mask
coords = torch.nonzero(cur_mask).float() # [tn,2]
coords = coords[:, [1, 0]]
direct = vertex[bi].masked_select(torch.unsqueeze(torch.unsqueeze(cur_mask, 2), 3)) # [tn,vn,2]
direct = direct.view([coords.shape[0], vn, 2])
tn = coords.shape[0]
idxs = torch.zeros([round_hyp_num, vn, 2], dtype=torch.int32, device=mask.device).random_(0, direct.shape[0])
all_win_ratio = torch.zeros([vn], dtype=torch.float32, device=mask.device)
all_win_pts = torch.zeros([vn, 2], dtype=torch.float32, device=mask.device)
# generate hypothesis
cur_hyp_pts = ransac_voting.generate_hypothesis(direct, coords, idxs) # [hn,vn,2]
# voting for hypothesis
cur_inlier = torch.zeros([round_hyp_num, vn, tn], dtype=torch.uint8, device=mask.device)
ransac_voting.voting_for_hypothesis(direct, coords, cur_hyp_pts, cur_inlier, inlier_thresh) # [hn,vn,tn]
# find max
cur_inlier_counts = torch.sum(cur_inlier, 2) # [hn,vn]
batch_hyp_pts.append(cur_hyp_pts)
batch_hyp_counts.append(cur_inlier_counts)
return torch.stack(batch_hyp_pts), torch.stack(batch_hyp_counts)
if __name__=="__main__":
from lib.datasets.linemod_dataset import LineModDatasetRealAug,VotingType
from lib.utils.data_utils import LineModImageDB
from lib.utils.draw_utils import imagenet_to_uint8
import numpy as np
train_set = LineModDatasetRealAug(LineModImageDB('cat',has_fuse_set=False,has_ms_set=False).real_set)
rgb, mask, vertex, vertex_weight, pose, gt_corners = train_set[np.random.randint(0,len(train_set)),480,640]
h,w=mask.shape
mask_0=torch.unsqueeze(mask, 0)
# mask=torch.cat([mask_0,mask_0],0).cuda().int().contiguous()
vertex=vertex.cuda() # [16,h,w]
vertex=vertex.permute(1,2,0).view(h,w,8,2)
vertex_0=torch.unsqueeze(vertex, 0)
# vertex=torch.cat([vertex_0,vertex_0],0).cuda().float().contiguous()
vt_corners=ransac_voting_layer_v3(mask_0.cuda(),vertex_0.cuda(),500) # [1,1,8,2]
print(vt_corners.shape)
vt_corners=vt_corners.cpu().numpy()[0]
gt_corners=gt_corners.numpy()[:,:2]
print(vt_corners)
print(gt_corners)
print(vt_corners-gt_corners)
import matplotlib.pyplot as plt
plt.imshow(imagenet_to_uint8(rgb.cpu().numpy()))
plt.plot(vt_corners[:,0],vt_corners[:,1],'*')
plt.plot(gt_corners[:,0],gt_corners[:,1],'*')
plt.show()
# vote for vanishing point
# train_set = LineModDatasetRealAug(LineModImageDB('cat',has_fuse_set=False,has_ms_set=False).real_set,vote_type=VotingType.VanPts)
#
# for k in np.random.choice(np.arange(len(train_set)),100):
# rgb, mask, vertex, vertex_weight, pose, van_pts = train_set[np.random.randint(k, len(train_set)), 480, 640]
#
# h,w=mask.shape
# mask_0=torch.unsqueeze(mask, 0)
# vn,h,w=vertex.shape
# vn//=2
# vertex=vertex.cuda() # [16,h,w]
# vertex=vertex.permute(1,2,0).view(h,w,vn,2)
# vertex_0=torch.unsqueeze(vertex, 0)
# # vertex=torch.cat([vertex_0,vertex_0],0).cuda().float().contiguous()
# vt_van_pts=ransac_voting_vanish_point_layer(mask_0.cuda(), vertex_0.cuda(), 2, 500, 0.999, 0.99, 20, 5, 30000, 0) # [1,1,8,2]
#
# vt_van_pts=vt_van_pts.cpu().numpy()[0, 0]
# van_pts= van_pts.numpy()
#
# ratio=vt_van_pts/van_pts
# try:
# assert(np.sum(ratio<0)==0)
# assert(np.sum(np.abs(ratio[:,0]-ratio[:,1])>1e-5)==0)
# assert(np.sum(np.abs(ratio[:,1]-ratio[:,2])>1e-5)==0)
# except AssertionError:
# print(ratio)
#
# print(k)
# assert(np.sum(np.abs(ratio[:,2]-ratio[:,3])>1e-6)==0)
# import matplotlib.pyplot as plt
# plt.imshow(imagenet_to_uint8(rgb.cpu().numpy()))
# plt.plot(vt_van_pts[:, 0], vt_van_pts[:, 1], '*')
# plt.plot(van_pts[:, 0], van_pts[:, 1], '*')
# plt.show()
# --- aud/generated/audSkel.py (repo: merikesh/aud, license: MIT) ---
] | 1 | 2019-05-12T16:12:16.000Z | 2019-05-12T16:12:16.000Z | from .base import Prim, Attribute, Property
class SkelRoot(Prim):
"""
Boundable prim type used to identify a scope beneath which skeletally-posed primitives are defined.
A SkelRoot must be defined at or above a skinned primitive for any skinning
behaviors in UsdSkel.
See the extended "Skel Root Schema" documentation for
more information.
Extent is a three dimensional range measuring the geometric extent of the authored gprim in its own local space (i.e. its own
transform not applied), without accounting for any shader-induced
displacement. Whenever any geometry-affecting attribute is authored
for any gprim in a layer, extent must also be authored at the same
timesample; failure to do so will result in incorrect bounds-computation.
\\sa \\ref UsdGeom_Boundable_Extent.
An authored extent on a prim which has children is expected to include
the extent of all children, as they will be pruned from BBox computation
during traversal.
"""
as_type = "SkelRoot"
proxyPrim = Attribute(
name = 'proxyPrim',
as_type = 'rel',
docstring = """,
The proxyPrim relationship allows us to link a prim whose purpose is "render" to its (single target)
purpose="proxy" prim. This is entirely optional, but can be
useful in several scenarios:
- In a pipeline that does pruning (for complexity management)
by deactivating prims composed from asset references, when we
deactivate a purpose="render" prim, we will be able to discover
and additionally deactivate its associated purpose="proxy" prim,
so that preview renders reflect the pruning accurately.
- DCC importers may be able to make more aggressive optimizations
for interactive processing and display if they can discover the proxy
for a given render prim.
- With a little more work, a Hydra-based application will be able
to map a picked proxy prim back to its render geometry for selection.
\\note It is only valid to author the proxyPrim relationship on
prims whose purpose is "render".
""",
)
purpose = Attribute(
name = 'purpose',
as_type = 'token',
value = "default",
is_uniform = True,
allowedTokens = ["default", "render", "proxy", "guide"],
docstring = """,
Purpose is a concept we have found useful in our pipeline for classifying geometry into categories that can each be independently
included or excluded from traversals of prims on a stage, such as
rendering or bounding-box computation traversals. The fallback
purpose, default indicates that a prim has "no special purpose"
and should generally be included in all traversals. Subtrees rooted
at a prim with purpose render should generally only be included
when performing a "final quality" render. Subtrees rooted at a prim
with purpose proxy should generally only be included when
performing a lightweight proxy render (such as openGL). Finally,
subtrees rooted at a prim with purpose guide should generally
only be included when an interactive application has been explicitly
asked to "show guides".
In the previous paragraph, when we say "subtrees rooted at a prim",
we mean the most ancestral or tallest subtree that has an authored,
non-default opinion. If the purpose of </RootPrim> is set to
"render", then the effective purpose of </RootPrim/ChildPrim> will
be "render" even if that prim has a different authored value for
purpose. <b>See ComputePurpose() for details of how purpose
inherits down namespace</b>.
As demonstrated in UsdGeomBBoxCache, a traverser should be ready to
accept combinations of included purposes as an input.
Purpose render can be useful in creating "light blocker"
geometry for raytracing interior scenes. Purposes render and
proxy can be used together to partition a complicated model
into a lightweight proxy representation for interactive use, and a
fully realized, potentially quite heavy, representation for rendering.
One can use UsdVariantSets to create proxy representations, but doing
so requires that we recompose parts of the UsdStage in order to change
to a different runtime level of detail, and that does not interact
well with the needs of multithreaded rendering. Purpose provides us with
a better tool for dynamic, interactive complexity management.
""",
)
visibility = Attribute(
name = 'visibility',
as_type = 'token',
value = "inherited",
allowedTokens = ["inherited", "invisible"],
docstring = """,
Visibility is meant to be the simplest form of "pruning" visibility that is supported by most DCC apps. Visibility is
animatable, allowing a sub-tree of geometry to be present for some
segment of a shot, and absent from others; unlike the action of
deactivating geometry prims, invisible geometry is still
available for inspection, for positioning, for defining volumes, etc.
""",
)
xformOpOrder = Attribute(
name = 'xformOpOrder',
as_type = 'token[]',
is_uniform = True,
docstring = """,
Encodes the sequence of transformation operations in the order in which they should be pushed onto a transform stack while
visiting a UsdStage's prims in a graph traversal that will effect
the desired positioning for this prim and its descendant prims.
You should rarely, if ever, need to manipulate this attribute directly.
It is managed by the AddXformOp(), SetResetXformStack(), and
SetXformOpOrder(), and consulted by GetOrderedXformOps() and
GetLocalTransformation().
""",
)
class Skeleton(Prim):
"""
Describes a skeleton.
See the extended "Skeleton Schema" documentation for
more information.
"""
as_type = "Skeleton"
bindTransforms = Attribute(
name = 'bindTransforms',
as_type = 'matrix4d[]',
is_uniform = True,
docstring = """,
Specifies the bind-pose transforms of each joint in **world space**, in the ordering imposed by *joints*.
Extent is a three dimensional range measuring the geometric extent of the authored gprim in its own local space (i.e. its own
transform not applied), without accounting for any shader-induced
displacement. Whenever any geometry-affecting attribute is authored
for any gprim in a layer, extent must also be authored at the same
timesample; failure to do so will result in incorrect bounds-computation.
\\sa \\ref UsdGeom_Boundable_Extent.
An authored extent on a prim which has children is expected to include
the extent of all children, as they will be pruned from BBox computation
during traversal.
""",
)
joints = Attribute(
name = 'joints',
as_type = 'token[]',
is_uniform = True,
docstring = """,
An array of path tokens identifying the set of joints that make up the skeleton, and their order. Each token in the array must be valid
when parsed as an SdfPath. The parent-child relationships of the
corresponding paths determine the parent-child relationships of each
joint.
""",
)
proxyPrim = Attribute(
name = 'proxyPrim',
as_type = 'rel',
docstring = """,
The proxyPrim relationship allows us to link a prim whose purpose is "render" to its (single target)
purpose="proxy" prim. This is entirely optional, but can be
useful in several scenarios:
- In a pipeline that does pruning (for complexity management)
by deactivating prims composed from asset references, when we
deactivate a purpose="render" prim, we will be able to discover
and additionally deactivate its associated purpose="proxy" prim,
so that preview renders reflect the pruning accurately.
- DCC importers may be able to make more aggressive optimizations
for interactive processing and display if they can discover the proxy
for a given render prim.
- With a little more work, a Hydra-based application will be able
to map a picked proxy prim back to its render geometry for selection.
\\note It is only valid to author the proxyPrim relationship on
prims whose purpose is "render".
""",
)
purpose = Attribute(
name = 'purpose',
as_type = 'token',
value = "default",
is_uniform = True,
allowedTokens = ["default", "render", "proxy", "guide"],
docstring = """,
Purpose is a concept we have found useful in our pipeline for classifying geometry into categories that can each be independently
included or excluded from traversals of prims on a stage, such as
rendering or bounding-box computation traversals. The fallback
purpose, default indicates that a prim has "no special purpose"
and should generally be included in all traversals. Subtrees rooted
at a prim with purpose render should generally only be included
when performing a "final quality" render. Subtrees rooted at a prim
with purpose proxy should generally only be included when
performing a lightweight proxy render (such as openGL). Finally,
subtrees rooted at a prim with purpose guide should generally
only be included when an interactive application has been explicitly
asked to "show guides".
In the previous paragraph, when we say "subtrees rooted at a prim",
we mean the most ancestral or tallest subtree that has an authored,
non-default opinion. If the purpose of </RootPrim> is set to
"render", then the effective purpose of </RootPrim/ChildPrim> will
be "render" even if that prim has a different authored value for
purpose. <b>See ComputePurpose() for details of how purpose
inherits down namespace</b>.
As demonstrated in UsdGeomBBoxCache, a traverser should be ready to
accept combinations of included purposes as an input.
Purpose render can be useful in creating "light blocker"
geometry for raytracing interior scenes. Purposes render and
proxy can be used together to partition a complicated model
into a lightweight proxy representation for interactive use, and a
fully realized, potentially quite heavy, representation for rendering.
One can use UsdVariantSets to create proxy representations, but doing
so requires that we recompose parts of the UsdStage in order to change
to a different runtime level of detail, and that does not interact
well with the needs of multithreaded rendering. Purpose provides us with
a better tool for dynamic, interactive complexity management.
""",
)
restTransforms = Attribute(
name = 'restTransforms',
as_type = 'matrix4d[]',
is_uniform = True,
docstring = """,
Specifies the rest-pose transforms of each joint in **local space**, in the ordering imposed by *joints*. This provides
fallback values for joint transforms when a Skeleton either has no
bound animation source, or when that animation source only contains
animation for a subset of a Skeleton's joints.
""",
)
visibility = Attribute(
name = 'visibility',
as_type = 'token',
value = "inherited",
allowedTokens = ["inherited", "invisible"],
docstring = """,
Visibility is meant to be the simplest form of "pruning" visibility that is supported by most DCC apps. Visibility is
animatable, allowing a sub-tree of geometry to be present for some
segment of a shot, and absent from others; unlike the action of
deactivating geometry prims, invisible geometry is still
available for inspection, for positioning, for defining volumes, etc.
""",
)
xformOpOrder = Attribute(
name = 'xformOpOrder',
as_type = 'token[]',
is_uniform = True,
docstring = """,
Encodes the sequence of transformation operations in the order in which they should be pushed onto a transform stack while
visiting a UsdStage's prims in a graph traversal that will effect
the desired positioning for this prim and its descendant prims.
You should rarely, if ever, need to manipulate this attribute directly.
It is managed by the AddXformOp(), SetResetXformStack(), and
SetXformOpOrder(), and consulted by GetOrderedXformOps() and
GetLocalTransformation().
""",
)
class SkelAnimation(Prim):
"""
Describes a skel animation, where joint animation is stored in a vectorized form.
See the extended "Skel Animation"
documentation for more information.
"""
as_type = "SkelAnimation"
blendShapes = Attribute(
name = 'blendShapes',
as_type = 'token[]',
is_uniform = True,
docstring = """,
Array of tokens identifying which blend shapes this animation's data applies to. The tokens for blendShapes correspond to
the tokens set in the *skel:blendShapes* binding property of the
UsdSkelBindingAPI.
Array of weight values for each blend shape. Each weight value is associated with the corresponding blend shape identified within the
*blendShapes* token array, and therefore must have the same length as
*blendShapes.
""",
)
joints = Attribute(
name = 'joints',
as_type = 'uniform token[]',
docstring = """
Array of tokens identifying which joints this animation's data applies to. The tokens for joints correspond to the tokens of
Skeleton primitives. The order of the joints as listed here may
vary from the order of joints on the Skeleton itself.
""",
)
rotations = Attribute(
name = 'rotations',
as_type = 'quatf[]',
docstring = """
Joint-local unit quaternion rotations of all affected joints, in 32-bit precision. Array length should match the size of the *joints* attribute.
""",
)
scales = Attribute(
name = 'scales',
as_type = 'half3[]',
docstring = """
Joint-local scales of all affected joints, in 16-bit precision. Array length should match the size of the *joints* attribute.
""",
)
translations = Attribute(
name = 'translations',
as_type = 'float3[]',
docstring = """
Joint-local translations of all affected joints. Array length should match the size of the *joints* attribute.
""",
)
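A quick sketch (plain Python, not UsdSkel) of the length invariant these docstrings describe: every per-joint array in a vectorized animation must match the *joints* token array.

```python
# Validate that the vectorized animation arrays all line up with *joints*.

def validate_skel_animation(joints, rotations, scales, translations):
    n = len(joints)
    for name, arr in (("rotations", rotations),
                      ("scales", scales),
                      ("translations", translations)):
        if len(arr) != n:
            raise ValueError(f"{name} has {len(arr)} entries, expected {n}")
    return True

joints = ["Hips", "Spine", "Head"]
rotations = [(1.0, 0.0, 0.0, 0.0)] * 3      # identity quaternions (w, x, y, z)
scales = [(1.0, 1.0, 1.0)] * 3
translations = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 2.0, 0.0)]

validate_skel_animation(joints, rotations, scales, translations)  # passes
```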
class PackedJointAnimation(Prim):
"""
Deprecated. Please use SkelAnimation instead.
"""
as_type = "PackedJointAnimation"
blendShapes = Attribute(
name = 'blendShapes',
as_type = 'uniform token[]',
docstring = """
Array of tokens identifying which blend shapes this animation's data applies to. The tokens for blendShapes correspond to
the tokens set in the *skel:blendShapes* binding property of the
UsdSkelBindingAPI.
""",
)
blendShapeWeights = Attribute(
name = 'blendShapeWeights',
as_type = 'float[]',
docstring = """
Array of weight values for each blend shape. Each weight value is associated with the corresponding blend shape identified within the
*blendShapes* token array, and therefore must have the same length as
*blendShapes*.
""",
)
joints = Attribute(
name = 'joints',
as_type = 'uniform token[]',
docstring = """
Array of tokens identifying which joints this animation's data applies to. The tokens for joints correspond to the tokens of
Skeleton primitives. The order of the joints as listed here may
vary from the order of joints on the Skeleton itself.
""",
)
rotations = Attribute(
name = 'rotations',
as_type = 'quatf[]',
docstring = """
Joint-local unit quaternion rotations of all affected joints, in 32-bit precision. Array length should match the size of the *joints* attribute.
""",
)
scales = Attribute(
name = 'scales',
as_type = 'half3[]',
docstring = """
Joint-local scales of all affected joints, in 16-bit precision. Array length should match the size of the *joints* attribute.
""",
)
translations = Attribute(
name = 'translations',
as_type = 'float3[]',
docstring = """
Joint-local translations of all affected joints. Array length should match the size of the *joints* attribute.
""",
)
class SkelBindingAPI(Prim):
"""
Provides API for authoring and extracting all the skinning-related data that lives in the "geometry hierarchy" of prims and models that want
to be skeletally deformed.
See the extended "UsdSkelBindingAPI schema"
documentation for more about bindings and how they apply in a scene graph.
"""
geomBindTransform = Attribute(
name = 'primvars:skel:geomBindTransform',
as_type = 'matrix4d',
docstring = """
Encodes the bind-time world space transforms of the prim. If the transform is identical for a group of gprims that share a common
ancestor, the transform may be authored on the ancestor, to "inherit"
down to all the leaf gprims. If this transform is unset, an identity
transform is used instead.
""",
)
jointIndices = Attribute(
name = 'primvars:skel:jointIndices',
as_type = 'int[]',
docstring = """
Indices into the *joints* attribute of the closest (in namespace) bound Skeleton that affect each point of a PointBased
gprim. The primvar can have either *constant* or *vertex* interpolation.
This primvar's *elementSize* will determine how many joint influences
apply to each point. Indices must be valid. Null influences should
be defined by setting values in jointWeights to zero.
See UsdGeomPrimvar for more information on interpolation and
elementSize.
""",
)
jointWeights = Attribute(
name = 'primvars:skel:jointWeights',
as_type = 'float[]',
docstring = """
Weights for the joints that affect each point of a PointBased gprim. The primvar can have either *constant* or *vertex* interpolation.
This primvar's *elementSize* will determine how many joint influences
apply to each point. The length, interpolation, and elementSize of
*jointWeights* must match that of *jointIndices*. See UsdGeomPrimvar
for more information on interpolation and elementSize.
""",
)
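A hedged sketch (plain Python, not UsdSkel) of how *elementSize* groups the flat jointIndices/jointWeights arrays into per-point influences: point *i* owns the slice `[i*elementSize, (i+1)*elementSize)`, and zero weights encode null influences.

```python
# Group flat jointIndices/jointWeights into per-point influence lists.

def influences_per_point(joint_indices, joint_weights, element_size):
    assert len(joint_indices) == len(joint_weights)
    assert len(joint_indices) % element_size == 0
    out = []
    for i in range(0, len(joint_indices), element_size):
        pairs = zip(joint_indices[i:i + element_size],
                    joint_weights[i:i + element_size])
        # Null influences are encoded as zero weights, not invalid indices.
        out.append([(j, w) for j, w in pairs if w != 0.0])
    return out

# Two points, two influences each (elementSize = 2).
indices = [0, 1, 1, 2]
weights = [0.7, 0.3, 1.0, 0.0]
print(influences_per_point(indices, weights, 2))
# -> [[(0, 0.7), (1, 0.3)], [(1, 1.0)]]
```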
animationSource = Attribute(
name = 'skel:animationSource',
as_type = 'rel',
docstring = """
Animation source to be bound to Skeleton primitives at or beneath the location at which this property is defined.
""",
)
blendShapes = Attribute(
name = 'skel:blendShapes',
as_type = 'uniform token[]',
docstring = """
An array of tokens defining the order onto which blend shape weights from an animation source map onto the *skel:blendShapeTargets*
rel of a binding site. If authored, the number of elements must be equal
to the number of targets in the _blendShapeTargets_ rel. This property
is not inherited hierarchically, and is expected to be authored directly
on the skinnable primitive to which the blend shapes apply.
""",
)
blendShapeTargets = Attribute(
name = 'skel:blendShapeTargets',
as_type = 'rel',
docstring = """
Ordered list of all target blend shapes. This property is not inherited hierarchically, and is expected to be authored directly on
the skinnable primitive to which the blend shapes apply.
""",
)
joints = Attribute(
name = 'skel:joints',
as_type = 'uniform token[]',
docstring = """
An (optional) array of tokens defining the list of joints to which jointIndices apply. If not defined, jointIndices applies
to the ordered list of joints defined in the bound Skeleton's *joints*
attribute. If undefined on a primitive, the primitive inherits the
value of the nearest ancestor prim, if any.
""",
)
skeleton = Attribute(
name = 'skel:skeleton',
as_type = 'rel',
docstring = """
Skeleton to be bound to this prim and its descendants that possess a mapping and weighting to the joints of the identified
Skeleton.
""",
)
class BlendShape(Prim):
"""
Describes a target blend shape, possibly containing inbetween shapes.
See the extended "Blend Shape Schema"
documentation for more information.
"""
as_type = "BlendShape"
offsets = Attribute(
name = 'offsets',
as_type = 'uniform vector3f[]',
docstring = """
**Required property**. Position offsets which, when added to the base pose, provide the target shape.
""",
)
pointIndices = Attribute(
name = 'pointIndices',
as_type = 'uniform uint[]',
docstring = """
**Optional property**. Indices into the original mesh that correspond to the values in *offsets* and of any inbetween shapes. If
authored, the number of elements must be equal to the number of elements
in the *offsets* array.
""",
)
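An illustrative sketch (plain Python, not the USD API) tying the two properties above together: apply a blend shape's *offsets* to the base-pose points they index, scaled by a single animation weight.

```python
# Apply a sparse blend shape: offsets[k] moves the point at point_indices[k],
# scaled by the shape's current weight.

def apply_blend_shape(base_points, offsets, point_indices, weight):
    points = list(base_points)
    for offset, idx in zip(offsets, point_indices):
        px, py, pz = points[idx]
        ox, oy, oz = offset
        points[idx] = (px + weight * ox, py + weight * oy, pz + weight * oz)
    return points

base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
offsets = [(0.0, 0.5, 0.0)]   # only one point moves...
point_indices = [2]           # ...the one at index 2
print(apply_blend_shape(base, offsets, point_indices, 1.0))
# -> [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.5, 0.0)]
```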
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['EipArgs', 'Eip']
@pulumi.input_type
class EipArgs:
def __init__(__self__, *,
address: Optional[pulumi.Input[str]] = None,
associate_with_private_ip: Optional[pulumi.Input[str]] = None,
customer_owned_ipv4_pool: Optional[pulumi.Input[str]] = None,
instance: Optional[pulumi.Input[str]] = None,
network_border_group: Optional[pulumi.Input[str]] = None,
network_interface: Optional[pulumi.Input[str]] = None,
public_ipv4_pool: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
vpc: Optional[pulumi.Input[bool]] = None):
"""
The set of arguments for constructing an Eip resource.
:param pulumi.Input[str] address: IP address from an EC2 BYOIP pool. This option is only available for VPC EIPs.
:param pulumi.Input[str] associate_with_private_ip: User-specified primary or secondary private IP address to associate with the Elastic IP address. If no private IP address is specified, the Elastic IP address is associated with the primary private IP address.
:param pulumi.Input[str] customer_owned_ipv4_pool: ID of a customer-owned address pool. For more on customer-owned IP addresses, see the [Customer-owned IP addresses guide](https://docs.aws.amazon.com/outposts/latest/userguide/outposts-networking-components.html#ip-addressing).
:param pulumi.Input[str] instance: EC2 instance ID.
:param pulumi.Input[str] network_border_group: Location from which the IP address is advertised. Use this parameter to limit the address to this location.
:param pulumi.Input[str] network_interface: Network interface ID to associate with.
:param pulumi.Input[str] public_ipv4_pool: EC2 IPv4 address pool identifier or `amazon`. This option is only available for VPC EIPs.
:param pulumi.Input[bool] vpc: Whether the EIP is in a VPC.
"""
if address is not None:
pulumi.set(__self__, "address", address)
if associate_with_private_ip is not None:
pulumi.set(__self__, "associate_with_private_ip", associate_with_private_ip)
if customer_owned_ipv4_pool is not None:
pulumi.set(__self__, "customer_owned_ipv4_pool", customer_owned_ipv4_pool)
if instance is not None:
pulumi.set(__self__, "instance", instance)
if network_border_group is not None:
pulumi.set(__self__, "network_border_group", network_border_group)
if network_interface is not None:
pulumi.set(__self__, "network_interface", network_interface)
if public_ipv4_pool is not None:
pulumi.set(__self__, "public_ipv4_pool", public_ipv4_pool)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if vpc is not None:
pulumi.set(__self__, "vpc", vpc)
@property
@pulumi.getter
def address(self) -> Optional[pulumi.Input[str]]:
"""
IP address from an EC2 BYOIP pool. This option is only available for VPC EIPs.
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "address", value)
@property
@pulumi.getter(name="associateWithPrivateIp")
def associate_with_private_ip(self) -> Optional[pulumi.Input[str]]:
"""
User-specified primary or secondary private IP address to associate with the Elastic IP address. If no private IP address is specified, the Elastic IP address is associated with the primary private IP address.
"""
return pulumi.get(self, "associate_with_private_ip")
@associate_with_private_ip.setter
def associate_with_private_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "associate_with_private_ip", value)
@property
@pulumi.getter(name="customerOwnedIpv4Pool")
def customer_owned_ipv4_pool(self) -> Optional[pulumi.Input[str]]:
"""
ID of a customer-owned address pool. For more on customer-owned IP addresses, see the [Customer-owned IP addresses guide](https://docs.aws.amazon.com/outposts/latest/userguide/outposts-networking-components.html#ip-addressing).
"""
return pulumi.get(self, "customer_owned_ipv4_pool")
@customer_owned_ipv4_pool.setter
def customer_owned_ipv4_pool(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "customer_owned_ipv4_pool", value)
@property
@pulumi.getter
def instance(self) -> Optional[pulumi.Input[str]]:
"""
EC2 instance ID.
"""
return pulumi.get(self, "instance")
@instance.setter
def instance(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance", value)
@property
@pulumi.getter(name="networkBorderGroup")
def network_border_group(self) -> Optional[pulumi.Input[str]]:
"""
Location from which the IP address is advertised. Use this parameter to limit the address to this location.
"""
return pulumi.get(self, "network_border_group")
@network_border_group.setter
def network_border_group(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "network_border_group", value)
@property
@pulumi.getter(name="networkInterface")
def network_interface(self) -> Optional[pulumi.Input[str]]:
"""
Network interface ID to associate with.
"""
return pulumi.get(self, "network_interface")
@network_interface.setter
def network_interface(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "network_interface", value)
@property
@pulumi.getter(name="publicIpv4Pool")
def public_ipv4_pool(self) -> Optional[pulumi.Input[str]]:
"""
EC2 IPv4 address pool identifier or `amazon`. This option is only available for VPC EIPs.
"""
return pulumi.get(self, "public_ipv4_pool")
@public_ipv4_pool.setter
def public_ipv4_pool(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_ipv4_pool", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter
def vpc(self) -> Optional[pulumi.Input[bool]]:
"""
Whether the EIP is in a VPC.
"""
return pulumi.get(self, "vpc")
@vpc.setter
def vpc(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "vpc", value)
@pulumi.input_type
class _EipState:
def __init__(__self__, *,
address: Optional[pulumi.Input[str]] = None,
allocation_id: Optional[pulumi.Input[str]] = None,
associate_with_private_ip: Optional[pulumi.Input[str]] = None,
association_id: Optional[pulumi.Input[str]] = None,
carrier_ip: Optional[pulumi.Input[str]] = None,
customer_owned_ip: Optional[pulumi.Input[str]] = None,
customer_owned_ipv4_pool: Optional[pulumi.Input[str]] = None,
domain: Optional[pulumi.Input[str]] = None,
instance: Optional[pulumi.Input[str]] = None,
network_border_group: Optional[pulumi.Input[str]] = None,
network_interface: Optional[pulumi.Input[str]] = None,
private_dns: Optional[pulumi.Input[str]] = None,
private_ip: Optional[pulumi.Input[str]] = None,
public_dns: Optional[pulumi.Input[str]] = None,
public_ip: Optional[pulumi.Input[str]] = None,
public_ipv4_pool: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
tags_all: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
vpc: Optional[pulumi.Input[bool]] = None):
"""
Input properties used for looking up and filtering Eip resources.
:param pulumi.Input[str] address: IP address from an EC2 BYOIP pool. This option is only available for VPC EIPs.
:param pulumi.Input[str] allocation_id: ID that AWS assigns to represent the allocation of the Elastic IP address for use with instances in a VPC.
:param pulumi.Input[str] associate_with_private_ip: User-specified primary or secondary private IP address to associate with the Elastic IP address. If no private IP address is specified, the Elastic IP address is associated with the primary private IP address.
:param pulumi.Input[str] association_id: ID representing the association of the address with an instance in a VPC.
:param pulumi.Input[str] carrier_ip: Carrier IP address.
:param pulumi.Input[str] customer_owned_ip: Customer owned IP.
:param pulumi.Input[str] customer_owned_ipv4_pool: ID of a customer-owned address pool. For more on customer-owned IP addresses, see the [Customer-owned IP addresses guide](https://docs.aws.amazon.com/outposts/latest/userguide/outposts-networking-components.html#ip-addressing).
:param pulumi.Input[str] domain: Indicates if this EIP is for use in VPC (`vpc`) or EC2 Classic (`standard`).
:param pulumi.Input[str] instance: EC2 instance ID.
:param pulumi.Input[str] network_border_group: Location from which the IP address is advertised. Use this parameter to limit the address to this location.
:param pulumi.Input[str] network_interface: Network interface ID to associate with.
:param pulumi.Input[str] private_dns: The Private DNS associated with the Elastic IP address (if in VPC).
:param pulumi.Input[str] private_ip: Contains the private IP address (if in VPC).
:param pulumi.Input[str] public_dns: Public DNS associated with the Elastic IP address.
:param pulumi.Input[str] public_ip: Contains the public IP address.
:param pulumi.Input[str] public_ipv4_pool: EC2 IPv4 address pool identifier or `amazon`. This option is only available for VPC EIPs.
:param pulumi.Input[bool] vpc: Whether the EIP is in a VPC.
"""
if address is not None:
pulumi.set(__self__, "address", address)
if allocation_id is not None:
pulumi.set(__self__, "allocation_id", allocation_id)
if associate_with_private_ip is not None:
pulumi.set(__self__, "associate_with_private_ip", associate_with_private_ip)
if association_id is not None:
pulumi.set(__self__, "association_id", association_id)
if carrier_ip is not None:
pulumi.set(__self__, "carrier_ip", carrier_ip)
if customer_owned_ip is not None:
pulumi.set(__self__, "customer_owned_ip", customer_owned_ip)
if customer_owned_ipv4_pool is not None:
pulumi.set(__self__, "customer_owned_ipv4_pool", customer_owned_ipv4_pool)
if domain is not None:
pulumi.set(__self__, "domain", domain)
if instance is not None:
pulumi.set(__self__, "instance", instance)
if network_border_group is not None:
pulumi.set(__self__, "network_border_group", network_border_group)
if network_interface is not None:
pulumi.set(__self__, "network_interface", network_interface)
if private_dns is not None:
pulumi.set(__self__, "private_dns", private_dns)
if private_ip is not None:
pulumi.set(__self__, "private_ip", private_ip)
if public_dns is not None:
pulumi.set(__self__, "public_dns", public_dns)
if public_ip is not None:
pulumi.set(__self__, "public_ip", public_ip)
if public_ipv4_pool is not None:
pulumi.set(__self__, "public_ipv4_pool", public_ipv4_pool)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if tags_all is not None:
pulumi.set(__self__, "tags_all", tags_all)
if vpc is not None:
pulumi.set(__self__, "vpc", vpc)
@property
@pulumi.getter
def address(self) -> Optional[pulumi.Input[str]]:
"""
IP address from an EC2 BYOIP pool. This option is only available for VPC EIPs.
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "address", value)
@property
@pulumi.getter(name="allocationId")
def allocation_id(self) -> Optional[pulumi.Input[str]]:
"""
ID that AWS assigns to represent the allocation of the Elastic IP address for use with instances in a VPC.
"""
return pulumi.get(self, "allocation_id")
@allocation_id.setter
def allocation_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "allocation_id", value)
@property
@pulumi.getter(name="associateWithPrivateIp")
def associate_with_private_ip(self) -> Optional[pulumi.Input[str]]:
"""
User-specified primary or secondary private IP address to associate with the Elastic IP address. If no private IP address is specified, the Elastic IP address is associated with the primary private IP address.
"""
return pulumi.get(self, "associate_with_private_ip")
@associate_with_private_ip.setter
def associate_with_private_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "associate_with_private_ip", value)
@property
@pulumi.getter(name="associationId")
def association_id(self) -> Optional[pulumi.Input[str]]:
"""
ID representing the association of the address with an instance in a VPC.
"""
return pulumi.get(self, "association_id")
@association_id.setter
def association_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "association_id", value)
@property
@pulumi.getter(name="carrierIp")
def carrier_ip(self) -> Optional[pulumi.Input[str]]:
"""
Carrier IP address.
"""
return pulumi.get(self, "carrier_ip")
@carrier_ip.setter
def carrier_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "carrier_ip", value)
@property
@pulumi.getter(name="customerOwnedIp")
def customer_owned_ip(self) -> Optional[pulumi.Input[str]]:
"""
Customer owned IP.
"""
return pulumi.get(self, "customer_owned_ip")
@customer_owned_ip.setter
def customer_owned_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "customer_owned_ip", value)
@property
@pulumi.getter(name="customerOwnedIpv4Pool")
def customer_owned_ipv4_pool(self) -> Optional[pulumi.Input[str]]:
"""
ID of a customer-owned address pool. For more on customer-owned IP addresses, see the [Customer-owned IP addresses guide](https://docs.aws.amazon.com/outposts/latest/userguide/outposts-networking-components.html#ip-addressing).
"""
return pulumi.get(self, "customer_owned_ipv4_pool")
@customer_owned_ipv4_pool.setter
def customer_owned_ipv4_pool(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "customer_owned_ipv4_pool", value)
@property
@pulumi.getter
def domain(self) -> Optional[pulumi.Input[str]]:
"""
Indicates if this EIP is for use in VPC (`vpc`) or EC2 Classic (`standard`).
"""
return pulumi.get(self, "domain")
@domain.setter
def domain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "domain", value)
@property
@pulumi.getter
def instance(self) -> Optional[pulumi.Input[str]]:
"""
EC2 instance ID.
"""
return pulumi.get(self, "instance")
@instance.setter
def instance(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance", value)
@property
@pulumi.getter(name="networkBorderGroup")
def network_border_group(self) -> Optional[pulumi.Input[str]]:
"""
Location from which the IP address is advertised. Use this parameter to limit the address to this location.
"""
return pulumi.get(self, "network_border_group")
@network_border_group.setter
def network_border_group(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "network_border_group", value)
@property
@pulumi.getter(name="networkInterface")
def network_interface(self) -> Optional[pulumi.Input[str]]:
"""
Network interface ID to associate with.
"""
return pulumi.get(self, "network_interface")
@network_interface.setter
def network_interface(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "network_interface", value)
@property
@pulumi.getter(name="privateDns")
def private_dns(self) -> Optional[pulumi.Input[str]]:
"""
The Private DNS associated with the Elastic IP address (if in VPC).
"""
return pulumi.get(self, "private_dns")
@private_dns.setter
def private_dns(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "private_dns", value)
@property
@pulumi.getter(name="privateIp")
def private_ip(self) -> Optional[pulumi.Input[str]]:
"""
Contains the private IP address (if in VPC).
"""
return pulumi.get(self, "private_ip")
@private_ip.setter
def private_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "private_ip", value)
@property
@pulumi.getter(name="publicDns")
def public_dns(self) -> Optional[pulumi.Input[str]]:
"""
Public DNS associated with the Elastic IP address.
"""
return pulumi.get(self, "public_dns")
@public_dns.setter
def public_dns(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_dns", value)
@property
@pulumi.getter(name="publicIp")
def public_ip(self) -> Optional[pulumi.Input[str]]:
"""
Contains the public IP address.
"""
return pulumi.get(self, "public_ip")
@public_ip.setter
def public_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_ip", value)
@property
@pulumi.getter(name="publicIpv4Pool")
def public_ipv4_pool(self) -> Optional[pulumi.Input[str]]:
"""
EC2 IPv4 address pool identifier or `amazon`. This option is only available for VPC EIPs.
"""
return pulumi.get(self, "public_ipv4_pool")
@public_ipv4_pool.setter
def public_ipv4_pool(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_ipv4_pool", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="tagsAll")
def tags_all(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
return pulumi.get(self, "tags_all")
@tags_all.setter
def tags_all(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags_all", value)
@property
@pulumi.getter
def vpc(self) -> Optional[pulumi.Input[bool]]:
"""
Whether the EIP is in a VPC.
"""
return pulumi.get(self, "vpc")
@vpc.setter
def vpc(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "vpc", value)
class Eip(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
address: Optional[pulumi.Input[str]] = None,
associate_with_private_ip: Optional[pulumi.Input[str]] = None,
customer_owned_ipv4_pool: Optional[pulumi.Input[str]] = None,
instance: Optional[pulumi.Input[str]] = None,
network_border_group: Optional[pulumi.Input[str]] = None,
network_interface: Optional[pulumi.Input[str]] = None,
public_ipv4_pool: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
vpc: Optional[pulumi.Input[bool]] = None,
__props__=None):
"""
Provides an Elastic IP resource.
> **Note:** EIP may require IGW to exist prior to association. Use `depends_on` to set an explicit dependency on the IGW.
> **Note:** Do not use `network_interface` to associate the EIP to `lb.LoadBalancer` or `ec2.NatGateway` resources. Instead use the `allocation_id` available in those resources to allow AWS to manage the association, otherwise you will see `AuthFailure` errors.
## Example Usage
### Single EIP associated with an instance
```python
import pulumi
import pulumi_aws as aws
lb = aws.ec2.Eip("lb",
instance=aws_instance["web"]["id"],
vpc=True)
```
### Multiple EIPs associated with a single network interface
```python
import pulumi
import pulumi_aws as aws
multi_ip = aws.ec2.NetworkInterface("multi-ip",
subnet_id=aws_subnet["main"]["id"],
private_ips=[
"10.0.0.10",
"10.0.0.11",
])
one = aws.ec2.Eip("one",
vpc=True,
network_interface=multi_ip.id,
associate_with_private_ip="10.0.0.10")
two = aws.ec2.Eip("two",
vpc=True,
network_interface=multi_ip.id,
associate_with_private_ip="10.0.0.11")
```
### Attaching an EIP to an Instance with a pre-assigned private ip (VPC Only)
```python
import pulumi
import pulumi_aws as aws
default = aws.ec2.Vpc("default",
cidr_block="10.0.0.0/16",
enable_dns_hostnames=True)
gw = aws.ec2.InternetGateway("gw", vpc_id=default.id)
tf_test_subnet = aws.ec2.Subnet("tfTestSubnet",
vpc_id=default.id,
cidr_block="10.0.0.0/24",
map_public_ip_on_launch=True,
opts=pulumi.ResourceOptions(depends_on=[gw]))
foo = aws.ec2.Instance("foo",
ami="ami-5189a661",
instance_type="t2.micro",
private_ip="10.0.0.12",
subnet_id=tf_test_subnet.id)
bar = aws.ec2.Eip("bar",
vpc=True,
instance=foo.id,
associate_with_private_ip="10.0.0.12",
opts=pulumi.ResourceOptions(depends_on=[gw]))
```
### Allocating EIP from the BYOIP pool
```python
import pulumi
import pulumi_aws as aws
byoip_ip = aws.ec2.Eip("byoip-ip",
public_ipv4_pool="ipv4pool-ec2-012345",
vpc=True)
```
## Import
EIPs in a VPC can be imported using their Allocation ID, e.g.
```sh
$ pulumi import aws:ec2/eip:Eip bar eipalloc-00a10e96
```
EIPs in EC2 Classic can be imported using their Public IP, e.g.
```sh
$ pulumi import aws:ec2/eip:Eip bar 52.0.0.0
```
[1]: https://docs.aws.amazon.com/AWSEC2/latest/APIReference/API_AssociateAddress.html
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] address: IP address from an EC2 BYOIP pool. This option is only available for VPC EIPs.
:param pulumi.Input[str] associate_with_private_ip: User-specified primary or secondary private IP address to associate with the Elastic IP address. If no private IP address is specified, the Elastic IP address is associated with the primary private IP address.
:param pulumi.Input[str] customer_owned_ipv4_pool: ID of a customer-owned address pool. For more on customer-owned IP addresses, see the [Customer-owned IP addresses guide](https://docs.aws.amazon.com/outposts/latest/userguide/outposts-networking-components.html#ip-addressing).
:param pulumi.Input[str] instance: EC2 instance ID.
:param pulumi.Input[str] network_border_group: Location from which the IP address is advertised. Use this parameter to limit the address to this location.
:param pulumi.Input[str] network_interface: Network interface ID to associate with.
:param pulumi.Input[str] public_ipv4_pool: EC2 IPv4 address pool identifier or `amazon`. This option is only available for VPC EIPs.
:param pulumi.Input[bool] vpc: Whether the EIP is in a VPC.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: Optional[EipArgs] = None,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides an Elastic IP resource.
> **Note:** EIP may require IGW to exist prior to association. Use `depends_on` to set an explicit dependency on the IGW.
> **Note:** Do not use `network_interface` to associate the EIP to `lb.LoadBalancer` or `ec2.NatGateway` resources. Instead use the `allocation_id` available in those resources to allow AWS to manage the association, otherwise you will see `AuthFailure` errors.
## Example Usage
### Single EIP associated with an instance
```python
import pulumi
import pulumi_aws as aws
lb = aws.ec2.Eip("lb",
instance=aws_instance["web"]["id"],
vpc=True)
```
### Multiple EIPs associated with a single network interface
```python
import pulumi
import pulumi_aws as aws
multi_ip = aws.ec2.NetworkInterface("multi-ip",
subnet_id=aws_subnet["main"]["id"],
private_ips=[
"10.0.0.10",
"10.0.0.11",
])
one = aws.ec2.Eip("one",
vpc=True,
network_interface=multi_ip.id,
associate_with_private_ip="10.0.0.10")
two = aws.ec2.Eip("two",
vpc=True,
network_interface=multi_ip.id,
associate_with_private_ip="10.0.0.11")
```
### Attaching an EIP to an Instance with a pre-assigned private ip (VPC Only)
```python
import pulumi
import pulumi_aws as aws
default = aws.ec2.Vpc("default",
cidr_block="10.0.0.0/16",
enable_dns_hostnames=True)
gw = aws.ec2.InternetGateway("gw", vpc_id=default.id)
tf_test_subnet = aws.ec2.Subnet("tfTestSubnet",
vpc_id=default.id,
cidr_block="10.0.0.0/24",
map_public_ip_on_launch=True,
opts=pulumi.ResourceOptions(depends_on=[gw]))
foo = aws.ec2.Instance("foo",
ami="ami-5189a661",
instance_type="t2.micro",
private_ip="10.0.0.12",
subnet_id=tf_test_subnet.id)
bar = aws.ec2.Eip("bar",
vpc=True,
instance=foo.id,
associate_with_private_ip="10.0.0.12",
opts=pulumi.ResourceOptions(depends_on=[gw]))
```
### Allocating EIP from the BYOIP pool
```python
import pulumi
import pulumi_aws as aws
byoip_ip = aws.ec2.Eip("byoip-ip",
public_ipv4_pool="ipv4pool-ec2-012345",
vpc=True)
```
## Import
EIPs in a VPC can be imported using their Allocation ID, e.g.
```sh
$ pulumi import aws:ec2/eip:Eip bar eipalloc-00a10e96
```
EIPs in EC2 Classic can be imported using their Public IP, e.g.
```sh
$ pulumi import aws:ec2/eip:Eip bar 52.0.0.0
```
[1]: https://docs.aws.amazon.com/AWSEC2/latest/APIReference/API_AssociateAddress.html
:param str resource_name: The name of the resource.
:param EipArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(EipArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
address: Optional[pulumi.Input[str]] = None,
associate_with_private_ip: Optional[pulumi.Input[str]] = None,
customer_owned_ipv4_pool: Optional[pulumi.Input[str]] = None,
instance: Optional[pulumi.Input[str]] = None,
network_border_group: Optional[pulumi.Input[str]] = None,
network_interface: Optional[pulumi.Input[str]] = None,
public_ipv4_pool: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
vpc: Optional[pulumi.Input[bool]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = EipArgs.__new__(EipArgs)
__props__.__dict__["address"] = address
__props__.__dict__["associate_with_private_ip"] = associate_with_private_ip
__props__.__dict__["customer_owned_ipv4_pool"] = customer_owned_ipv4_pool
__props__.__dict__["instance"] = instance
__props__.__dict__["network_border_group"] = network_border_group
__props__.__dict__["network_interface"] = network_interface
__props__.__dict__["public_ipv4_pool"] = public_ipv4_pool
__props__.__dict__["tags"] = tags
__props__.__dict__["vpc"] = vpc
__props__.__dict__["allocation_id"] = None
__props__.__dict__["association_id"] = None
__props__.__dict__["carrier_ip"] = None
__props__.__dict__["customer_owned_ip"] = None
__props__.__dict__["domain"] = None
__props__.__dict__["private_dns"] = None
__props__.__dict__["private_ip"] = None
__props__.__dict__["public_dns"] = None
__props__.__dict__["public_ip"] = None
__props__.__dict__["tags_all"] = None
super(Eip, __self__).__init__(
'aws:ec2/eip:Eip',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
address: Optional[pulumi.Input[str]] = None,
allocation_id: Optional[pulumi.Input[str]] = None,
associate_with_private_ip: Optional[pulumi.Input[str]] = None,
association_id: Optional[pulumi.Input[str]] = None,
carrier_ip: Optional[pulumi.Input[str]] = None,
customer_owned_ip: Optional[pulumi.Input[str]] = None,
customer_owned_ipv4_pool: Optional[pulumi.Input[str]] = None,
domain: Optional[pulumi.Input[str]] = None,
instance: Optional[pulumi.Input[str]] = None,
network_border_group: Optional[pulumi.Input[str]] = None,
network_interface: Optional[pulumi.Input[str]] = None,
private_dns: Optional[pulumi.Input[str]] = None,
private_ip: Optional[pulumi.Input[str]] = None,
public_dns: Optional[pulumi.Input[str]] = None,
public_ip: Optional[pulumi.Input[str]] = None,
public_ipv4_pool: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
tags_all: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
vpc: Optional[pulumi.Input[bool]] = None) -> 'Eip':
"""
Get an existing Eip resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] address: IP address from an EC2 BYOIP pool. This option is only available for VPC EIPs.
:param pulumi.Input[str] allocation_id: ID that AWS assigns to represent the allocation of the Elastic IP address for use with instances in a VPC.
:param pulumi.Input[str] associate_with_private_ip: User-specified primary or secondary private IP address to associate with the Elastic IP address. If no private IP address is specified, the Elastic IP address is associated with the primary private IP address.
:param pulumi.Input[str] association_id: ID representing the association of the address with an instance in a VPC.
:param pulumi.Input[str] carrier_ip: Carrier IP address.
:param pulumi.Input[str] customer_owned_ip: Customer owned IP.
:param pulumi.Input[str] customer_owned_ipv4_pool: ID of a customer-owned address pool. For more on customer-owned IP addresses, see the [Customer-owned IP addresses guide](https://docs.aws.amazon.com/outposts/latest/userguide/outposts-networking-components.html#ip-addressing).
:param pulumi.Input[str] domain: Indicates whether this EIP is for use in a VPC (`vpc`) or EC2 Classic (`standard`).
:param pulumi.Input[str] instance: EC2 instance ID.
:param pulumi.Input[str] network_border_group: Location from which the IP address is advertised. Use this parameter to limit the address to this location.
:param pulumi.Input[str] network_interface: Network interface ID to associate with.
:param pulumi.Input[str] private_dns: The Private DNS associated with the Elastic IP address (if in VPC).
:param pulumi.Input[str] private_ip: Contains the private IP address (if in VPC).
:param pulumi.Input[str] public_dns: Public DNS associated with the Elastic IP address.
:param pulumi.Input[str] public_ip: Contains the public IP address.
:param pulumi.Input[str] public_ipv4_pool: EC2 IPv4 address pool identifier or `amazon`. This option is only available for VPC EIPs.
:param pulumi.Input[bool] vpc: Whether the EIP is in a VPC.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _EipState.__new__(_EipState)
__props__.__dict__["address"] = address
__props__.__dict__["allocation_id"] = allocation_id
__props__.__dict__["associate_with_private_ip"] = associate_with_private_ip
__props__.__dict__["association_id"] = association_id
__props__.__dict__["carrier_ip"] = carrier_ip
__props__.__dict__["customer_owned_ip"] = customer_owned_ip
__props__.__dict__["customer_owned_ipv4_pool"] = customer_owned_ipv4_pool
__props__.__dict__["domain"] = domain
__props__.__dict__["instance"] = instance
__props__.__dict__["network_border_group"] = network_border_group
__props__.__dict__["network_interface"] = network_interface
__props__.__dict__["private_dns"] = private_dns
__props__.__dict__["private_ip"] = private_ip
__props__.__dict__["public_dns"] = public_dns
__props__.__dict__["public_ip"] = public_ip
__props__.__dict__["public_ipv4_pool"] = public_ipv4_pool
__props__.__dict__["tags"] = tags
__props__.__dict__["tags_all"] = tags_all
__props__.__dict__["vpc"] = vpc
return Eip(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def address(self) -> pulumi.Output[Optional[str]]:
"""
IP address from an EC2 BYOIP pool. This option is only available for VPC EIPs.
"""
return pulumi.get(self, "address")
@property
@pulumi.getter(name="allocationId")
def allocation_id(self) -> pulumi.Output[str]:
"""
ID that AWS assigns to represent the allocation of the Elastic IP address for use with instances in a VPC.
"""
return pulumi.get(self, "allocation_id")
@property
@pulumi.getter(name="associateWithPrivateIp")
def associate_with_private_ip(self) -> pulumi.Output[Optional[str]]:
"""
User-specified primary or secondary private IP address to associate with the Elastic IP address. If no private IP address is specified, the Elastic IP address is associated with the primary private IP address.
"""
return pulumi.get(self, "associate_with_private_ip")
@property
@pulumi.getter(name="associationId")
def association_id(self) -> pulumi.Output[str]:
"""
ID representing the association of the address with an instance in a VPC.
"""
return pulumi.get(self, "association_id")
@property
@pulumi.getter(name="carrierIp")
def carrier_ip(self) -> pulumi.Output[str]:
"""
Carrier IP address.
"""
return pulumi.get(self, "carrier_ip")
@property
@pulumi.getter(name="customerOwnedIp")
def customer_owned_ip(self) -> pulumi.Output[str]:
"""
Customer owned IP.
"""
return pulumi.get(self, "customer_owned_ip")
@property
@pulumi.getter(name="customerOwnedIpv4Pool")
def customer_owned_ipv4_pool(self) -> pulumi.Output[Optional[str]]:
"""
ID of a customer-owned address pool. For more on customer-owned IP addresses, see the [Customer-owned IP addresses guide](https://docs.aws.amazon.com/outposts/latest/userguide/outposts-networking-components.html#ip-addressing).
"""
return pulumi.get(self, "customer_owned_ipv4_pool")
@property
@pulumi.getter
def domain(self) -> pulumi.Output[str]:
"""
Indicates whether this EIP is for use in a VPC (`vpc`) or EC2 Classic (`standard`).
"""
return pulumi.get(self, "domain")
@property
@pulumi.getter
def instance(self) -> pulumi.Output[str]:
"""
EC2 instance ID.
"""
return pulumi.get(self, "instance")
@property
@pulumi.getter(name="networkBorderGroup")
def network_border_group(self) -> pulumi.Output[str]:
"""
Location from which the IP address is advertised. Use this parameter to limit the address to this location.
"""
return pulumi.get(self, "network_border_group")
@property
@pulumi.getter(name="networkInterface")
def network_interface(self) -> pulumi.Output[str]:
"""
Network interface ID to associate with.
"""
return pulumi.get(self, "network_interface")
@property
@pulumi.getter(name="privateDns")
def private_dns(self) -> pulumi.Output[str]:
"""
The Private DNS associated with the Elastic IP address (if in VPC).
"""
return pulumi.get(self, "private_dns")
@property
@pulumi.getter(name="privateIp")
def private_ip(self) -> pulumi.Output[str]:
"""
Contains the private IP address (if in VPC).
"""
return pulumi.get(self, "private_ip")
@property
@pulumi.getter(name="publicDns")
def public_dns(self) -> pulumi.Output[str]:
"""
Public DNS associated with the Elastic IP address.
"""
return pulumi.get(self, "public_dns")
@property
@pulumi.getter(name="publicIp")
def public_ip(self) -> pulumi.Output[str]:
"""
Contains the public IP address.
"""
return pulumi.get(self, "public_ip")
@property
@pulumi.getter(name="publicIpv4Pool")
def public_ipv4_pool(self) -> pulumi.Output[str]:
"""
EC2 IPv4 address pool identifier or `amazon`. This option is only available for VPC EIPs.
"""
return pulumi.get(self, "public_ipv4_pool")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
return pulumi.get(self, "tags")
@property
@pulumi.getter(name="tagsAll")
def tags_all(self) -> pulumi.Output[Mapping[str, str]]:
return pulumi.get(self, "tags_all")
@property
@pulumi.getter
def vpc(self) -> pulumi.Output[bool]:
"""
Whether the EIP is in a VPC.
"""
return pulumi.get(self, "vpc")
# asn/views.py (wanxger/GreaterWMS, Apache-2.0)
from rest_framework import viewsets
from .models import AsnListModel, AsnDetailModel
from . import serializers
from .page import MyPageNumberPaginationASNList
from utils.page import MyPageNumberPagination
from utils.datasolve import sumOfList, transportation_calculate, secret_bar_code, verify_bar_code
from utils.md5 import Md5
from rest_framework.filters import OrderingFilter
from django_filters.rest_framework import DjangoFilterBackend
from rest_framework.response import Response
from .filter import AsnListFilter, AsnDetailFilter
from rest_framework.exceptions import APIException
from supplier.models import ListModel as supplier
from warehouse.models import ListModel as warehouse
from goods.models import ListModel as goods
from payment.models import TransportationFeeListModel as transportation
from stock.models import StockListModel as stocklist
from stock.models import StockBinModel as stockbin
from binset.models import ListModel as binset
from scanner.models import ListModel as scanner
from django.db.models import Q
import re
from .serializers import FileListRenderSerializer, FileDetailRenderSerializer
from django.http import StreamingHttpResponse
from .files import FileListRenderCN, FileListRenderEN, FileDetailRenderCN, FileDetailRenderEN
from rest_framework.settings import api_settings
class AsnListViewSet(viewsets.ModelViewSet):
"""
retrieve:
Return a single data record (GET)
list:
Return the full data list (GET)
create:
Create a data record (POST)
delete:
Delete a data record (DELETE)
"""
pagination_class = MyPageNumberPaginationASNList
filter_backends = [DjangoFilterBackend, OrderingFilter, ]
ordering_fields = ['id', "create_time", "update_time", ]
filter_class = AsnListFilter
def get_project(self):
try:
id = self.kwargs.get('pk')
return id
except Exception:
return None
def get_queryset(self):
id = self.get_project()
if self.request.user:
if id is None:
return AsnListModel.objects.filter(openid=self.request.auth.openid, is_delete=False)
else:
return AsnListModel.objects.filter(openid=self.request.auth.openid, id=id, is_delete=False)
else:
return AsnListModel.objects.none()
def get_serializer_class(self):
if self.action in ['list', 'retrieve', 'destroy']:
return serializers.ASNListGetSerializer
elif self.action in ['create']:
return serializers.ASNListPostSerializer
elif self.action in ['update']:
return serializers.ASNListUpdateSerializer
elif self.action in ['partial_update']:
return serializers.ASNListPartialUpdateSerializer
else:
return self.http_method_not_allowed(request=self.request)
def create(self, request, *args, **kwargs):
data = self.request.data
data['openid'] = self.request.auth.openid
if self.get_queryset().filter(openid=data['openid'], is_delete=False).exists():
asn_last_code = self.get_queryset().filter(openid=data['openid']).first().asn_code
asn_add_code = str(int(re.findall(r'\d+', str(asn_last_code), re.IGNORECASE)[0]) + 1).zfill(8)
data['asn_code'] = 'ASN' + asn_add_code
else:
data['asn_code'] = 'ASN00000001'
data['bar_code'] = Md5.md5(data['asn_code'])
serializer = self.get_serializer(data=data)
serializer.is_valid(raise_exception=True)
serializer.save()
scanner.objects.create(openid=self.request.auth.openid, mode="ASN", code=data['asn_code'], bar_code=data['bar_code'])
headers = self.get_success_headers(serializer.data)
return Response(serializer.data, status=200, headers=headers)
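The `create` method above derives the next ASN code by taking the newest existing code, extracting its numeric suffix with a regex, incrementing it, and left-padding to eight digits. A minimal standalone sketch of that numbering scheme (the helper name `next_asn_code` is ours, not part of the project):

```python
import re

def next_asn_code(last_code=None):
    """Return the next ASN code, mirroring the view's numbering scheme.

    The view pulls the digits out of the most recent code (for example
    'ASN00000007'), adds 1, and zero-pads to 8 digits. When no ASN
    exists yet, it starts at 'ASN00000001'.
    """
    if last_code is None:
        return 'ASN00000001'
    # Extract the first run of digits from the previous code.
    number = int(re.findall(r'\d+', last_code)[0])
    return 'ASN' + str(number + 1).zfill(8)

print(next_asn_code())               # ASN00000001
print(next_asn_code('ASN00000007'))  # ASN00000008
```

The zero-padding keeps codes lexicographically sortable, which is why the view can rely on `.first()` over a code-ordered queryset to find the latest ASN.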
def destroy(self, request, pk):
qs = self.get_object()
if qs.openid != self.request.auth.openid:
raise APIException({"detail": "Cannot delete data that is not yours"})
else:
if qs.asn_status == 1:
qs.is_delete = True
asn_detail_list = AsnDetailModel.objects.filter(openid=self.request.auth.openid, asn_code=qs.asn_code,
asn_status=1, is_delete=False)
for i in range(len(asn_detail_list)):
goods_qty_change = stocklist.objects.filter(openid=self.request.auth.openid,
goods_code=str(asn_detail_list[i].goods_code)).first()
goods_qty_change.goods_qty = goods_qty_change.goods_qty - int(asn_detail_list[i].goods_qty)
goods_qty_change.asn_stock = goods_qty_change.asn_stock - int(asn_detail_list[i].goods_qty)
goods_qty_change.save()
asn_detail_list.update(is_delete=True)
qs.save()
serializer = self.get_serializer(qs, many=False)
headers = self.get_success_headers(serializer.data)
return Response(serializer.data, status=200, headers=headers)
else:
raise APIException({"detail": "This ASN Status Is Not 1"})
class AsnDetailViewSet(viewsets.ModelViewSet):
"""
retrieve:
Return a single data record (GET)
list:
Return the full data list (GET)
create:
Create a data record (POST)
update:
Update a data record (PUT)
"""
pagination_class = MyPageNumberPagination
filter_backends = [DjangoFilterBackend, OrderingFilter, ]
ordering_fields = ['id', "create_time", "update_time", ]
filter_class = AsnDetailFilter
def get_project(self):
try:
id = self.kwargs.get('pk')
return id
except Exception:
return None
def get_queryset(self):
id = self.get_project()
if self.request.user:
if id is None:
return AsnDetailModel.objects.filter(openid=self.request.auth.openid, is_delete=False)
else:
return AsnDetailModel.objects.filter(openid=self.request.auth.openid, id=id, is_delete=False)
else:
return AsnDetailModel.objects.none()
def get_serializer_class(self):
if self.action in ['list', 'retrieve']:
return serializers.ASNDetailGetSerializer
elif self.action in ['create']:
return serializers.ASNDetailPostSerializer
elif self.action in ['update']:
return serializers.ASNDetailUpdateSerializer
else:
return self.http_method_not_allowed(request=self.request)
def create(self, request, *args, **kwargs):
data = self.request.data
if AsnListModel.objects.filter(openid=self.request.auth.openid, asn_code=str(data['asn_code']), is_delete=False).exists():
if supplier.objects.filter(openid=self.request.auth.openid, supplier_name=str(data['supplier']), is_delete=False).exists():
for i in range(len(data['goods_code'])):
check_data = {
'openid': self.request.auth.openid,
'asn_code': str(data['asn_code']),
'supplier': str(data['supplier']),
'goods_code': str(data['goods_code'][i]),
'goods_qty': int(data['goods_qty'][i]),
'creater': str(data['creater'])
}
serializer = self.get_serializer(data=check_data)
serializer.is_valid(raise_exception=True)
post_data_list = []
weight_list = []
volume_list = []
for j in range(len(data['goods_code'])):
goods_detail = goods.objects.filter(openid=self.request.auth.openid,
goods_code=str(data['goods_code'][j]),
is_delete=False).first()
goods_weight = round(goods_detail.goods_weight * int(data['goods_qty'][j]) / 1000, 4)
goods_volume = round(goods_detail.unit_volume * int(data['goods_qty'][j]), 4)
if stocklist.objects.filter(openid=self.request.auth.openid, goods_code=str(data['goods_code'][j])).exists():
goods_qty_change = stocklist.objects.filter(openid=self.request.auth.openid,
goods_code=str(data['goods_code'][j])).first()
goods_qty_change.goods_qty = goods_qty_change.goods_qty + int(data['goods_qty'][j])
goods_qty_change.asn_stock = goods_qty_change.asn_stock + int(data['goods_qty'][j])
goods_qty_change.save()
else:
stocklist.objects.create(openid=self.request.auth.openid,
goods_code=str(data['goods_code'][j]),
goods_desc=goods_detail.goods_desc,
goods_qty=int(data['goods_qty'][j]),
asn_stock=int(data['goods_qty'][j]))
post_data = AsnDetailModel(openid=self.request.auth.openid,
asn_code=str(data['asn_code']),
supplier=str(data['supplier']),
goods_code=str(data['goods_code'][j]),
goods_qty=int(data['goods_qty'][j]),
goods_weight=goods_weight,
goods_volume=goods_volume,
creater=str(data['creater']))
post_data_list.append(post_data)
weight_list.append(goods_weight)
volume_list.append(goods_volume)
total_weight = sumOfList(weight_list, len(weight_list))
total_volume = sumOfList(volume_list, len(volume_list))
supplier_city = supplier.objects.filter(openid=self.request.auth.openid,
supplier_name=str(data['supplier']),
is_delete=False).first().supplier_city
warehouse_city = warehouse.objects.filter(openid=self.request.auth.openid).first().warehouse_city
transportation_fee = transportation.objects.filter(
Q(openid=self.request.auth.openid, send_city__icontains=supplier_city, receiver_city__icontains=warehouse_city,
is_delete=False) | Q(openid='init_data', send_city__icontains=supplier_city, receiver_city__icontains=warehouse_city,
is_delete=False))
transportation_res = {
"detail": []
}
if len(transportation_fee) >= 1:
transportation_list = []
for k in range(len(transportation_fee)):
transportation_cost = transportation_calculate(total_weight,
total_volume,
transportation_fee[k].weight_fee,
transportation_fee[k].volume_fee,
transportation_fee[k].min_payment)
transportation_detail = {
"transportation_supplier": transportation_fee[k].transportation_supplier,
"transportation_cost": transportation_cost
}
transportation_list.append(transportation_detail)
transportation_res['detail'] = transportation_list
AsnDetailModel.objects.bulk_create(post_data_list, batch_size=100)
AsnListModel.objects.filter(openid=self.request.auth.openid, asn_code=str(data['asn_code'])).update(
supplier=str(data['supplier']), total_weight=total_weight, total_volume=total_volume,
transportation_fee=transportation_res)
return Response({"detail": "success"}, status=200)
else:
raise APIException({"detail": "Supplier does not exist"})
else:
raise APIException({"detail": "ASN code does not exist"})
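Both `create` and `update` quote freight via `transportation_calculate(total_weight, total_volume, weight_fee, volume_fee, min_payment)` from `utils.datasolve`. The exact formula lives in that module; the sketch below is only our assumption of a typical weight/volume tariff (charge the larger of the two bases, floored at the carrier's minimum payment), and `estimate_freight` is a hypothetical name, not the project's helper:

```python
def estimate_freight(total_weight, total_volume, weight_fee, volume_fee, min_payment):
    """Hedged sketch of a weight/volume freight quote.

    Assumption (not verified against utils/datasolve.py): the quote is
    the larger of the weight-based and volume-based charges, but never
    below the carrier's minimum payment.
    """
    cost = max(total_weight * weight_fee, total_volume * volume_fee)
    return max(cost, min_payment)

# 120 kg at 1.2/kg -> 144.0; 0.8 m3 at 90/m3 -> 72.0; minimum 50 -> quote 144.0
print(estimate_freight(120, 0.8, 1.2, 90, 50))
```

Whatever the real formula, the views only consume its scalar result: one `transportation_cost` per matching carrier row, collected into the `transportation_res['detail']` list stored on the ASN.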
def update(self, request, *args, **kwargs):
data = self.request.data
if AsnListModel.objects.filter(openid=self.request.auth.openid, asn_code=str(data['asn_code']),
asn_status=1, is_delete=False).exists():
if supplier.objects.filter(openid=self.request.auth.openid, supplier_name=str(data['supplier']),
is_delete=False).exists():
for i in range(len(data['goods_code'])):
check_data = {
'openid': self.request.auth.openid,
'asn_code': str(data['asn_code']),
'supplier': str(data['supplier']),
'goods_code': str(data['goods_code'][i]),
'goods_qty': int(data['goods_qty'][i]),
'creater': str(data['creater'])
}
serializer = self.get_serializer(data=check_data)
serializer.is_valid(raise_exception=True)
asn_detail_list = AsnDetailModel.objects.filter(openid=self.request.auth.openid,
asn_code=str(data['asn_code']))
for v in range(len(asn_detail_list)):
goods_qty_change = stocklist.objects.filter(openid=self.request.auth.openid,
goods_code=str(asn_detail_list[v].goods_code)).first()
goods_qty_change.goods_qty = goods_qty_change.goods_qty - asn_detail_list[v].goods_qty
if goods_qty_change.goods_qty < 0:
goods_qty_change.goods_qty = 0
goods_qty_change.asn_stock = goods_qty_change.asn_stock - asn_detail_list[v].goods_qty
if goods_qty_change.asn_stock < 0:
goods_qty_change.asn_stock = 0
goods_qty_change.save()
asn_detail_list.delete()
post_data_list = []
weight_list = []
volume_list = []
for j in range(len(data['goods_code'])):
goods_detail = goods.objects.filter(openid=self.request.auth.openid,
goods_code=str(data['goods_code'][j]),
is_delete=False).first()
goods_weight = round(goods_detail.goods_weight * int(data['goods_qty'][j]) / 1000, 4)
goods_volume = round(goods_detail.unit_volume * int(data['goods_qty'][j]), 4)
if stocklist.objects.filter(openid=self.request.auth.openid, goods_code=str(data['goods_code'][j])).exists():
goods_qty_change = stocklist.objects.filter(openid=self.request.auth.openid,
goods_code=str(data['goods_code'][j])).first()
goods_qty_change.goods_qty = goods_qty_change.goods_qty + int(data['goods_qty'][j])
goods_qty_change.asn_stock = goods_qty_change.asn_stock + int(data['goods_qty'][j])
goods_qty_change.save()
else:
stocklist.objects.create(openid=self.request.auth.openid,
goods_code=str(data['goods_code'][j]),
goods_desc=goods_detail.goods_desc,
goods_qty=int(data['goods_qty'][j]),
asn_stock=int(data['goods_qty'][j]))
post_data = AsnDetailModel(openid=self.request.auth.openid,
asn_code=str(data['asn_code']),
supplier=str(data['supplier']),
goods_code=str(data['goods_code'][j]),
goods_qty=int(data['goods_qty'][j]),
goods_weight=goods_weight,
goods_volume=goods_volume,
creater=str(data['creater']))
post_data_list.append(post_data)
weight_list.append(goods_weight)
volume_list.append(goods_volume)
total_weight = sumOfList(weight_list, len(weight_list))
total_volume = sumOfList(volume_list, len(volume_list))
supplier_city = supplier.objects.filter(openid=self.request.auth.openid,
supplier_name=str(data['supplier']),
is_delete=False).first().supplier_city
warehouse_city = warehouse.objects.filter(openid=self.request.auth.openid).first().warehouse_city
transportation_fee = transportation.objects.filter(
Q(openid=self.request.auth.openid, send_city__icontains=supplier_city,
receiver_city__icontains=warehouse_city,
is_delete=False) | Q(openid='init_data', send_city__icontains=supplier_city,
receiver_city__icontains=warehouse_city,
is_delete=False))
transportation_res = {
"detail": []
}
if len(transportation_fee) >= 1:
transportation_list = []
for k in range(len(transportation_fee)):
transportation_cost = transportation_calculate(total_weight,
total_volume,
transportation_fee[k].weight_fee,
transportation_fee[k].volume_fee,
transportation_fee[k].min_payment)
transportation_detail = {
"transportation_supplier": transportation_fee[k].transportation_supplier,
"transportation_cost": transportation_cost
}
transportation_list.append(transportation_detail)
transportation_res['detail'] = transportation_list
AsnDetailModel.objects.bulk_create(post_data_list, batch_size=100)
AsnListModel.objects.filter(openid=self.request.auth.openid, asn_code=str(data['asn_code'])).update(
supplier=str(data['supplier']), total_weight=total_weight, total_volume=total_volume,
transportation_fee=transportation_res)
return Response({"detail": "success"}, status=200)
else:
raise APIException({"detail": "Supplier does not exist"})
else:
raise APIException({"detail": "This ASN Status Is Not 1"})
class AsnViewPrintViewSet(viewsets.ModelViewSet):
"""
retrieve:
Return a single data record (GET)
"""
pagination_class = MyPageNumberPagination
filter_backends = [DjangoFilterBackend, OrderingFilter, ]
ordering_fields = ['id', "create_time", "update_time", ]
filter_class = AsnListFilter
def get_project(self):
try:
id = self.kwargs.get('pk')
return id
except Exception:
return None
def get_queryset(self):
id = self.get_project()
if self.request.user:
if id is None:
return AsnListModel.objects.filter(openid=self.request.auth.openid, is_delete=False)
else:
return AsnListModel.objects.filter(openid=self.request.auth.openid, id=id, is_delete=False)
else:
return AsnListModel.objects.none()
def get_serializer_class(self):
if self.action in ['retrieve']:
return serializers.ASNDetailGetSerializer
else:
return self.http_method_not_allowed(request=self.request)
def retrieve(self, request, pk):
qs = self.get_object()
if qs.openid != self.request.auth.openid:
raise APIException({"detail": "Cannot update data that is not yours"})
else:
context = {}
asn_detail_list = AsnDetailModel.objects.filter(openid=self.request.auth.openid,
asn_code=qs.asn_code)
asn_detail = serializers.ASNDetailGetSerializer(asn_detail_list, many=True)
supplier_detail = supplier.objects.filter(openid=self.request.auth.openid,
supplier_name=qs.supplier).first()
warehouse_detail = warehouse.objects.filter(openid=self.request.auth.openid,).first()
context['asn_detail'] = asn_detail.data
context['supplier_detail'] = {
"supplier_name": supplier_detail.supplier_name,
"supplier_city": supplier_detail.supplier_city,
"supplier_address": supplier_detail.supplier_address,
"supplier_contact": supplier_detail.supplier_contact
}
context['warehouse_detail'] = {
"warehouse_name": warehouse_detail.warehouse_name,
"warehouse_city": warehouse_detail.warehouse_city,
"warehouse_address": warehouse_detail.warehouse_address,
"warehouse_contact": warehouse_detail.warehouse_contact
}
return Response(context, status=200)
class AsnPreLoadViewSet(viewsets.ModelViewSet):
"""
retrieve:
Return a single data record (GET)
"""
pagination_class = MyPageNumberPagination
filter_backends = [DjangoFilterBackend, OrderingFilter, ]
ordering_fields = ['id', "create_time", "update_time", ]
filter_class = AsnListFilter
def get_project(self):
try:
id = self.kwargs.get('pk')
return id
except Exception:
return None
def get_queryset(self):
id = self.get_project()
if self.request.user:
if id is None:
return AsnListModel.objects.filter(openid=self.request.auth.openid, is_delete=False)
else:
return AsnListModel.objects.filter(openid=self.request.auth.openid, id=id, is_delete=False)
else:
return AsnListModel.objects.none()
def get_serializer_class(self):
if self.action in ['create']:
return serializers.ASNListPartialUpdateSerializer
else:
return self.http_method_not_allowed(request=self.request)
def create(self, request, pk):
qs = self.get_object()
if qs.openid != self.request.auth.openid:
raise APIException({"detail": "Cannot delete data that is not yours"})
else:
if qs.asn_status == 1:
if AsnDetailModel.objects.filter(openid=self.request.auth.openid, asn_code=qs.asn_code,
asn_status=1, is_delete=False).exists():
qs.asn_status = 2
asn_detail_list = AsnDetailModel.objects.filter(openid=self.request.auth.openid, asn_code=qs.asn_code,
asn_status=1, is_delete=False)
for i in range(len(asn_detail_list)):
goods_qty_change = stocklist.objects.filter(openid=self.request.auth.openid,
goods_code=str(asn_detail_list[i].goods_code)).first()
goods_qty_change.asn_stock = goods_qty_change.asn_stock - asn_detail_list[i].goods_qty
if goods_qty_change.asn_stock < 0:
goods_qty_change.asn_stock = 0
goods_qty_change.pre_load_stock = goods_qty_change.pre_load_stock + asn_detail_list[i].goods_qty
goods_qty_change.save()
asn_detail_list.update(asn_status=2)
qs.save()
serializer = self.get_serializer(qs, many=False)
headers = self.get_success_headers(serializer.data)
return Response(serializer.data, status=200, headers=headers)
else:
raise APIException({"detail": "Please Enter The ASN Detail"})
else:
raise APIException({"detail": "This ASN Status Is Not 1"})
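As the ASN status advances, these views move quantities between stock buckets on the `StockListModel` row: `asn_stock` (status 1) to `pre_load_stock` (status 2) to `pre_sort_stock` (status 3) to `sorted_stock`, clamping the source bucket at zero. A tiny sketch of that transfer step (function and dict names are ours for illustration):

```python
def move_stock(buckets, src, dst, qty):
    """Move qty units between stock buckets, clamping the source at zero,
    mirroring how the ASN views shift quantities as the status advances
    (asn_stock -> pre_load_stock -> pre_sort_stock -> sorted_stock).
    """
    buckets[src] = max(buckets[src] - qty, 0)
    buckets[dst] = buckets[dst] + qty
    return buckets

stock = {'asn_stock': 10, 'pre_load_stock': 0, 'pre_sort_stock': 0, 'sorted_stock': 0}
move_stock(stock, 'asn_stock', 'pre_load_stock', 10)       # ASN confirmed for loading
move_stock(stock, 'pre_load_stock', 'pre_sort_stock', 10)  # arrived, awaiting sorting
print(stock)  # {'asn_stock': 0, 'pre_load_stock': 0, 'pre_sort_stock': 10, 'sorted_stock': 0}
```

The clamp matches the views' defensive `if ... < 0: ... = 0` checks, which keep a bucket from going negative if quantities drift out of sync.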
class AsnPreSortViewSet(viewsets.ModelViewSet):
"""
retrieve:
Return a single data record (GET)
"""
pagination_class = MyPageNumberPagination
filter_backends = [DjangoFilterBackend, OrderingFilter, ]
ordering_fields = ['id', "create_time", "update_time", ]
filter_class = AsnListFilter
def get_project(self):
try:
id = self.kwargs.get('pk')
return id
except Exception:
return None
def get_queryset(self):
id = self.get_project()
if self.request.user:
if id is None:
return AsnListModel.objects.filter(openid=self.request.auth.openid, is_delete=False)
else:
return AsnListModel.objects.filter(openid=self.request.auth.openid, id=id, is_delete=False)
else:
return AsnListModel.objects.none()
def get_serializer_class(self):
if self.action in ['create']:
return serializers.ASNListUpdateSerializer
else:
return self.http_method_not_allowed(request=self.request)
def create(self, request, pk):
qs = self.get_object()
if qs.openid != self.request.auth.openid:
raise APIException({"detail": "Cannot delete data that is not yours"})
else:
if qs.asn_status == 2:
qs.asn_status = 3
asn_detail_list = AsnDetailModel.objects.filter(openid=self.request.auth.openid, asn_code=qs.asn_code,
asn_status=2, is_delete=False)
for i in range(len(asn_detail_list)):
goods_qty_change = stocklist.objects.filter(openid=self.request.auth.openid,
goods_code=str(asn_detail_list[i].goods_code)).first()
goods_qty_change.pre_load_stock = goods_qty_change.pre_load_stock - asn_detail_list[i].goods_qty
if goods_qty_change.pre_load_stock < 0:
goods_qty_change.pre_load_stock = 0
goods_qty_change.pre_sort_stock = goods_qty_change.pre_sort_stock + asn_detail_list[i].goods_qty
goods_qty_change.save()
asn_detail_list.update(asn_status=3)
qs.save()
serializer = self.get_serializer(qs, many=False)
headers = self.get_success_headers(serializer.data)
return Response(serializer.data, status=200, headers=headers)
else:
raise APIException({"detail": "This ASN Status Is Not 2"})
class AsnSortedViewSet(viewsets.ModelViewSet):
"""
create:
Finish sorting for an ASN (POST)
"""
pagination_class = MyPageNumberPagination
filter_backends = [DjangoFilterBackend, OrderingFilter, ]
ordering_fields = ['id', "create_time", "update_time", ]
filter_class = AsnListFilter
def get_project(self):
try:
id = self.kwargs.get('pk')
return id
except Exception:
return None
def get_queryset(self):
id = self.get_project()
if self.request.user:
if id is None:
return AsnListModel.objects.filter(openid=self.request.auth.openid, is_delete=False)
else:
return AsnListModel.objects.filter(openid=self.request.auth.openid, id=id, is_delete=False)
else:
return AsnListModel.objects.none()
def get_serializer_class(self):
if self.action in ['create']:
return serializers.ASNSortedPostSerializer
else:
return self.http_method_not_allowed(request=self.request)
def create(self, request, pk):
qs = self.get_object()
if qs.asn_status != 3:
raise APIException({"detail": "This ASN Status Is Not 3"})
else:
data = self.request.data
            # Validate every goods line first, so nothing is written if any line is invalid
            for item in data['goodsData']:
                check_data = {
                    'openid': self.request.auth.openid,
                    'asn_code': str(data['asn_code']),
                    'supplier': str(data['supplier']),
                    'goods_code': str(item.get('goods_code')),
                    'goods_qty': int(item.get('goods_actual_qty')),
                    'creater': str(data['creater'])
                }
                serializer = self.get_serializer(data=check_data)
                serializer.is_valid(raise_exception=True)
            for item in data['goodsData']:
                actual_qty = int(item.get('goods_actual_qty'))
                goods_qty_change = stocklist.objects.filter(openid=self.request.auth.openid,
                                                            goods_code=str(item.get('goods_code'))).first()
                asn_detail = AsnDetailModel.objects.filter(openid=self.request.auth.openid,
                                                           asn_code=str(data['asn_code']),
                                                           asn_status=3, supplier=str(data['supplier']),
                                                           goods_code=str(item.get('goods_code'))).first()
                if actual_qty == 0:
                    # Nothing arrived: the whole ordered quantity becomes a shortage
                    asn_detail.goods_actual_qty = 0
                    asn_detail.goods_shortage_qty = asn_detail.goods_qty
                    goods_qty_change.goods_qty = goods_qty_change.goods_qty - asn_detail.goods_qty
                    goods_qty_change.pre_sort_stock = goods_qty_change.pre_sort_stock - asn_detail.goods_qty
                    asn_detail.asn_status = 5
                    asn_detail.save()
                    goods_qty_change.save()
                    if goods_qty_change.goods_qty == 0 and goods_qty_change.back_order_stock == 0:
                        goods_qty_change.delete()
                else:
                    asn_detail.goods_actual_qty = actual_qty
                    goods_qty_check = asn_detail.goods_qty - actual_qty
                    if goods_qty_check > 0:
                        # Short delivery
                        asn_detail.goods_shortage_qty = goods_qty_check
                        asn_detail.goods_more_qty = 0
                        goods_qty_change.goods_qty = goods_qty_change.goods_qty - goods_qty_check
                        goods_qty_change.pre_sort_stock = goods_qty_change.pre_sort_stock - asn_detail.goods_qty
                        goods_qty_change.sorted_stock = goods_qty_change.sorted_stock + actual_qty
                    elif goods_qty_check == 0:
                        # Exact delivery
                        asn_detail.goods_shortage_qty = 0
                        asn_detail.goods_more_qty = 0
                        goods_qty_change.pre_sort_stock = goods_qty_change.pre_sort_stock - actual_qty
                        goods_qty_change.sorted_stock = goods_qty_change.sorted_stock + actual_qty
                    else:
                        # Over delivery
                        asn_detail.goods_shortage_qty = 0
                        asn_detail.goods_more_qty = abs(goods_qty_check)
                        goods_qty_change.goods_qty = goods_qty_change.goods_qty + abs(goods_qty_check)
                        goods_qty_change.pre_sort_stock = goods_qty_change.pre_sort_stock - asn_detail.goods_qty
                        goods_qty_change.sorted_stock = goods_qty_change.sorted_stock + actual_qty
                    asn_detail.asn_status = 4
                    asn_detail.save()
                    goods_qty_change.save()
                    if goods_qty_change.goods_qty == 0 and goods_qty_change.back_order_stock == 0:
                        goods_qty_change.delete()
if AsnDetailModel.objects.filter(openid=self.request.auth.openid, asn_code=str(data['asn_code']),
asn_status=4, supplier=str(data['supplier'])).exists():
qs.asn_status = 4
else:
qs.asn_status = 5
qs.save()
return Response({"detail": "success"}, status=200)
class MoveToBinViewSet(viewsets.ModelViewSet):
"""
create:
    Create a data line (post)
"""
pagination_class = MyPageNumberPagination
filter_backends = [DjangoFilterBackend, OrderingFilter, ]
ordering_fields = ['id', "create_time", "update_time", ]
filter_class = AsnDetailFilter
    def get_project(self):
        # kwargs.get never raises, so no try/except is needed
        return self.kwargs.get('pk')
def get_queryset(self):
id = self.get_project()
if self.request.user:
if id is None:
return AsnDetailModel.objects.filter(openid=self.request.auth.openid, is_delete=False)
else:
return AsnDetailModel.objects.filter(openid=self.request.auth.openid, id=id, is_delete=False)
else:
return AsnDetailModel.objects.none()
def get_serializer_class(self):
if self.action in ['retrieve']:
return serializers.ASNDetailGetSerializer
elif self.action in ['create']:
return serializers.MoveToBinSerializer
else:
return self.http_method_not_allowed(request=self.request)
def create(self, request, pk):
qs = self.get_object()
if qs.openid != self.request.auth.openid:
            raise APIException({"detail": "Cannot operate on data that is not yours"})
else:
if qs.asn_status != 4:
raise APIException({"detail": "This ASN Status Is Not 4"})
else:
data = self.request.data
if 'bin_name' not in data:
raise APIException({"detail": "Please Enter the Bin Name"})
else:
                    bin_detail = binset.objects.filter(openid=self.request.auth.openid,
                                                       bin_name=str(data['bin_name'])).first()
                    if bin_detail is None:
                        raise APIException({"detail": "Bin Name Does Not Exist"})
                    asn_detail = AsnListModel.objects.filter(openid=self.request.auth.openid,
                                                             asn_code=str(data['asn_code'])).first()
                    goods_qty_change = stocklist.objects.filter(openid=self.request.auth.openid,
                                                                goods_code=str(data['goods_code'])).first()
if int(data['qty']) <= 0:
                        raise APIException({"detail": "Move QTY Must Be Greater Than 0"})
else:
move_qty = qs.goods_actual_qty - qs.sorted_qty - int(data['qty'])
if move_qty > 0:
qs.sorted_qty = qs.sorted_qty + int(data['qty'])
goods_qty_change.sorted_stock = goods_qty_change.sorted_stock - int(data['qty'])
goods_qty_change.onhand_stock = goods_qty_change.onhand_stock + int(data['qty'])
if bin_detail.bin_property == 'Damage':
goods_qty_change.damage_stock = goods_qty_change.damage_stock + int(data['qty'])
qs.goods_damage_qty = qs.goods_damage_qty + int(data['qty'])
elif bin_detail.bin_property == 'Inspection':
goods_qty_change.inspect_stock = goods_qty_change.inspect_stock + int(data['qty'])
elif bin_detail.bin_property == 'Holding':
goods_qty_change.hold_stock = goods_qty_change.hold_stock + int(data['qty'])
else:
goods_qty_change.can_order_stock = goods_qty_change.can_order_stock + int(data['qty'])
qs.save()
goods_qty_change.save()
stockbin.objects.create(openid=self.request.auth.openid,
bin_name=str(data['bin_name']),
goods_code=str(data['goods_code']),
goods_desc=goods_qty_change.goods_desc,
goods_qty=int(data['qty']),
bin_size=bin_detail.bin_size,
bin_property=bin_detail.bin_property,
t_code=Md5.md5(str(data['goods_code'])),
create_time=qs.create_time
)
                            if bin_detail.empty_label:
                                bin_detail.empty_label = False
                                bin_detail.save()
elif move_qty == 0:
qs.sorted_qty = qs.sorted_qty + int(data['qty'])
qs.asn_status = 5
goods_qty_change.sorted_stock = goods_qty_change.sorted_stock - int(data['qty'])
goods_qty_change.onhand_stock = goods_qty_change.onhand_stock + int(data['qty'])
if bin_detail.bin_property == 'Damage':
goods_qty_change.damage_stock = goods_qty_change.damage_stock + int(data['qty'])
qs.goods_damage_qty = qs.goods_damage_qty + int(data['qty'])
elif bin_detail.bin_property == 'Inspection':
goods_qty_change.inspect_stock = goods_qty_change.inspect_stock + int(data['qty'])
elif bin_detail.bin_property == 'Holding':
goods_qty_change.hold_stock = goods_qty_change.hold_stock + int(data['qty'])
else:
goods_qty_change.can_order_stock = goods_qty_change.can_order_stock + int(data['qty'])
qs.save()
goods_qty_change.save()
                            # Close the ASN header once no detail line remains in status 4
                            if not AsnDetailModel.objects.filter(openid=self.request.auth.openid,
                                                                 asn_code=str(data['asn_code']),
                                                                 asn_status=4).exists():
                                asn_detail.asn_status = 5
                                asn_detail.save()
stockbin.objects.create(openid=self.request.auth.openid,
bin_name=str(data['bin_name']),
goods_code=str(data['goods_code']),
goods_desc=goods_qty_change.goods_desc,
goods_qty=int(data['qty']),
bin_size=bin_detail.bin_size,
bin_property=bin_detail.bin_property,
t_code=Md5.md5(str(data['goods_code'])),
create_time=qs.create_time)
                            if bin_detail.empty_label:
                                bin_detail.empty_label = False
                                bin_detail.save()
elif move_qty < 0:
                            raise APIException({"detail": "Move Qty cannot exceed the remaining actual arrived Qty"})
return Response({"detail": "success"}, status=200)
class FileListDownloadView(viewsets.ModelViewSet):
renderer_classes = (FileListRenderCN, ) + tuple(api_settings.DEFAULT_RENDERER_CLASSES)
filter_backends = [DjangoFilterBackend, OrderingFilter, ]
ordering_fields = ['id', "create_time", "update_time", ]
filter_class = AsnListFilter
    def get_project(self):
        # kwargs.get never raises, so no try/except is needed
        return self.kwargs.get('pk')
def get_queryset(self):
id = self.get_project()
if self.request.user:
if id is None:
return AsnListModel.objects.filter(openid=self.request.auth.openid, is_delete=False)
else:
return AsnListModel.objects.filter(openid=self.request.auth.openid, id=id, is_delete=False)
else:
return AsnListModel.objects.none()
def get_serializer_class(self):
if self.action in ['list']:
return serializers.FileListRenderSerializer
else:
return self.http_method_not_allowed(request=self.request)
def get_lang(self, data):
lang = self.request.META.get('HTTP_LANGUAGE')
if lang:
if lang == 'zh-hans':
return FileListRenderCN().render(data)
else:
return FileListRenderEN().render(data)
else:
return FileListRenderEN().render(data)
def list(self, request, *args, **kwargs):
from datetime import datetime
dt = datetime.now()
data = (
FileListRenderSerializer(instance).data
for instance in self.filter_queryset(self.get_queryset())
)
renderer = self.get_lang(data)
response = StreamingHttpResponse(
renderer,
content_type="text/csv"
)
        response['Content-Disposition'] = 'attachment; filename="asnlist_{}.csv"'.format(dt.strftime('%Y%m%d%H%M%S%f'))
return response
class FileDetailDownloadView(viewsets.ModelViewSet):
serializer_class = serializers.FileDetailRenderSerializer
renderer_classes = (FileDetailRenderCN, ) + tuple(api_settings.DEFAULT_RENDERER_CLASSES)
filter_backends = [DjangoFilterBackend, OrderingFilter, ]
ordering_fields = ['id', "create_time", "update_time", ]
filter_class = AsnDetailFilter
    def get_project(self):
        # kwargs.get never raises, so no try/except is needed
        return self.kwargs.get('pk')
def get_queryset(self):
id = self.get_project()
if self.request.user:
if id is None:
return AsnDetailModel.objects.filter(openid=self.request.auth.openid, is_delete=False)
else:
return AsnDetailModel.objects.filter(openid=self.request.auth.openid, id=id, is_delete=False)
else:
return AsnDetailModel.objects.none()
def get_serializer_class(self):
if self.action == 'list':
return serializers.FileDetailRenderSerializer
else:
return self.http_method_not_allowed(request=self.request)
def get_lang(self, data):
lang = self.request.META.get('HTTP_LANGUAGE')
if lang:
if lang == 'zh-hans':
return FileDetailRenderCN().render(data)
else:
return FileDetailRenderEN().render(data)
else:
return FileDetailRenderEN().render(data)
def list(self, request, *args, **kwargs):
from datetime import datetime
dt = datetime.now()
data = (
FileDetailRenderSerializer(instance).data
for instance in self.filter_queryset(self.get_queryset())
)
renderer = self.get_lang(data)
response = StreamingHttpResponse(
renderer,
content_type="text/csv"
)
        response['Content-Disposition'] = 'attachment; filename="asndetail_{}.csv"'.format(dt.strftime('%Y%m%d%H%M%S%f'))
return response
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import json
import warnings
import pulumi
import pulumi.runtime
from typing import Union
from .. import utilities, tables
class Queue(pulumi.CustomResource):
arn: pulumi.Output[str]
"""
The ARN of the SQS queue
"""
content_based_deduplication: pulumi.Output[bool]
"""
Enables content-based deduplication for FIFO queues. For more information, see the [related documentation](http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html#FIFO-queues-exactly-once-processing)
"""
delay_seconds: pulumi.Output[float]
"""
The time in seconds that the delivery of all messages in the queue will be delayed. An integer from 0 to 900 (15 minutes). The default for this attribute is 0 seconds.
"""
fifo_queue: pulumi.Output[bool]
"""
Boolean designating a FIFO queue. If not set, it defaults to `false` making it standard.
"""
kms_data_key_reuse_period_seconds: pulumi.Output[float]
"""
The length of time, in seconds, for which Amazon SQS can reuse a data key to encrypt or decrypt messages before calling AWS KMS again. An integer representing seconds, between 60 seconds (1 minute) and 86,400 seconds (24 hours). The default is 300 (5 minutes).
"""
kms_master_key_id: pulumi.Output[str]
"""
The ID of an AWS-managed customer master key (CMK) for Amazon SQS or a custom CMK. For more information, see [Key Terms](http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-server-side-encryption.html#sqs-sse-key-terms).
"""
max_message_size: pulumi.Output[float]
"""
The limit of how many bytes a message can contain before Amazon SQS rejects it. An integer from 1024 bytes (1 KiB) up to 262144 bytes (256 KiB). The default for this attribute is 262144 (256 KiB).
"""
message_retention_seconds: pulumi.Output[float]
"""
The number of seconds Amazon SQS retains a message. Integer representing seconds, from 60 (1 minute) to 1209600 (14 days). The default for this attribute is 345600 (4 days).
"""
name: pulumi.Output[str]
"""
This is the human-readable name of the queue. If omitted, this provider will assign a random name.
"""
name_prefix: pulumi.Output[str]
"""
Creates a unique name beginning with the specified prefix. Conflicts with `name`.
"""
policy: pulumi.Output[str]
receive_wait_time_seconds: pulumi.Output[float]
"""
The time for which a ReceiveMessage call will wait for a message to arrive (long polling) before returning. An integer from 0 to 20 (seconds). The default for this attribute is 0, meaning that the call will return immediately.
"""
redrive_policy: pulumi.Output[str]
"""
The JSON policy to set up the Dead Letter Queue, see [AWS docs](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/SQSDeadLetterQueue.html). **Note:** when specifying `maxReceiveCount`, you must specify it as an integer (`5`), and not a string (`"5"`).
"""
tags: pulumi.Output[dict]
"""
A mapping of tags to assign to the queue.
"""
visibility_timeout_seconds: pulumi.Output[float]
"""
The visibility timeout for the queue. An integer from 0 to 43200 (12 hours). The default for this attribute is 30. For more information about visibility timeout, see [AWS docs](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/AboutVT.html).
"""
def __init__(__self__, resource_name, opts=None, content_based_deduplication=None, delay_seconds=None, fifo_queue=None, kms_data_key_reuse_period_seconds=None, kms_master_key_id=None, max_message_size=None, message_retention_seconds=None, name=None, name_prefix=None, policy=None, receive_wait_time_seconds=None, redrive_policy=None, tags=None, visibility_timeout_seconds=None, __props__=None, __name__=None, __opts__=None):
"""
Create a Queue resource with the given unique name, props, and options.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] content_based_deduplication: Enables content-based deduplication for FIFO queues. For more information, see the [related documentation](http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html#FIFO-queues-exactly-once-processing)
:param pulumi.Input[float] delay_seconds: The time in seconds that the delivery of all messages in the queue will be delayed. An integer from 0 to 900 (15 minutes). The default for this attribute is 0 seconds.
:param pulumi.Input[bool] fifo_queue: Boolean designating a FIFO queue. If not set, it defaults to `false` making it standard.
:param pulumi.Input[float] kms_data_key_reuse_period_seconds: The length of time, in seconds, for which Amazon SQS can reuse a data key to encrypt or decrypt messages before calling AWS KMS again. An integer representing seconds, between 60 seconds (1 minute) and 86,400 seconds (24 hours). The default is 300 (5 minutes).
:param pulumi.Input[str] kms_master_key_id: The ID of an AWS-managed customer master key (CMK) for Amazon SQS or a custom CMK. For more information, see [Key Terms](http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-server-side-encryption.html#sqs-sse-key-terms).
:param pulumi.Input[float] max_message_size: The limit of how many bytes a message can contain before Amazon SQS rejects it. An integer from 1024 bytes (1 KiB) up to 262144 bytes (256 KiB). The default for this attribute is 262144 (256 KiB).
:param pulumi.Input[float] message_retention_seconds: The number of seconds Amazon SQS retains a message. Integer representing seconds, from 60 (1 minute) to 1209600 (14 days). The default for this attribute is 345600 (4 days).
:param pulumi.Input[str] name: This is the human-readable name of the queue. If omitted, this provider will assign a random name.
:param pulumi.Input[str] name_prefix: Creates a unique name beginning with the specified prefix. Conflicts with `name`.
:param pulumi.Input[float] receive_wait_time_seconds: The time for which a ReceiveMessage call will wait for a message to arrive (long polling) before returning. An integer from 0 to 20 (seconds). The default for this attribute is 0, meaning that the call will return immediately.
:param pulumi.Input[str] redrive_policy: The JSON policy to set up the Dead Letter Queue, see [AWS docs](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/SQSDeadLetterQueue.html). **Note:** when specifying `maxReceiveCount`, you must specify it as an integer (`5`), and not a string (`"5"`).
:param pulumi.Input[dict] tags: A mapping of tags to assign to the queue.
:param pulumi.Input[float] visibility_timeout_seconds: The visibility timeout for the queue. An integer from 0 to 43200 (12 hours). The default for this attribute is 30. For more information about visibility timeout, see [AWS docs](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/AboutVT.html).
> This content is derived from https://github.com/terraform-providers/terraform-provider-aws/blob/master/website/docs/r/sqs_queue.html.markdown.
"""
if __name__ is not None:
warnings.warn("explicit use of __name__ is deprecated", DeprecationWarning)
resource_name = __name__
if __opts__ is not None:
warnings.warn("explicit use of __opts__ is deprecated, use 'opts' instead", DeprecationWarning)
opts = __opts__
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = dict()
__props__['content_based_deduplication'] = content_based_deduplication
__props__['delay_seconds'] = delay_seconds
__props__['fifo_queue'] = fifo_queue
__props__['kms_data_key_reuse_period_seconds'] = kms_data_key_reuse_period_seconds
__props__['kms_master_key_id'] = kms_master_key_id
__props__['max_message_size'] = max_message_size
__props__['message_retention_seconds'] = message_retention_seconds
__props__['name'] = name
__props__['name_prefix'] = name_prefix
__props__['policy'] = policy
__props__['receive_wait_time_seconds'] = receive_wait_time_seconds
__props__['redrive_policy'] = redrive_policy
__props__['tags'] = tags
__props__['visibility_timeout_seconds'] = visibility_timeout_seconds
__props__['arn'] = None
super(Queue, __self__).__init__(
'aws:sqs/queue:Queue',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name, id, opts=None, arn=None, content_based_deduplication=None, delay_seconds=None, fifo_queue=None, kms_data_key_reuse_period_seconds=None, kms_master_key_id=None, max_message_size=None, message_retention_seconds=None, name=None, name_prefix=None, policy=None, receive_wait_time_seconds=None, redrive_policy=None, tags=None, visibility_timeout_seconds=None):
"""
Get an existing Queue resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param str id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] arn: The ARN of the SQS queue
:param pulumi.Input[bool] content_based_deduplication: Enables content-based deduplication for FIFO queues. For more information, see the [related documentation](http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html#FIFO-queues-exactly-once-processing)
:param pulumi.Input[float] delay_seconds: The time in seconds that the delivery of all messages in the queue will be delayed. An integer from 0 to 900 (15 minutes). The default for this attribute is 0 seconds.
:param pulumi.Input[bool] fifo_queue: Boolean designating a FIFO queue. If not set, it defaults to `false` making it standard.
:param pulumi.Input[float] kms_data_key_reuse_period_seconds: The length of time, in seconds, for which Amazon SQS can reuse a data key to encrypt or decrypt messages before calling AWS KMS again. An integer representing seconds, between 60 seconds (1 minute) and 86,400 seconds (24 hours). The default is 300 (5 minutes).
:param pulumi.Input[str] kms_master_key_id: The ID of an AWS-managed customer master key (CMK) for Amazon SQS or a custom CMK. For more information, see [Key Terms](http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-server-side-encryption.html#sqs-sse-key-terms).
:param pulumi.Input[float] max_message_size: The limit of how many bytes a message can contain before Amazon SQS rejects it. An integer from 1024 bytes (1 KiB) up to 262144 bytes (256 KiB). The default for this attribute is 262144 (256 KiB).
:param pulumi.Input[float] message_retention_seconds: The number of seconds Amazon SQS retains a message. Integer representing seconds, from 60 (1 minute) to 1209600 (14 days). The default for this attribute is 345600 (4 days).
:param pulumi.Input[str] name: This is the human-readable name of the queue. If omitted, this provider will assign a random name.
:param pulumi.Input[str] name_prefix: Creates a unique name beginning with the specified prefix. Conflicts with `name`.
:param pulumi.Input[float] receive_wait_time_seconds: The time for which a ReceiveMessage call will wait for a message to arrive (long polling) before returning. An integer from 0 to 20 (seconds). The default for this attribute is 0, meaning that the call will return immediately.
:param pulumi.Input[str] redrive_policy: The JSON policy to set up the Dead Letter Queue, see [AWS docs](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/SQSDeadLetterQueue.html). **Note:** when specifying `maxReceiveCount`, you must specify it as an integer (`5`), and not a string (`"5"`).
:param pulumi.Input[dict] tags: A mapping of tags to assign to the queue.
:param pulumi.Input[float] visibility_timeout_seconds: The visibility timeout for the queue. An integer from 0 to 43200 (12 hours). The default for this attribute is 30. For more information about visibility timeout, see [AWS docs](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/AboutVT.html).
> This content is derived from https://github.com/terraform-providers/terraform-provider-aws/blob/master/website/docs/r/sqs_queue.html.markdown.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = dict()
__props__["arn"] = arn
__props__["content_based_deduplication"] = content_based_deduplication
__props__["delay_seconds"] = delay_seconds
__props__["fifo_queue"] = fifo_queue
__props__["kms_data_key_reuse_period_seconds"] = kms_data_key_reuse_period_seconds
__props__["kms_master_key_id"] = kms_master_key_id
__props__["max_message_size"] = max_message_size
__props__["message_retention_seconds"] = message_retention_seconds
__props__["name"] = name
__props__["name_prefix"] = name_prefix
__props__["policy"] = policy
__props__["receive_wait_time_seconds"] = receive_wait_time_seconds
__props__["redrive_policy"] = redrive_policy
__props__["tags"] = tags
__props__["visibility_timeout_seconds"] = visibility_timeout_seconds
return Queue(resource_name, opts=opts, __props__=__props__)
def translate_output_property(self, prop):
return tables._CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
def translate_input_property(self, prop):
return tables._SNAKE_TO_CAMEL_CASE_TABLE.get(prop) or prop
Tests whether SAM Config is being read by all CLI commands
"""
import json
import os
import shutil
import tempfile
from pathlib import Path
from contextlib import contextmanager
from samcli.commands._utils.experimental import ExperimentalFlag, set_experimental
from samcli.lib.config.samconfig import SamConfig, DEFAULT_ENV
from click.testing import CliRunner
from unittest import TestCase
from unittest.mock import patch, ANY
import logging
from samcli.lib.utils.packagetype import ZIP, IMAGE
LOG = logging.getLogger()
logging.basicConfig()
class TestSamConfigForAllCommands(TestCase):
def setUp(self):
self._old_cwd = os.getcwd()
self.scratch_dir = tempfile.mkdtemp()
Path(self.scratch_dir, "envvar.json").write_text("{}")
Path(self.scratch_dir, "container-envvar.json").write_text("{}")
os.chdir(self.scratch_dir)
def tearDown(self):
os.chdir(self._old_cwd)
shutil.rmtree(self.scratch_dir)
self.scratch_dir = None
@patch("samcli.commands.init.do_cli")
def test_init(self, do_cli_mock):
config_values = {
"no_interactive": True,
"location": "github.com",
"runtime": "nodejs10.x",
"dependency_manager": "maven",
"output_dir": "myoutput",
"name": "myname",
"app_template": "apptemplate",
"no_input": True,
"extra_context": '{"key": "value", "key2": "value2"}',
}
with samconfig_parameters(["init"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.init import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
ANY,
True,
"github.com",
False,
ZIP,
"nodejs10.x",
None,
None,
"maven",
"myoutput",
"myname",
"apptemplate",
True,
'{"key": "value", "key2": "value2"}',
)
@patch("samcli.commands.validate.validate.do_cli")
def test_validate(self, do_cli_mock):
config_values = {"template_file": "mytemplate.yaml"}
with samconfig_parameters(["validate"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.validate.validate import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(ANY, str(Path(os.getcwd(), "mytemplate.yaml")))
@patch("samcli.commands.build.command.do_cli")
def test_build(self, do_cli_mock):
config_values = {
"resource_logical_id": "foo",
"template_file": "mytemplate.yaml",
"base_dir": "basedir",
"build_dir": "builddir",
"cache_dir": "cachedir",
"cache": False,
"use_container": True,
"manifest": "requirements.txt",
"docker_network": "mynetwork",
"skip_pull_image": True,
"parameter_overrides": "ParameterKey=Key,ParameterValue=Value ParameterKey=Key2,ParameterValue=Value2",
"container_env_var": (""),
"container_env_var_file": "file",
"build_image": (""),
}
with samconfig_parameters(["build"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.build.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
ANY,
"foo",
str(Path(os.getcwd(), "mytemplate.yaml")),
"basedir",
"builddir",
"cachedir",
True,
True,
False,
False,
"requirements.txt",
"mynetwork",
True,
{"Key": "Value", "Key2": "Value2"},
None,
(),
"file",
(),
)
@patch("samcli.commands.build.command.do_cli")
def test_build_with_container_env_vars(self, do_cli_mock):
config_values = {
"resource_logical_id": "foo",
"template_file": "mytemplate.yaml",
"base_dir": "basedir",
"build_dir": "builddir",
"cache_dir": "cachedir",
"cache": False,
"use_container": True,
"manifest": "requirements.txt",
"docker_network": "mynetwork",
"skip_pull_image": True,
"parameter_overrides": "ParameterKey=Key,ParameterValue=Value ParameterKey=Key2,ParameterValue=Value2",
"container_env_var": (""),
"container_env_var_file": "env_vars_file",
}
with samconfig_parameters(["build"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.build.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
ANY,
"foo",
str(Path(os.getcwd(), "mytemplate.yaml")),
"basedir",
"builddir",
"cachedir",
True,
True,
False,
False,
"requirements.txt",
"mynetwork",
True,
{"Key": "Value", "Key2": "Value2"},
None,
(),
"env_vars_file",
(),
)
@patch("samcli.commands.build.command.do_cli")
def test_build_with_build_images(self, do_cli_mock):
config_values = {
"resource_logical_id": "foo",
"template_file": "mytemplate.yaml",
"base_dir": "basedir",
"build_dir": "builddir",
"cache_dir": "cachedir",
"cache": False,
"use_container": True,
"manifest": "requirements.txt",
"docker_network": "mynetwork",
"skip_pull_image": True,
"parameter_overrides": "ParameterKey=Key,ParameterValue=Value ParameterKey=Key2,ParameterValue=Value2",
"build_image": ["Function1=image_1", "image_2"],
}
with samconfig_parameters(["build"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.build.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
ANY,
"foo",
str(Path(os.getcwd(), "mytemplate.yaml")),
"basedir",
"builddir",
"cachedir",
True,
True,
False,
False,
"requirements.txt",
"mynetwork",
True,
{"Key": "Value", "Key2": "Value2"},
None,
(),
None,
("Function1=image_1", "image_2"),
)
@patch("samcli.commands.local.invoke.cli.do_cli")
def test_local_invoke(self, do_cli_mock):
config_values = {
"function_logical_id": "foo",
"template_file": "mytemplate.yaml",
"event": "event",
"no_event": False,
"env_vars": "envvar.json",
"debug_port": [1, 2, 3],
"debug_args": "args",
"debugger_path": "mypath",
"container_env_vars": "container-envvar.json",
"docker_volume_basedir": "basedir",
"docker_network": "mynetwork",
"log_file": "logfile",
"layer_cache_basedir": "basedir",
"skip_pull_image": True,
"force_image_build": True,
"shutdown": True,
"parameter_overrides": "ParameterKey=Key,ParameterValue=Value ParameterKey=Key2,ParameterValue=Value2",
"invoke_image": ["image"],
}
# NOTE: Because we don't load the full Click BaseCommand here, this is mounted as top-level command
with samconfig_parameters(["invoke"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.local.invoke.cli import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
ANY,
"foo",
str(Path(os.getcwd(), "mytemplate.yaml")),
"event",
False,
"envvar.json",
(1, 2, 3),
"args",
"mypath",
"container-envvar.json",
"basedir",
"mynetwork",
"logfile",
"basedir",
True,
True,
True,
{"Key": "Value", "Key2": "Value2"},
"localhost",
"127.0.0.1",
("image",),
)
@patch("samcli.commands.local.start_api.cli.do_cli")
def test_local_start_api(self, do_cli_mock):
config_values = {
"template_file": "mytemplate.yaml",
"host": "127.0.0.1",
"port": 12345,
"static_dir": "static_dir",
"env_vars": "envvar.json",
"debug_port": [1, 2, 3],
"debug_args": "args",
"debugger_path": "mypath",
"container_env_vars": "container-envvar.json",
"docker_volume_basedir": "basedir",
"docker_network": "mynetwork",
"log_file": "logfile",
"layer_cache_basedir": "basedir",
"skip_pull_image": True,
"force_image_build": True,
"shutdown": False,
"parameter_overrides": "ParameterKey=Key,ParameterValue=Value ParameterKey=Key2,ParameterValue=Value2",
"invoke_image": ["image"],
}
# NOTE: Because we don't load the full Click BaseCommand here, this is mounted as top-level command
with samconfig_parameters(["start-api"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.local.start_api.cli import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
ANY,
"127.0.0.1",
12345,
"static_dir",
str(Path(os.getcwd(), "mytemplate.yaml")),
"envvar.json",
(1, 2, 3),
"args",
"mypath",
"container-envvar.json",
"basedir",
"mynetwork",
"logfile",
"basedir",
True,
True,
{"Key": "Value", "Key2": "Value2"},
None,
False,
None,
"localhost",
"127.0.0.1",
("image",),
)
@patch("samcli.commands.local.start_lambda.cli.do_cli")
def test_local_start_lambda(self, do_cli_mock):
config_values = {
"template_file": "mytemplate.yaml",
"host": "127.0.0.1",
"port": 12345,
"env_vars": "envvar.json",
"debug_port": [1, 2, 3],
"debug_args": "args",
"debugger_path": "mypath",
"container_env_vars": "container-envvar.json",
"docker_volume_basedir": "basedir",
"docker_network": "mynetwork",
"log_file": "logfile",
"layer_cache_basedir": "basedir",
"skip_pull_image": True,
"force_image_build": True,
"shutdown": False,
"parameter_overrides": "ParameterKey=Key,ParameterValue=Value",
"invoke_image": ["image"],
}
# NOTE: Because we don't load the full Click BaseCommand here, this is mounted as top-level command
with samconfig_parameters(["start-lambda"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.local.start_lambda.cli import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
ANY,
"127.0.0.1",
12345,
str(Path(os.getcwd(), "mytemplate.yaml")),
"envvar.json",
(1, 2, 3),
"args",
"mypath",
"container-envvar.json",
"basedir",
"mynetwork",
"logfile",
"basedir",
True,
True,
{"Key": "Value"},
None,
False,
None,
"localhost",
"127.0.0.1",
("image",),
)
@patch("samcli.lib.cli_validation.image_repository_validation._is_all_image_funcs_provided")
@patch("samcli.lib.cli_validation.image_repository_validation.get_template_artifacts_format")
@patch("samcli.commands._utils.options.get_template_artifacts_format")
@patch("samcli.commands.package.command.do_cli")
def test_package(
self,
do_cli_mock,
get_template_artifacts_format_mock,
cli_validation_artifacts_format_mock,
is_all_image_funcs_provided_mock,
):
is_all_image_funcs_provided_mock.return_value = True
cli_validation_artifacts_format_mock.return_value = [ZIP]
get_template_artifacts_format_mock.return_value = [ZIP]
config_values = {
"template_file": "mytemplate.yaml",
"s3_bucket": "mybucket",
"force_upload": True,
"s3_prefix": "myprefix",
"image_repository": "123456789012.dkr.ecr.us-east-1.amazonaws.com/test1",
"kms_key_id": "mykms",
"use_json": True,
"metadata": '{"m1": "value1", "m2": "value2"}',
"region": "myregion",
"output_template_file": "output.yaml",
"signing_profiles": "function=profile:owner",
}
with samconfig_parameters(["package"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.package.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
str(Path(os.getcwd(), "mytemplate.yaml")),
"mybucket",
"123456789012.dkr.ecr.us-east-1.amazonaws.com/test1",
None,
"myprefix",
"mykms",
"output.yaml",
True,
True,
False,
{"m1": "value1", "m2": "value2"},
{"function": {"profile_name": "profile", "profile_owner": "owner"}},
"myregion",
None,
False,
)
@patch("samcli.commands._utils.options.get_template_artifacts_format")
@patch("samcli.commands.package.command.do_cli")
def test_package_with_image_repository_and_image_repositories(
self, do_cli_mock, get_template_artifacts_format_mock
):
get_template_artifacts_format_mock.return_value = [IMAGE]
config_values = {
"template_file": "mytemplate.yaml",
"s3_bucket": "mybucket",
"force_upload": True,
"s3_prefix": "myprefix",
"image_repository": "123456789012.dkr.ecr.us-east-1.amazonaws.com/test1",
"image_repositories": ["HelloWorldFunction=123456789012.dkr.ecr.us-east-1.amazonaws.com/test1"],
"kms_key_id": "mykms",
"use_json": True,
"metadata": '{"m1": "value1", "m2": "value2"}',
"region": "myregion",
"output_template_file": "output.yaml",
"signing_profiles": "function=profile:owner",
}
with samconfig_parameters(["package"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.package.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
self.assertIsNotNone(result.exception)
@patch("samcli.lib.cli_validation.image_repository_validation.get_template_artifacts_format")
@patch("samcli.commands._utils.template.get_template_artifacts_format")
@patch("samcli.commands._utils.options.get_template_artifacts_format")
@patch("samcli.commands.deploy.command.do_cli")
def test_deploy(self, do_cli_mock, template_artifacts_mock1, template_artifacts_mock2, template_artifacts_mock3):
template_artifacts_mock1.return_value = [ZIP]
template_artifacts_mock2.return_value = [ZIP]
template_artifacts_mock3.return_value = [ZIP]
config_values = {
"template_file": "mytemplate.yaml",
"stack_name": "mystack",
"s3_bucket": "mybucket",
"image_repository": "123456789012.dkr.ecr.us-east-1.amazonaws.com/test1",
"force_upload": True,
"s3_prefix": "myprefix",
"kms_key_id": "mykms",
"parameter_overrides": "ParameterKey=Key,ParameterValue=Value",
"capabilities": "cap1 cap2",
"no_execute_changeset": True,
"role_arn": "arn",
"notification_arns": "notify1 notify2",
"fail_on_empty_changeset": True,
"use_json": True,
"tags": 'a=tag1 b="tag with spaces"',
"metadata": '{"m1": "value1", "m2": "value2"}',
"guided": True,
"confirm_changeset": True,
"region": "myregion",
"signing_profiles": "function=profile:owner",
"disable_rollback": True,
}
with samconfig_parameters(["deploy"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.deploy.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
str(Path(os.getcwd(), "mytemplate.yaml")),
"mystack",
"mybucket",
"123456789012.dkr.ecr.us-east-1.amazonaws.com/test1",
None,
True,
False,
"myprefix",
"mykms",
{"Key": "Value"},
["cap1", "cap2"],
True,
"arn",
["notify1", "notify2"],
True,
True,
{"a": "tag1", "b": "tag with spaces"},
{"m1": "value1", "m2": "value2"},
True,
True,
"myregion",
None,
{"function": {"profile_name": "profile", "profile_owner": "owner"}},
False,
"samconfig.toml",
"default",
False,
True,
)
@patch("samcli.commands.deploy.command.do_cli")
def test_deploy_image_repositories_and_image_repository(self, do_cli_mock):
config_values = {
"template_file": "mytemplate.yaml",
"stack_name": "mystack",
"s3_bucket": "mybucket",
"image_repository": "123456789012.dkr.ecr.us-east-1.amazonaws.com/test1",
"image_repositories": ["HelloWorldFunction=123456789012.dkr.ecr.us-east-1.amazonaws.com/test1"],
"force_upload": True,
"s3_prefix": "myprefix",
"kms_key_id": "mykms",
"parameter_overrides": "ParameterKey=Key,ParameterValue=Value",
"capabilities": "cap1 cap2",
"no_execute_changeset": True,
"role_arn": "arn",
"notification_arns": "notify1 notify2",
"fail_on_empty_changeset": True,
"use_json": True,
"tags": 'a=tag1 b="tag with spaces"',
"metadata": '{"m1": "value1", "m2": "value2"}',
"guided": True,
"confirm_changeset": True,
"region": "myregion",
"signing_profiles": "function=profile:owner",
}
with samconfig_parameters(["deploy"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.deploy.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
self.assertIsNotNone(result.exception)
@patch("samcli.lib.cli_validation.image_repository_validation.get_template_artifacts_format")
@patch("samcli.commands._utils.options.get_template_artifacts_format")
@patch("samcli.commands._utils.template.get_template_artifacts_format")
@patch("samcli.commands.deploy.command.do_cli")
def test_deploy_different_parameter_override_format(
self, do_cli_mock, template_artifacts_mock1, template_artifacts_mock2, template_artifacts_mock3
):
template_artifacts_mock1.return_value = [ZIP]
template_artifacts_mock2.return_value = [ZIP]
template_artifacts_mock3.return_value = [ZIP]
config_values = {
"template_file": "mytemplate.yaml",
"stack_name": "mystack",
"s3_bucket": "mybucket",
"image_repository": "123456789012.dkr.ecr.us-east-1.amazonaws.com/test1",
"force_upload": True,
"s3_prefix": "myprefix",
"kms_key_id": "mykms",
"parameter_overrides": 'Key1=Value1 Key2="Multiple spaces in the value"',
"capabilities": "cap1 cap2",
"no_execute_changeset": True,
"role_arn": "arn",
"notification_arns": "notify1 notify2",
"fail_on_empty_changeset": True,
"use_json": True,
"tags": 'a=tag1 b="tag with spaces"',
"metadata": '{"m1": "value1", "m2": "value2"}',
"guided": True,
"confirm_changeset": True,
"region": "myregion",
"signing_profiles": "function=profile:owner",
"disable_rollback": True,
}
with samconfig_parameters(["deploy"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.deploy.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
str(Path(os.getcwd(), "mytemplate.yaml")),
"mystack",
"mybucket",
"123456789012.dkr.ecr.us-east-1.amazonaws.com/test1",
None,
True,
False,
"myprefix",
"mykms",
{"Key1": "Value1", "Key2": "Multiple spaces in the value"},
["cap1", "cap2"],
True,
"arn",
["notify1", "notify2"],
True,
True,
{"a": "tag1", "b": "tag with spaces"},
{"m1": "value1", "m2": "value2"},
True,
True,
"myregion",
None,
{"function": {"profile_name": "profile", "profile_owner": "owner"}},
False,
"samconfig.toml",
"default",
False,
True,
)
@patch("samcli.commands._utils.experimental.is_experimental_enabled")
@patch("samcli.commands.logs.command.do_cli")
def test_logs(self, do_cli_mock, experimental_mock):
config_values = {
"name": ["myfunction"],
"stack_name": "mystack",
"filter": "myfilter",
"tail": True,
"include_traces": False,
"start_time": "starttime",
"end_time": "endtime",
"region": "myregion",
}
experimental_mock.return_value = False
with samconfig_parameters(["logs"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.logs.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
("myfunction",),
"mystack",
"myfilter",
True,
False,
"starttime",
"endtime",
(),
False,
"myregion",
None,
)
@patch("samcli.commands._utils.experimental.is_experimental_enabled")
@patch("samcli.commands.logs.command.do_cli")
def test_logs_tail(self, do_cli_mock, experimental_mock):
config_values = {
"name": ["myfunction"],
"stack_name": "mystack",
"filter": "myfilter",
"tail": True,
"include_traces": True,
"start_time": "starttime",
"end_time": "endtime",
"cw_log_group": ["cw_log_group"],
"region": "myregion",
}
experimental_mock.return_value = True
with samconfig_parameters(["logs"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.logs.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
("myfunction",),
"mystack",
"myfilter",
True,
True,
"starttime",
"endtime",
("cw_log_group",),
False,
"myregion",
None,
)
@patch("samcli.commands.publish.command.do_cli")
def test_publish(self, do_cli_mock):
config_values = {"template_file": "mytemplate.yaml", "semantic_version": "0.1.1"}
with samconfig_parameters(["publish"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.publish.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(ANY, str(Path(os.getcwd(), "mytemplate.yaml")), "0.1.1")
def test_info_must_not_read_from_config(self):
config_values = {"a": "b"}
with samconfig_parameters([], self.scratch_dir, **config_values) as config_path:
from samcli.cli.main import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, ["--info"])
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
info_result = json.loads(result.output)
self.assertTrue("version" in info_result)
@patch("samcli.commands._utils.experimental.is_experimental_enabled")
@patch("samcli.lib.cli_validation.image_repository_validation._is_all_image_funcs_provided")
@patch("samcli.lib.cli_validation.image_repository_validation.get_template_artifacts_format")
@patch("samcli.commands._utils.template.get_template_artifacts_format")
@patch("samcli.commands._utils.options.get_template_artifacts_format")
@patch("samcli.commands.sync.command.do_cli")
def test_sync(
self,
do_cli_mock,
template_artifacts_mock1,
template_artifacts_mock2,
template_artifacts_mock3,
is_all_image_funcs_provided_mock,
experimental_mock,
):
template_artifacts_mock1.return_value = [ZIP]
template_artifacts_mock2.return_value = [ZIP]
template_artifacts_mock3.return_value = [ZIP]
is_all_image_funcs_provided_mock.return_value = True
experimental_mock.return_value = True
config_values = {
"template_file": "mytemplate.yaml",
"stack_name": "mystack",
"image_repository": "123456789012.dkr.ecr.us-east-1.amazonaws.com/test1",
"base_dir": "path",
"s3_prefix": "myprefix",
"kms_key_id": "mykms",
"parameter_overrides": 'Key1=Value1 Key2="Multiple spaces in the value"',
"capabilities": "cap1 cap2",
"no_execute_changeset": True,
"role_arn": "arn",
"notification_arns": "notify1 notify2",
"tags": 'a=tag1 b="tag with spaces"',
"metadata": '{"m1": "value1", "m2": "value2"}',
"guided": True,
"confirm_changeset": True,
"region": "myregion",
"signing_profiles": "function=profile:owner",
}
with samconfig_parameters(["sync"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.sync.command import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
str(Path(os.getcwd(), "mytemplate.yaml")),
False,
False,
(),
(),
True,
"mystack",
"myregion",
None,
"path",
{"Key1": "Value1", "Key2": "Multiple spaces in the value"},
None,
"123456789012.dkr.ecr.us-east-1.amazonaws.com/test1",
None,
"myprefix",
"mykms",
["cap1", "cap2"],
"arn",
["notify1", "notify2"],
{"a": "tag1", "b": "tag with spaces"},
{"m1": "value1", "m2": "value2"},
"samconfig.toml",
"default",
)
class TestSamConfigWithOverrides(TestCase):
def setUp(self):
self._old_cwd = os.getcwd()
self.scratch_dir = tempfile.mkdtemp()
Path(self.scratch_dir, "otherenvvar.json").write_text("{}")
Path(self.scratch_dir, "other-containerenvvar.json").write_text("{}")
os.chdir(self.scratch_dir)
def tearDown(self):
os.chdir(self._old_cwd)
shutil.rmtree(self.scratch_dir)
self.scratch_dir = None
@patch("samcli.commands.local.start_lambda.cli.do_cli")
def test_override_with_cli_params(self, do_cli_mock):
config_values = {
"template_file": "mytemplate.yaml",
"host": "127.0.0.1",
"port": 12345,
"env_vars": "envvar.json",
"debug_port": [1, 2, 3],
"debug_args": "args",
"debugger_path": "mypath",
"container_env_vars": "container-envvar.json",
"docker_volume_basedir": "basedir",
"docker_network": "mynetwork",
"log_file": "logfile",
"layer_cache_basedir": "basedir",
"skip_pull_image": True,
"force_image_build": True,
"shutdown": False,
"parameter_overrides": "ParameterKey=Key,ParameterValue=Value",
"invoke_image": ["image"],
}
# NOTE: Because we don't load the full Click BaseCommand here, this is mounted as top-level command
with samconfig_parameters(["start-lambda"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.local.start_lambda.cli import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(
cli,
[
"--template-file",
"othertemplate.yaml",
"--host",
"otherhost",
"--port",
9999,
"--env-vars",
"otherenvvar.json",
"--debug-port",
9,
"--debug-port",
8,
"--debug-port",
7,
"--debug-args",
"otherargs",
"--debugger-path",
"otherpath",
"--container-env-vars",
"other-containerenvvar.json",
"--docker-volume-basedir",
"otherbasedir",
"--docker-network",
"othernetwork",
"--log-file",
"otherlogfile",
"--layer-cache-basedir",
"otherbasedir",
"--skip-pull-image",
"--force-image-build",
"--shutdown",
"--parameter-overrides",
"A=123 C=D E=F12! G=H",
"--container-host",
"localhost",
"--container-host-interface",
"127.0.0.1",
],
)
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
ANY,
"otherhost",
9999,
str(Path(os.getcwd(), "othertemplate.yaml")),
"otherenvvar.json",
(9, 8, 7),
"otherargs",
"otherpath",
"other-containerenvvar.json",
"otherbasedir",
"othernetwork",
"otherlogfile",
"otherbasedir",
True,
True,
{"A": "123", "C": "D", "E": "F12!", "G": "H"},
None,
True,
None,
"localhost",
"127.0.0.1",
("image",),
)
@patch("samcli.commands.local.start_lambda.cli.do_cli")
def test_override_with_cli_params_and_envvars(self, do_cli_mock):
config_values = {
"template_file": "mytemplate.yaml",
"host": "127.0.0.1",
"port": 12345,
"env_vars": "envvar.json",
"debug_port": [1, 2, 3],
"debug_args": "args",
"debugger_path": "mypath",
"container_env_vars": "container-envvar.json",
"docker_volume_basedir": "basedir",
"docker_network": "mynetwork",
"log_file": "logfile",
"layer_cache_basedir": "basedir",
"skip_pull_image": True,
"force_image_build": False,
"shutdown": False,
"invoke_image": ["image"],
}
# NOTE: Because we don't load the full Click BaseCommand here, this is mounted as top-level command
with samconfig_parameters(["start-lambda"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.local.start_lambda.cli import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(
cli,
env={
"SAM_TEMPLATE_FILE": "envtemplate.yaml",
"SAM_SKIP_PULL_IMAGE": "False",
"SAM_FORCE_IMAGE_BUILD": "False",
"SAM_DOCKER_NETWORK": "envnetwork",
# Debug port is exclusively provided through envvars and not thru CLI args
"SAM_DEBUG_PORT": "13579",
"DEBUGGER_ARGS": "envargs",
"SAM_DOCKER_VOLUME_BASEDIR": "envbasedir",
"SAM_LAYER_CACHE_BASEDIR": "envlayercache",
},
args=[
"--host",
"otherhost",
"--port",
9999,
"--env-vars",
"otherenvvar.json",
"--debugger-path",
"otherpath",
"--container-env-vars",
"other-containerenvvar.json",
"--log-file",
"otherlogfile",
# this is a case where cli args takes precedence over both
# config file and envvar
"--force-image-build",
# Parameter overrides is exclusively provided through CLI args and not config
"--parameter-overrides",
"A=123 C=D E=F12! G=H",
],
)
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(
ANY,
"otherhost",
9999,
str(Path(os.getcwd(), "envtemplate.yaml")),
"otherenvvar.json",
(13579,),
"envargs",
"otherpath",
"other-containerenvvar.json",
"envbasedir",
"envnetwork",
"otherlogfile",
"envlayercache",
False,
True,
{"A": "123", "C": "D", "E": "F12!", "G": "H"},
None,
False,
None,
"localhost",
"127.0.0.1",
("image",),
)
@patch("samcli.commands.validate.validate.do_cli")
def test_secondary_option_name_template_validate(self, do_cli_mock):
# "--template" is an alias of "--template-file"
config_values = {"template": "mytemplate.yaml"}
with samconfig_parameters(["validate"], self.scratch_dir, **config_values) as config_path:
from samcli.commands.validate.validate import cli
LOG.debug(Path(config_path).read_text())
runner = CliRunner()
result = runner.invoke(cli, [])
LOG.info(result.output)
LOG.info(result.exception)
if result.exception:
LOG.exception("Command failed", exc_info=result.exc_info)
self.assertIsNone(result.exception)
do_cli_mock.assert_called_with(ANY, str(Path(os.getcwd(), "mytemplate.yaml")))
@contextmanager
def samconfig_parameters(cmd_names, config_dir=None, env=None, **kwargs):
"""
ContextManager to write a new SAM Config and remove the file after the contextmanager exits
Parameters
----------
cmd_names : list(str)
Name of the full command split as a list: ["generate-event", "s3", "put"]
config_dir : str
Path where the SAM config file should be written to. Defaults to os.getcwd()
env : str
Optional name of the config environment. This is currently unused
kwargs : dict
Parameter names and values to be written to the file.
Yields
------
Path to the config file
"""
env = env or DEFAULT_ENV
section = "parameters"
samconfig = SamConfig(config_dir=config_dir)
try:
for k, v in kwargs.items():
samconfig.put(cmd_names, section, k, v, env=env)
samconfig.flush()
yield samconfig.path()
finally:
Path(samconfig.path()).unlink()
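The `samconfig_parameters` helper above follows a common testing pattern: create a throwaway config file, yield its path, and guarantee cleanup in a `finally` block. A dependency-free sketch of the same pattern, without the SAM CLI `SamConfig` class (`write_temp_config` is a hypothetical helper for illustration, not part of SAM CLI):

```python
import os
import tempfile
from contextlib import contextmanager
from pathlib import Path


@contextmanager
def write_temp_config(section, **params):
    # Create a scratch file; close the OS-level handle so we can rewrite it.
    fd, path = tempfile.mkstemp(suffix=".toml")
    os.close(fd)
    # Write a minimal TOML-like section from the keyword arguments.
    lines = ["[%s]" % section] + ['%s = "%s"' % (k, v) for k, v in params.items()]
    Path(path).write_text("\n".join(lines))
    try:
        yield path
    finally:
        # Runs whether the test body passed or raised, mirroring the helper above.
        Path(path).unlink()
```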
# --- pyformance/decorators.py | repo: alpha-health-ai/pyformance | license: Apache-2.0 ---
import functools
import sys
import pyformance.registry as global_registry
def get_qualname(obj):
if sys.version_info[0] > 2:
return obj.__qualname__
return obj.__name__
def count_calls(original_func=None, registry=None, tags=None):
"""
Decorator to track the number of times a function is called.
:param original_func: the function to be decorated
:type original_func: C{func}
:param registry: the registry in which to create the meter
:param tags: tags attached to the timer (e.g. {'region': 'us-west-1'})
:type tags: C{dict}
:return: the decorated function
:rtype: C{func}
"""
def _decorate(fn):
@functools.wraps(fn)
def wrapper(*args, **kwargs):
function_name = get_qualname(fn)
metric_name = "%s_calls" % function_name
_registry = registry or global_registry
_registry.counter(metric_name, tags).inc()
return fn(*args, **kwargs)
return wrapper
if original_func:
return _decorate(original_func)
return _decorate
def meter_calls(original_func=None, registry=None, tags=None):
"""
Decorator to track the rate at which a function is called.
:param original_func: the function to be decorated
:type original_func: C{func}
:param registry: the registry in which to create the meter
:param tags: tags attached to the timer (e.g. {'region': 'us-west-1'})
:type tags: C{dict}
:return: the decorated function
:rtype: C{func}
"""
def _decorate(fn):
@functools.wraps(fn)
def wrapper(*args, **kwargs):
function_name = get_qualname(fn)
metric_name = "%s_calls" % function_name
_registry = registry or global_registry
_registry.meter(metric_name, tags).mark()
return fn(*args, **kwargs)
return wrapper
if original_func:
return _decorate(original_func)
return _decorate
def hist_calls(original_func=None, registry=None, tags=None):
"""
Decorator to check the distribution of return values of a function.
:param original_func: the function to be decorated
:type original_func: C{func}
:param registry: the registry in which to create the histogram
:param tags: tags attached to the timer (e.g. {'region': 'us-west-1'})
:type tags: C{dict}
:return: the decorated function
:rtype: C{func}
"""
def _decorate(fn):
@functools.wraps(fn)
def wrapper(*args, **kwargs):
function_name = get_qualname(fn)
metric_name = "%s_calls" % function_name
_registry = registry or global_registry
_histogram = _registry.histogram(metric_name, tags)
rtn = fn(*args, **kwargs)
if type(rtn) in (int, float):
_histogram.update(rtn)
return rtn
return wrapper
if original_func:
return _decorate(original_func)
return _decorate
def time_calls(original_func=None, registry=None, tags=None):
"""
Decorator to time the execution of the function.
:param original_func: the function to be decorated
:type original_func: C{func}
:param registry: the registry in which to create the timer
:param tags: tags attached to the timer (e.g. {'region': 'us-west-1'})
:type tags: C{dict}
:return: the decorated function
:rtype: C{func}
"""
def _decorate(fn):
@functools.wraps(fn)
def wrapper(*args, **kwargs):
function_name = get_qualname(fn)
metric_name = "%s_calls" % function_name
_registry = registry or global_registry
_timer = _registry.timer(metric_name, tags)
with _timer.time(fn=function_name):
return fn(*args, **kwargs)
return wrapper
if original_func:
return _decorate(original_func)
return _decorate
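All four decorators above share one shape: derive a metric name from the function's name, fetch an instrument from a registry, record, then delegate to the wrapped function. A minimal dependency-free sketch of that shape, with a plain dict standing in for the pyformance registry and `fn.__name__` as a simplification of `get_qualname` (`SIMPLE_COUNTS` and `simple_count_calls` are illustrative names, not part of the pyformance API):

```python
import functools

# Illustrative stand-in for a metrics registry: metric name -> call count.
SIMPLE_COUNTS = {}


def simple_count_calls(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        # Same naming convention as count_calls above: "<name>_calls".
        metric_name = "%s_calls" % fn.__name__
        SIMPLE_COUNTS[metric_name] = SIMPLE_COUNTS.get(metric_name, 0) + 1
        return fn(*args, **kwargs)

    return wrapper
```

Because of `functools.wraps`, the wrapper keeps the decorated function's name and docstring, which is what makes the metric-name lookup stable.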
# --- HackerRank/Python/py-set-difference-operation.py | repo: object-oriented-human/competitive | license: MIT ---
a, b = set(), set()
n = int(input())
for x in set(map(int, input().split())):
a.add(x)
n = int(input())
for x in set(map(int, input().split())):
b.add(x)
print(len(a - b))
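The snippet above counts the elements present only in the first set; the `-` operator is shorthand for `set.difference`. A quick illustration with sample values (illustrative, not HackerRank input):

```python
# "-" and set.difference compute the same thing: elements in a but not in b.
a = {1, 5, 9}
b = {5, 7}
assert a - b == a.difference(b) == {1, 9}
print(len(a - b))  # -> 2
```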
# --- pycoinmon/main.py | repo: RDCH106/pycoinmon | license: MIT ---
# -*- coding: utf-8 -*-
import pycoinmon.core
def main():
pycoinmon.core.PyCoinmon().run()
1349777ebbcf97dab396d7b6f77d17f0419d0903 | 12,695 | py | Python | openqtsim/queue.py | TUDelft-CITG/OpenQTSim | 0a77b1a780ddb9b77dd2b1db8369b72d13b7fbdf | [
"MIT"
] | 6 | 2020-06-18T03:40:56.000Z | 2021-04-08T11:13:21.000Z | openqtsim/queue.py | TUDelft-CITG/OpenQTSim | 0a77b1a780ddb9b77dd2b1db8369b72d13b7fbdf | [
"MIT"
] | 1 | 2020-11-12T14:16:02.000Z | 2020-11-13T10:29:06.000Z | openqtsim/queue.py | TUDelft-CITG/OpenQTSim | 0a77b1a780ddb9b77dd2b1db8369b72d13b7fbdf | [
"MIT"
] | 1 | 2021-09-29T14:01:28.000Z | 2021-09-29T14:01:28.000Z | import numpy as np
import pandas as pd
from openqtsim.customer import Customer
from openqtsim.arrival_process import ArrivalProcess
from openqtsim.service_process import ServiceProcess
class Queue:
"""
Queueing class based on Kendall's notation, in which:
- A is the arrival process
- S is the service time distribution
- c is the number of servers
- K is the number of places in the system
- N is the calling population
- D is the queue discipline
"""
def __init__(self, A=ArrivalProcess(), S=ServiceProcess(), c=1, K=np.inf, N=np.inf, D="FIFO"):
"""
The six inputs correspond to the fields of the Kendall notation. With no
arguments the constructor returns an M/M/1 queue by default
"""
self.A = A
self.S = S
self.c = c
self.K = K
self.N = N
self.D = D
def populate(self, Env, Sim):
"""
Generate customers according to the distribution of the arrival process
to populate the queue, until the maximum number of arrivals is reached
"""
# Simulation stops either when max arrivals (max_arr) is reached or the tolerance limits are achieved
while Sim.customer_nr < Sim.max_arr:
# Draw IAT from distribution, move time forward and register arrival time (AT)
IAT = Sim.queue.A.get_IAT(Sim.customer_nr)
yield Env.timeout(IAT)
# determine AT
AT = Env.now - Env.epoch
# Create a customer
customer_new = Customer(Env, Sim) # init: +1 for the next customer
# Make the customer go through the system
Env.process(customer_new.move(IAT, AT))
@property
def kendall_notation(self):
"""
Return queue name according to the Kendall notation.
"""
return "{}/{}/{}/{}/{}/{}".format(
self.A.symbol, self.S.symbol, str(self.c), str(self.K), str(self.N), self.D
)
def occupancy_to_waitingfactor(self, utilisation=.3, nr_of_servers_to_chk=4, poly_order=6):
"""
Convert an occupancy (utilisation) into a waiting-time factor for M/M/n, E2/E2/n or M/E2/n queues, using a polynomial regression (6th order by default) through tabulated data from Groenveld (2007)
"""
kendall = "{}/{}/{}".format(self.A.symbol, self.S.symbol, str(self.c))
if kendall[0:4] == 'M/M/':
# Create dataframe with data from Groenveld (2007) - Table I (M/M/n)
# See also PIANC 2014 Table 6.2
utilisations = np.array([.1, .2, .3, .4, .5, .6, .7, .8, .9])
nr_of_servers = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
data = np.array([
[0.1111, 0.0101, 0.0014, 0.0002, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000],
[0.2500, 0.0417, 0.0103, 0.0030, 0.0010, 0.0003, 0.0001, 0.0000, 0.0000, 0.0000],
[0.4286, 0.0989, 0.0333, 0.0132, 0.0058, 0.0027, 0.0013, 0.0006, 0.0003, 0.0002],
[0.6667, 0.1905, 0.0784, 0.0378, 0.0199, 0.0111, 0.0064, 0.0039, 0.0024, 0.0015],
[1.0000, 0.3333, 0.1579, 0.0870, 0.0521, 0.0330, 0.0218, 0.0148, 0.0102, 0.0072],
[1.5000, 0.5625, 0.2956, 0.1794, 0.1181, 0.0819, 0.0589, 0.0436, 0.0330, 0.0253],
[2.3333, 0.9608, 0.5470, 0.3572, 0.2519, 0.1867, 0.1432, 0.1128, 0.0906, 0.0739],
[4.0000, 1.7778, 1.0787, 0.7455, 0.5541, 0.4315, 0.3471, 0.2860, 0.2401, 0.2046],
[9.0000, 4.2632, 2.7235, 1.9693, 1.5250, 1.2335, 1.0285, 0.8769, 0.7606, 0.6687]])
elif kendall[0:6] == 'E2/E2/':
# Create dataframe with data from Groenveld (2007) - Table V (E2/E2/n)
# See also PIANC 2014 Table 6.2
utilisations = np.array([.1, .2, .3, .4, .5, .6, .7, .8, .9])
nr_of_servers = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
data = np.array([
[0.0166, 0.0006, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000],
[0.0604, 0.0065, 0.0011, 0.0002, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000],
[0.1310, 0.0235, 0.0062, 0.0019, 0.0007, 0.0002, 0.0001, 0.0000, 0.0000, 0.0000],
[0.2355, 0.0576, 0.0205, 0.0085, 0.0039, 0.0019, 0.0009, 0.0005, 0.0003, 0.0001],
[0.3904, 0.1181, 0.0512, 0.0532, 0.0142, 0.0082, 0.0050, 0.0031, 0.0020, 0.0013],
[0.6306, 0.2222, 0.1103, 0.0639, 0.0400, 0.0265, 0.0182, 0.0128, 0.0093, 0.0069],
[1.0391, 0.4125, 0.2275, 0.1441, 0.0988, 0.0712, 0.0532, 0.0407, 0.0319, 0.0258],
[1.8653, 0.8300, 0.4600, 0.3300, 0.2300, 0.1900, 0.1400, 0.1200, 0.0900, 0.0900],
[4.3590, 2.0000, 1.2000, 0.9200, 0.6500, 0.5700, 0.4400, 0.4000, 0.3200, 0.3000]
])
elif kendall[0:5] == 'M/E2/':
# Create dataframe with data from Groenveld (2007) - Table IV (M/E2/n)
# See also PIANC 2014 Table 6.1
utilisations = np.array([.1, .15, .2, .25, .3, .35, .4, .45, .5, .55, .6, .65, .7, .75, .8, .85, .9])
nr_of_servers = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14])
data = np.array([
[0.08, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.13, 0.02, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.19, 0.03, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.25, 0.05, 0.02, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.32, 0.08, 0.03, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.40, 0.11, 0.04, 0.02, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.50, 0.15, 0.06, 0.03, 0.02, 0.01, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.60, 0.20, 0.08, 0.05, 0.03, 0.02, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.75, 0.26, 0.12, 0.07, 0.04, 0.03, 0.02, 0.01, 0.01, 0.01, 0.00, 0.00, 0.00, 0.00],
[0.91, 0.33, 0.16, 0.10, 0.06, 0.04, 0.03, 0.02, 0.02, 0.01, 0.01, 0.01, 0.00, 0.00],
[1.13, 0.43, 0.23, 0.14, 0.09, 0.06, 0.05, 0.03, 0.03, 0.02, 0.02, 0.01, 0.01, 0.01],
[1.38, 0.55, 0.30, 0.19, 0.12, 0.09, 0.07, 0.05, 0.04, 0.03, 0.03, 0.02, 0.02, 0.02],
[1.75, 0.73, 0.42, 0.27, 0.19, 0.14, 0.11, 0.09, 0.07, 0.06, 0.05, 0.04, 0.03, 0.03],
[2.22, 0.96, 0.59, 0.39, 0.28, 0.21, 0.17, 0.14, 0.12, 0.10, 0.08, 0.07, 0.06, 0.05],
[3.00, 1.34, 0.82, 0.57, 0.42, 0.33, 0.27, 0.22, 0.18, 0.16, 0.13, 0.11, 0.10, 0.09],
[4.50, 2.00, 1.34, 0.90, 0.70, 0.54, 0.46, 0.39, 0.34, 0.30, 0.26, 0.23, 0.20, 0.18],
[6.75, 3.14, 2.01, 1.45, 1.12, 0.91, 0.76, 0.65, 0.56, 0.50, 0.45, 0.40, 0.36, 0.33]
])
df = pd.DataFrame(data, index=utilisations, columns=nr_of_servers)
# Fit a polynomial of order poly_order through the data for nr_of_servers_to_chk
target = df.loc[:, nr_of_servers_to_chk]
p_p = np.polyfit(target.index, target.values, poly_order)
waiting_factor = np.polyval(p_p, utilisation)
# todo: when the nr of servers > 10 the waiting factor should be set to inf (more equipment is definitely needed)
# Return waiting factor
return waiting_factor
def waitingfactor_to_occupancy(self, factor=.3, nr_of_servers_to_chk=4, poly_order=6):
"""
Convert a waiting-time factor back into an occupancy (utilisation) for M/M/n, E2/E2/n or M/E2/n queues, using a polynomial regression (6th order by default) through tabulated data from Groenveld (2007)
"""
kendall = "{}/{}/{}".format(self.A.symbol, self.S.symbol, str(self.c))
if kendall[0:4] == 'M/M/':
# Create dataframe with data from Groenveld (2007) - Table I (M/M/n)
# See also PIANC 2014 Table 6.2
utilisations = np.array([.1, .2, .3, .4, .5, .6, .7, .8, .9])
nr_of_servers = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
data = np.array([
[0.1111, 0.0101, 0.0014, 0.0002, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000],
[0.2500, 0.0417, 0.0103, 0.0030, 0.0010, 0.0003, 0.0001, 0.0000, 0.0000, 0.0000],
[0.4286, 0.0989, 0.0333, 0.0132, 0.0058, 0.0027, 0.0013, 0.0006, 0.0003, 0.0002],
[0.6667, 0.1905, 0.0784, 0.0378, 0.0199, 0.0111, 0.0064, 0.0039, 0.0024, 0.0015],
[1.0000, 0.3333, 0.1579, 0.0870, 0.0521, 0.0330, 0.0218, 0.0148, 0.0102, 0.0072],
[1.5000, 0.5625, 0.2956, 0.1794, 0.1181, 0.0819, 0.0589, 0.0436, 0.0330, 0.0253],
[2.3333, 0.9608, 0.5470, 0.3572, 0.2519, 0.1867, 0.1432, 0.1128, 0.0906, 0.0739],
[4.0000, 1.7778, 1.0787, 0.7455, 0.5541, 0.4315, 0.3471, 0.2860, 0.2401, 0.2046],
[9.0000, 4.2632, 2.7235, 1.9693, 1.5250, 1.2335, 1.0285, 0.8769, 0.7606, 0.6687]])
elif kendall[0:6] == 'E2/E2/':
# Create dataframe with data from Groenveld (2007) - Table V (E2/E2/n)
# See also PIANC 2014 Table 6.2
utilisations = np.array([.1, .2, .3, .4, .5, .6, .7, .8, .9])
nr_of_servers = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
data = np.array([
[0.0166, 0.0006, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000],
[0.0604, 0.0065, 0.0011, 0.0002, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000],
[0.1310, 0.0235, 0.0062, 0.0019, 0.0007, 0.0002, 0.0001, 0.0000, 0.0000, 0.0000],
[0.2355, 0.0576, 0.0205, 0.0085, 0.0039, 0.0019, 0.0009, 0.0005, 0.0003, 0.0001],
[0.3904, 0.1181, 0.0512, 0.0532, 0.0142, 0.0082, 0.0050, 0.0031, 0.0020, 0.0013],
[0.6306, 0.2222, 0.1103, 0.0639, 0.0400, 0.0265, 0.0182, 0.0128, 0.0093, 0.0069],
[1.0391, 0.4125, 0.2275, 0.1441, 0.0988, 0.0712, 0.0532, 0.0407, 0.0319, 0.0258],
[1.8653, 0.8300, 0.4600, 0.3300, 0.2300, 0.1900, 0.1400, 0.1200, 0.0900, 0.0900],
[4.3590, 2.0000, 1.2000, 0.9200, 0.6500, 0.5700, 0.4400, 0.4000, 0.3200, 0.3000]
])
elif kendall[0:5] == 'M/E2/':
# Create dataframe with data from Groenveld (2007) - Table IV (M/E2/n)
# See also PIANC 2014 Table 6.1
utilisations = np.array([.1, .15, .2, .25, .3, .35, .4, .45, .5, .55, .6, .65, .7, .75, .8, .85, .9])
nr_of_servers = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14])
data = np.array([
[0.08, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.13, 0.02, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.19, 0.03, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.25, 0.05, 0.02, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.32, 0.08, 0.03, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.40, 0.11, 0.04, 0.02, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.50, 0.15, 0.06, 0.03, 0.02, 0.01, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.60, 0.20, 0.08, 0.05, 0.03, 0.02, 0.01, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
[0.75, 0.26, 0.12, 0.07, 0.04, 0.03, 0.02, 0.01, 0.01, 0.01, 0.00, 0.00, 0.00, 0.00],
[0.91, 0.33, 0.16, 0.10, 0.06, 0.04, 0.03, 0.02, 0.02, 0.01, 0.01, 0.01, 0.00, 0.00],
[1.13, 0.43, 0.23, 0.14, 0.09, 0.06, 0.05, 0.03, 0.03, 0.02, 0.02, 0.01, 0.01, 0.01],
[1.38, 0.55, 0.30, 0.19, 0.12, 0.09, 0.07, 0.05, 0.04, 0.03, 0.03, 0.02, 0.02, 0.02],
[1.75, 0.73, 0.42, 0.27, 0.19, 0.14, 0.11, 0.09, 0.07, 0.06, 0.05, 0.04, 0.03, 0.03],
[2.22, 0.96, 0.59, 0.39, 0.28, 0.21, 0.17, 0.14, 0.12, 0.10, 0.08, 0.07, 0.06, 0.05],
[3.00, 1.34, 0.82, 0.57, 0.42, 0.33, 0.27, 0.22, 0.18, 0.16, 0.13, 0.11, 0.10, 0.09],
[4.50, 2.00, 1.34, 0.90, 0.70, 0.54, 0.46, 0.39, 0.34, 0.30, 0.26, 0.23, 0.20, 0.18],
[6.75, 3.14, 2.01, 1.45, 1.12, 0.91, 0.76, 0.65, 0.56, 0.50, 0.45, 0.40, 0.36, 0.33]
])
df = pd.DataFrame(data, index=utilisations, columns=nr_of_servers)
# Fit a polynomial of order poly_order through the data for nr_of_servers_to_chk
target = df.loc[:, nr_of_servers_to_chk]
p_p = np.polyfit(target.values, target.index, poly_order)
occupancy = np.polyval(p_p, factor)
# Return occupancy
return occupancy
| 58.233945 | 120 | 0.508074 | 2,515 | 12,695 | 2.53837 | 0.143936 | 0.079887 | 0.105263 | 0.140977 | 0.760965 | 0.760965 | 0.760965 | 0.760965 | 0.760965 | 0.760965 | 0 | 0.388611 | 0.290351 | 12,695 | 217 | 121 | 58.502304 | 0.320013 | 0.152737 | 0 | 0.744526 | 0 | 0 | 0.006532 | 0 | 0 | 0 | 0 | 0.004608 | 0 | 1 | 0.036496 | false | 0 | 0.036496 | 0 | 0.10219 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
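The polynomial-regression step in `occupancy_to_waitingfactor` can be sanity-checked against the closed-form M/M/1 waiting factor, rho / (1 - rho). A minimal sketch, assuming only numpy is available; the data points are copied from the n = 1 column of the Groenveld (2007) M/M/n table above:

```python
import numpy as np

# Waiting factors for a single server (n = 1) from the M/M/n table;
# the closed form for M/M/1 is rho / (1 - rho).
utilisations = np.array([.1, .2, .3, .4, .5, .6, .7, .8, .9])
factors = np.array([0.1111, 0.2500, 0.4286, 0.6667, 1.0000,
                    1.5000, 2.3333, 4.0000, 9.0000])

# 6th-order least-squares fit, mirroring the default poly_order in Queue.
coeffs = np.polyfit(utilisations, factors, 6)
approx = np.polyval(coeffs, 0.5)
print(approx)  # close to the exact value 0.5 / (1 - 0.5) = 1.0
```

With 9 data points and 7 coefficients the fit is nearly interpolatory, so evaluating at a tabulated utilisation recovers the table value to within a small residual.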
136951f0468c7f54b6dbca47fde0bffbe8c68b00 | 101,757 | py | Python | tests/test_tiling.py | odinn13/Tilings | c526ab0fb8a66c4497c644b5f5e8a6c51310b157 | [
"BSD-3-Clause"
] | null | null | null | tests/test_tiling.py | odinn13/Tilings | c526ab0fb8a66c4497c644b5f5e8a6c51310b157 | [
"BSD-3-Clause"
] | null | null | null | tests/test_tiling.py | odinn13/Tilings | c526ab0fb8a66c4497c644b5f5e8a6c51310b157 | [
"BSD-3-Clause"
] | null | null | null | import json
from itertools import chain, product
import pytest
import sympy
from permuta import Perm
from tilings import GriddedPerm, Tiling
from tilings.exception import InvalidOperationError
@pytest.fixture
def compresstil():
"""Returns a tiling that has both obstructions and requirements. For
testing compression and json."""
return Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((2, 1),)),
GriddedPerm(Perm((0, 1)), ((1, 1), (1, 1))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((1, 1), (1, 1))),
GriddedPerm(Perm((1, 0)), ((1, 1), (2, 0))),
GriddedPerm(Perm((1, 0)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 1))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 1), (0, 1))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 1), (1, 1))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((1, 0, 2)), ((0, 1), (0, 0), (1, 1))),
GriddedPerm(Perm((2, 0, 1)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 1), (0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 1), (0, 1), (0, 1), (1, 1))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 1), (0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 1), (0, 1), (0, 1), (1, 1))),
GriddedPerm(Perm((0, 2, 3, 1)), ((0, 1), (0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((0, 2, 3, 1)), ((0, 1), (0, 1), (0, 1), (1, 1))),
GriddedPerm(Perm((2, 0, 1, 3)), ((0, 1), (0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((2, 0, 1, 3)), ((0, 1), (0, 1), (0, 1), (1, 1))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((1, 1),)), GriddedPerm(Perm((0,)), ((2, 0),))),
(GriddedPerm(Perm((1, 0, 2)), ((0, 0), (0, 0), (0, 0))),),
),
)
@pytest.fixture
def empty_tiling():
return Tiling(
obstructions=(
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 0),)),),
(GriddedPerm(Perm((0,)), ((0, 1),)),),
),
)
@pytest.fixture
def finite_tiling():
return Tiling(
obstructions=(
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((2, 1, 0)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((2, 1, 0)), ((0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((3, 2, 1, 0)), ((0, 1), (0, 1), (0, 0), (0, 0))),
),
requirements=((GriddedPerm(Perm((0,)), ((0, 0),)),),),
)
@pytest.fixture
def factorable_tiling():
return Tiling(
obstructions=[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 2, 1)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((2, 1, 0)), ((2, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((2, 0, 1)), ((2, 3), (2, 3), (2, 3))),
GriddedPerm(Perm((1, 0, 2)), ((5, 4), (5, 4), (5, 4))),
GriddedPerm(Perm((2, 0, 1)), ((5, 4), (5, 4), (5, 4))),
GriddedPerm(Perm((1, 2, 0)), ((4, 6), (4, 6), (4, 6))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (2, 2))),
GriddedPerm(Perm((0, 1, 2, 3)), ((2, 2), (2, 2), (2, 3), (2, 3))),
GriddedPerm(Perm((0, 1)), ((6, 4), (6, 4))),
GriddedPerm(Perm((1, 0)), ((6, 4), (6, 4))),
GriddedPerm(Perm((0, 1)), ((7, 7), (7, 7))),
],
requirements=[
[
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((4, 6), (4, 6))),
],
[GriddedPerm(Perm((0,)), ((6, 4),))],
],
)
@pytest.fixture
def obs_inf_til():
return Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 1), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (0, 2))),
],
requirements=[[GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0)))]],
)
@pytest.fixture
def typical_redundant_obstructions():
"""Returns a very typical list of obstructions clustered together in a
corner of a tiling. """
return [
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((3, 1), (3, 1))),
GriddedPerm(Perm((1, 0)), ((3, 0), (3, 0))),
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 0))),
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 1))),
GriddedPerm(Perm((0, 1, 2)), ((3, 0), (3, 0), (3, 0))),
GriddedPerm(Perm((0, 1, 2)), ((3, 0), (3, 0), (3, 1))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (1, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (1, 0), (3, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (2, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (2, 0), (3, 0))),
GriddedPerm(Perm((2, 1, 0)), ((2, 0), (2, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 0)), ((2, 0), (2, 0), (3, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((1, 1), (2, 0), (2, 0), (2, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((2, 1), (2, 1), (3, 0), (3, 0))),
]
@pytest.fixture
def typical_redundant_requirements():
"""Returns a very typical list of requirements of a tiling. """
return [
[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (2, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (2, 4))),
GriddedPerm(Perm((1, 0, 2)), ((0, 0), (1, 0), (2, 3))),
GriddedPerm(Perm((1, 0, 2)), ((0, 1), (1, 0), (2, 3))),
],
[
GriddedPerm(Perm((0, 1, 2)), ((2, 3), (2, 3), (2, 3))),
GriddedPerm(Perm((1, 0, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 1, 2)), ((1, 0), (1, 0), (1, 0))),
],
[
GriddedPerm(Perm((0, 1)), ((1, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (3, 1))),
],
[
GriddedPerm(Perm((1, 0)), ((3, 3), (3, 1))),
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 1))),
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 0))),
],
]
@pytest.mark.filterwarnings("ignore::UserWarning")
def test_constructor_no_requirements(typical_redundant_obstructions):
"""Tests the constructor of Tiling, thereby the minimization methods used
in the constructor with different options for remove_empty_rows_and_cols and
derive_empty. Proper update of the dimensions of the tiling and proper
computation of empty and active cells.
Tests without any requirements.
"""
tiling = Tiling(
obstructions=typical_redundant_obstructions,
remove_empty_rows_and_cols=False,
derive_empty=False,
simplify=False,
)
assert len(tiling._obstructions) == 20
assert len(tiling._requirements) == 0
(i, j) = tiling.dimensions
assert i == 4
assert j == 2
tiling = Tiling(
obstructions=typical_redundant_obstructions,
remove_empty_rows_and_cols=False,
derive_empty=False,
simplify=True,
)
assert len(tiling._obstructions) == 18
assert len(tiling._requirements) == 0
(i, j) = tiling.dimensions
assert i == 4
assert j == 2
tiling = Tiling(
obstructions=typical_redundant_obstructions,
remove_empty_rows_and_cols=False,
derive_empty=True,
simplify=False,
)
assert len(tiling._obstructions) == 22
assert len(tiling._requirements) == 0
(i, j) = tiling.dimensions
assert i == 4
assert j == 2
assert tiling.empty_cells == {(0, 0), (0, 1)}
assert tiling.active_cells == {(1, 0), (1, 1), (2, 0), (2, 1), (3, 0), (3, 1)}
tiling = Tiling(
obstructions=typical_redundant_obstructions,
remove_empty_rows_and_cols=False,
derive_empty=True,
simplify=True,
)
assert len(tiling._obstructions) == 22
assert len(tiling._requirements) == 0
(i, j) = tiling.dimensions
assert i == 4
assert j == 2
assert tiling.empty_cells == {(0, 0), (0, 1), (1, 1), (2, 1)}
assert tiling.active_cells == {(1, 0), (2, 0), (3, 0), (3, 1)}
tiling = Tiling(
obstructions=typical_redundant_obstructions,
remove_empty_rows_and_cols=True,
derive_empty=True,
simplify=False,
)
(i, j) = tiling.dimensions
assert i == 3
assert j == 2
assert tiling.empty_cells == set()
assert tiling.active_cells == {(0, 0), (0, 1), (1, 0), (1, 1), (2, 0), (2, 1)}
assert len(tiling._obstructions) == 20
assert len(tiling._requirements) == 0
tiling = Tiling(
obstructions=typical_redundant_obstructions,
remove_empty_rows_and_cols=True,
derive_empty=True,
simplify=True,
)
(i, j) = tiling.dimensions
assert i == 3
assert j == 2
assert tiling.empty_cells == {(0, 1), (1, 1)}
assert tiling.active_cells == {(0, 0), (1, 0), (2, 0), (2, 1)}
assert len(tiling._obstructions) == 20
assert len(tiling._requirements) == 0
tiling2 = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((0, 1)), ((0, 0), (1, 0))),
GriddedPerm(Perm((0, 1)), ((0, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((2, 1), (2, 1))),
GriddedPerm(Perm((1, 0)), ((2, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((2, 1), (2, 0))),
GriddedPerm(Perm((1, 0)), ((2, 1), (2, 1))),
GriddedPerm(Perm((0, 1, 2)), ((2, 0), (2, 0), (2, 0))),
GriddedPerm(Perm((0, 1, 2)), ((2, 0), (2, 0), (2, 1))),
GriddedPerm(Perm((2, 1, 0)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((2, 1, 0)), ((0, 0), (0, 0), (1, 0))),
GriddedPerm(Perm((2, 1, 0)), ((0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 0)), ((0, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((2, 1, 0)), ((0, 0), (1, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (1, 0), (2, 0))),
],
remove_empty_rows_and_cols=True,
derive_empty=True,
simplify=True,
)
assert tiling == tiling2
def test_constructor_with_requirements(
typical_redundant_obstructions, typical_redundant_requirements
):
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
remove_empty_rows_and_cols=False,
derive_empty=False,
)
assert len(tiling._obstructions) == 18
assert len(tiling._requirements) == 4
(i, j) = tiling.dimensions
assert i == 4
assert j == 5
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
remove_empty_rows_and_cols=False,
derive_empty=True,
)
assert len(tiling._obstructions) == 29
assert len(tiling._requirements) == 4
(i, j) = tiling.dimensions
assert i == 4
assert j == 5
assert tiling.empty_cells == {
(0, 2),
(0, 3),
(0, 4),
(1, 1),
(1, 2),
(1, 3),
(1, 4),
(2, 1),
(2, 2),
(3, 2),
(3, 4),
}
assert tiling.active_cells == {
(0, 0),
(0, 1),
(1, 0),
(2, 0),
(2, 3),
(2, 4),
(3, 0),
(3, 1),
(3, 3),
}
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
remove_empty_rows_and_cols=True,
derive_empty=True,
)
(i, j) = tiling.dimensions
assert i == 4
assert j == 4
assert tiling.empty_cells == {
(0, 2),
(0, 3),
(1, 1),
(1, 2),
(1, 3),
(2, 1),
(3, 3),
}
assert tiling.active_cells == {
(0, 0),
(0, 1),
(1, 0),
(2, 0),
(2, 2),
(2, 3),
(3, 0),
(3, 1),
(3, 2),
}
assert len(tiling._obstructions) == 25
assert len(tiling._requirements) == 4
tiling2 = Tiling(
obstructions=typical_redundant_obstructions,
requirements=[
[GriddedPerm(Perm((0, 1)), [(2, 0), (3, 1)])],
[GriddedPerm(Perm((1, 0)), [(3, 2), (3, 1)])],
[
GriddedPerm(Perm((0, 1, 2)), [(0, 0), (1, 0), (2, 2)]),
GriddedPerm(Perm((0, 1, 2)), [(0, 0), (1, 0), (2, 3)]),
GriddedPerm(Perm((1, 0, 2)), [(0, 0), (1, 0), (2, 2)]),
GriddedPerm(Perm((1, 0, 2)), [(0, 1), (1, 0), (2, 2)]),
],
[
GriddedPerm(Perm((0, 1, 2)), [(2, 2), (2, 2), (2, 2)]),
GriddedPerm(Perm((1, 0, 2)), [(0, 0), (0, 0), (0, 0)]),
],
],
remove_empty_rows_and_cols=True,
derive_empty=True,
)
assert tiling == tiling2
@pytest.mark.filterwarnings("ignore::UserWarning")
def test_bytes_noreq(typical_redundant_obstructions):
tiling = Tiling(
obstructions=typical_redundant_obstructions,
remove_empty_rows_and_cols=False,
derive_empty=False,
)
assert tiling == Tiling.from_bytes(tiling.to_bytes())
tiling = Tiling(
obstructions=typical_redundant_obstructions,
remove_empty_rows_and_cols=False,
derive_empty=True,
)
assert tiling == Tiling.from_bytes(tiling.to_bytes())
tiling = Tiling(
obstructions=typical_redundant_obstructions,
remove_empty_rows_and_cols=True,
derive_empty=True,
)
assert tiling == Tiling.from_bytes(tiling.to_bytes())
def test_from_string():
string = "123_231_45321"
assert Tiling.from_string(string) == Tiling(
[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((1, 2, 0)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(
Perm((3, 4, 2, 1, 0)), ((0, 0), (0, 0), (0, 0), (0, 0), (0, 0))
),
]
)
string = "3201_1032"
assert Tiling.from_string(string) == Tiling(
[
GriddedPerm(Perm((3, 2, 0, 1)), ((0, 0), (0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 0), (0, 0), (0, 0), (0, 0))),
]
)
string = "3142"
assert Tiling.from_string(string) == Tiling(
[GriddedPerm(Perm((2, 0, 3, 1)), ((0, 0), (0, 0), (0, 0), (0, 0)))]
)
def test_from_perms():
t = Tiling.from_perms(
[Perm((0, 1, 2)), Perm((0, 2, 1, 3))], [[Perm((0, 1)), Perm((1, 0))]]
)
assert t == Tiling(
obstructions=[
GriddedPerm(Perm((0, 1, 2)), ((0, 0),) * 3),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 0),) * 4),
],
requirements=[
[
GriddedPerm(Perm((0, 1)), ((0, 0),) * 2),
GriddedPerm(Perm((1, 0)), ((0, 0),) * 2),
]
],
)
def test_bytes(compresstil):
assert compresstil == Tiling.from_bytes(compresstil.to_bytes())
assert compresstil.to_bytes() == compresstil.to_bytes()
assert (
compresstil.to_bytes()
== Tiling(compresstil.obstructions, compresstil.requirements).to_bytes()
)
def test_json(compresstil):
assert compresstil == Tiling.from_json(json.dumps(compresstil.to_jsonable()))
# For backward compatibility, make sure we can load from JSON that doesn't
# have the assumptions field
d = compresstil.to_jsonable()
d.pop("assumptions")
assert compresstil == Tiling.from_json(json.dumps(d))
def test_cell_within_bounds(
typical_redundant_obstructions, typical_redundant_requirements
):
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
remove_empty_rows_and_cols=False,
derive_empty=False,
)
for i in range(4):
for j in range(5):
assert tiling.cell_within_bounds((i, j))
for i in chain(range(-10, 0), range(5, 10)):
for j in range(-10, 10):
assert not tiling.cell_within_bounds((i, j))
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
remove_empty_rows_and_cols=True,
derive_empty=True,
)
for i in range(4):
for j in range(4):
assert tiling.cell_within_bounds((i, j))
for i in chain(range(-10, 0), range(4, 10)):
for j in range(-10, 10):
assert not tiling.cell_within_bounds((i, j))
def test_empty_cell(typical_redundant_obstructions, typical_redundant_requirements):
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
)
tiling1 = tiling.empty_cell((3, 0))
tiling2 = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((3, 1), (3, 1))),
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 1))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (1, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (2, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 0)), ((2, 0), (2, 0), (2, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((1, 1), (2, 0), (2, 0), (2, 0))),
],
requirements=[
[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (2, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (2, 4))),
GriddedPerm(Perm((1, 0, 2)), ((0, 0), (1, 0), (2, 3))),
GriddedPerm(Perm((1, 0, 2)), ((0, 1), (1, 0), (2, 3))),
],
[
GriddedPerm(Perm((0, 1, 2)), ((2, 3), (2, 3), (2, 3))),
GriddedPerm(Perm((1, 0, 2)), ((0, 0), (0, 0), (0, 0))),
],
[
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (3, 1))),
],
[GriddedPerm(Perm((1, 0)), ((3, 3), (3, 1)))],
],
)
assert tiling1 == tiling2
def test_insert_cell(typical_redundant_obstructions, typical_redundant_requirements):
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
)
assert tiling.insert_cell((3, 1)) == tiling
assert tiling.insert_cell((1, 1)).obstructions[0] == GriddedPerm(Perm(tuple()), [])
requirements = typical_redundant_requirements + [
[
GriddedPerm(Perm((0, 1)), [(2, 0), (2, 1)]),
GriddedPerm(Perm((0, 1)), [(1, 1), (1, 2)]),
]
]
tiling = Tiling(
obstructions=typical_redundant_obstructions, requirements=requirements
)
tiling1 = tiling.insert_cell((2, 1))
assert tiling1 == Tiling(
obstructions=typical_redundant_obstructions,
requirements=requirements + [[GriddedPerm(Perm((0,)), [(2, 1)])]],
)
def test_add_obstruction(compresstil):
assert compresstil.add_obstruction(Perm((0, 1)), ((0, 0), (0, 1))) == Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((2, 1),)),
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 1))),
GriddedPerm(Perm((0, 1)), ((1, 1), (1, 1))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((1, 1), (1, 1))),
GriddedPerm(Perm((1, 0)), ((1, 1), (2, 0))),
GriddedPerm(Perm((1, 0)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((1, 0, 2)), ((0, 1), (0, 0), (1, 1))),
GriddedPerm(Perm((2, 0, 1)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 1), (0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 1), (0, 1), (0, 1), (1, 1))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 1), (0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 1), (0, 1), (0, 1), (1, 1))),
GriddedPerm(Perm((0, 2, 3, 1)), ((0, 1), (0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((0, 2, 3, 1)), ((0, 1), (0, 1), (0, 1), (1, 1))),
GriddedPerm(Perm((2, 0, 1, 3)), ((0, 1), (0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((2, 0, 1, 3)), ((0, 1), (0, 1), (0, 1), (1, 1))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((1, 1),)), GriddedPerm(Perm((0,)), ((2, 0),))),
(GriddedPerm(Perm((1, 0, 2)), ((0, 0), (0, 0), (0, 0))),),
),
)
assert (
compresstil.add_obstruction(Perm((1, 0, 2)), ((0, 1), (0, 0), (1, 1)))
== compresstil
)
def test_add_list_requirement(finite_tiling):
list_req = [
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
]
assert finite_tiling.add_list_requirement(list_req) == Tiling(
obstructions=(
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((2, 1, 0)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((2, 1, 0)), ((0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((3, 2, 1, 0)), ((0, 1),) * 2 + ((0, 0),) * 2),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 0),)),),
(
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
),
),
)
def test_add_requirement(compresstil, factorable_tiling):
assert compresstil.add_requirement(Perm((1, 0)), ((1, 1), (2, 0))) == Tiling(
obstructions=(GriddedPerm(Perm(), ()),)
)
assert factorable_tiling.add_requirement(Perm((0, 1)), ((0, 0), (5, 3))) == Tiling(
obstructions=[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 2, 1)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((2, 1, 0)), ((2, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((2, 0, 1)), ((2, 3), (2, 3), (2, 3))),
GriddedPerm(Perm((1, 0, 2)), ((5, 4), (5, 4), (5, 4))),
GriddedPerm(Perm((2, 0, 1)), ((5, 4), (5, 4), (5, 4))),
GriddedPerm(Perm((1, 2, 0)), ((4, 6), (4, 6), (4, 6))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (2, 2))),
GriddedPerm(Perm((0, 1, 2, 3)), ((2, 2), (2, 2), (2, 3), (2, 3))),
GriddedPerm(Perm((0, 1)), ((6, 4), (6, 4))),
GriddedPerm(Perm((1, 0)), ((6, 4), (6, 4))),
GriddedPerm(Perm((0, 1)), ((7, 7), (7, 7))),
],
requirements=[
[GriddedPerm(Perm((0, 1)), ((0, 0), (6, 4)))],
[
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((4, 6), (4, 6))),
],
[GriddedPerm(Perm((0,)), ((6, 4),))],
],
)
def test_add_single_cell_obstruction(
typical_redundant_obstructions, typical_redundant_requirements
):
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
)
assert tiling.add_single_cell_obstruction(Perm((0,)), (0, 2)) == tiling
assert tiling.add_single_cell_obstruction(Perm((0, 1, 2)), (3, 0)) == tiling
assert tiling.add_single_cell_obstruction(Perm((2, 1, 0)), (2, 0)) == tiling
tiling1 = Tiling(
requirements=typical_redundant_requirements,
obstructions=[
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((3, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((3, 1), (3, 1))),
GriddedPerm(Perm((1, 0)), ((3, 0), (3, 0))),
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 0))),
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 1))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (1, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (1, 0), (3, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (2, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 0), (2, 0), (3, 0))),
GriddedPerm(Perm((2, 1, 0)), ((2, 0), (2, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 0)), ((2, 0), (2, 0), (3, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((1, 1), (2, 0), (2, 0), (2, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((2, 1), (2, 1), (3, 0), (3, 0))),
],
)
assert tiling.add_single_cell_obstruction(Perm((0, 1)), (3, 0)) == tiling1
def test_add_single_cell_requirement(
typical_redundant_obstructions, typical_redundant_requirements
):
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
)
assert tiling.add_single_cell_requirement(Perm((0, 1)), (1, 0)).obstructions[
0
] == GriddedPerm(Perm(tuple()), [])
tiling1 = Tiling(
obstructions=typical_redundant_obstructions,
requirements=[
[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (2, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (2, 4))),
GriddedPerm(Perm((1, 0, 2)), ((0, 0), (1, 0), (2, 3))),
GriddedPerm(Perm((1, 0, 2)), ((0, 1), (1, 0), (2, 3))),
],
[GriddedPerm(Perm((1, 0, 2)), ((0, 0), (0, 0), (0, 0)))],
[
GriddedPerm(Perm((0, 1)), ((1, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (3, 1))),
],
[
GriddedPerm(Perm((1, 0)), ((3, 3), (3, 1))),
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 1))),
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 0))),
],
],
)
assert tiling.add_single_cell_requirement(Perm((1, 0, 2)), (0, 0)) == tiling1
tiling2 = Tiling(
obstructions=typical_redundant_obstructions,
requirements=(
typical_redundant_requirements
+ [[GriddedPerm.single_cell(Perm((0, 1, 2)), (0, 0))]]
),
)
assert tiling.add_single_cell_requirement(Perm((0, 1, 2)), (0, 0)) == tiling2
@pytest.fixture
def isolated_tiling():
return Tiling(
obstructions=[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (1, 2))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 0), (0, 0), (1, 2), (2, 1))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 0), (0, 0), (0, 0), (1, 2))),
],
requirements=[
[
GriddedPerm(Perm((0,)), ((2, 1),)),
GriddedPerm(Perm((1, 0)), ((1, 2), (1, 2))),
],
[GriddedPerm(Perm((1, 2, 0)), ((1, 2), (1, 2), (2, 1)))],
],
)
def test_fully_isolated(
typical_redundant_obstructions, typical_redundant_requirements, isolated_tiling
):
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
)
assert not tiling.fully_isolated()
assert isolated_tiling.fully_isolated()
def test_only_positive_in_row_and_col(
typical_redundant_obstructions, typical_redundant_requirements
):
tiling = Tiling(requirements=[[GriddedPerm.single_cell(Perm((0,)), (0, 0))]])
assert tiling.only_positive_in_row((0, 0))
assert tiling.only_positive_in_col((0, 0))
assert tiling.only_positive_in_row_and_col((0, 0))
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=(
typical_redundant_requirements
+ [[GriddedPerm.single_cell(Perm((0,)), (1, 4))]]
),
)
assert tiling.only_positive_in_row((1, 3))
assert not tiling.only_positive_in_row((2, 3))
assert not tiling.only_positive_in_col((2, 3))
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=(
typical_redundant_requirements
+ [[GriddedPerm.single_cell(Perm((0,)), (0, 4))]]
),
)
assert tiling.only_positive_in_row((0, 3))
assert tiling.only_positive_in_col((0, 3))
assert tiling.only_positive_in_row_and_col((0, 3))
assert tiling.only_positive_in_row((3, 2))
assert not tiling.only_positive_in_col((3, 2))
assert not tiling.only_positive_in_col((0, 2))
assert not tiling.only_positive_in_row((0, 2))
def test_only_cell_in_row_or_col(
typical_redundant_obstructions, typical_redundant_requirements
):
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
)
assert tiling.only_cell_in_col((1, 0))
assert tiling.only_cell_in_row((2, 3))
assert not tiling.only_cell_in_row((3, 1))
assert not tiling.only_cell_in_col((3, 1))
def test_cells_in_row_col(
typical_redundant_obstructions, typical_redundant_requirements
):
tiling = Tiling(
obstructions=typical_redundant_obstructions,
requirements=typical_redundant_requirements,
)
row_1 = tiling.cells_in_row(-1)
row0 = tiling.cells_in_row(0)
row1 = tiling.cells_in_row(1)
row2 = tiling.cells_in_row(2)
row3 = tiling.cells_in_row(3)
row4 = tiling.cells_in_row(4)
assert row_1 == set()
assert row0 == set((x, 0) for x in range(4))
assert row1 == {(0, 1), (3, 1)}
assert row2 == {(2, 2), (3, 2)}
assert row3 == {(2, 3)}
assert row4 == set()
col_1 = tiling.cells_in_col(-1)
col0 = tiling.cells_in_col(0)
col1 = tiling.cells_in_col(1)
col2 = tiling.cells_in_col(2)
col3 = tiling.cells_in_col(3)
col4 = tiling.cells_in_col(4)
assert col_1 == set()
assert col0 == {(0, 0), (0, 1)}
assert col1 == {(1, 0)}
assert col2 == {(2, 0), (2, 2), (2, 3)}
assert col3 == {(3, 0), (3, 1), (3, 2)}
assert col4 == set()
def test_cell_basis(factorable_tiling):
tiling = Tiling(
obstructions=[
GriddedPerm(Perm((0, 2, 1)), [(0, 0), (0, 0), (0, 0)]),
GriddedPerm(Perm((0, 2, 1)), [(0, 0), (0, 1), (1, 1)]),
GriddedPerm(Perm((0, 2, 1)), [(0, 0), (1, 1), (1, 0)]),
GriddedPerm(Perm((0, 2, 1)), [(1, 1), (1, 1), (1, 1)]),
GriddedPerm(Perm((1, 0)), [(1, 0), (1, 0)]),
GriddedPerm(Perm((2, 0, 1)), [(0, 1), (0, 1), (0, 1)]),
]
)
bdict = tiling.cell_basis()
assert len(bdict) == 4
basis = bdict[(0, 0)]
assert len(basis[1]) == 0
assert set(basis[0]) == {Perm((0, 2, 1))}
basis = bdict[(0, 1)]
assert len(basis[1]) == 0
assert set(basis[0]) == {Perm((2, 0, 1))}
basis = bdict[(1, 0)]
assert len(basis[1]) == 0
assert set(basis[0]) == {Perm((1, 0))}
basis = bdict[(1, 1)]
assert len(basis[1]) == 0
assert set(basis[0]) == {Perm((0, 2, 1))}
# Basis for a non-active cell
bdict = factorable_tiling.cell_basis()
assert bdict[(0, 1)] == ([Perm((0,))], [])
assert bdict[(5, 3)] == ([Perm((0, 1)), Perm((1, 0))], [Perm((0,))])
tiling2 = Tiling([], [[GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 1)))]])
bdict2 = tiling2.cell_basis()
assert bdict2[(0, 0)] == ([], [Perm((0, 1))])
assert bdict2[(0, 1)] == ([], [Perm((0,))])
tiling3 = Tiling(
[],
[
[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 1))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 1), (0, 1))),
]
],
)
bdict3 = tiling3.cell_basis()
assert bdict3[(0, 0)] == ([], [Perm((0,))])
assert bdict3[(0, 1)] == ([], [Perm((0,))])
# Check that all cells have a basis
dim = factorable_tiling.dimensions
for cell in product(range(dim[0]), range(dim[1])):
assert cell in bdict
assert len(bdict[cell][0]) >= 1
assert (dim[0] + 1, dim[1]) not in bdict
def test_cell_graph(factorable_tiling, compresstil, typical_redundant_obstructions):
cell_graph = factorable_tiling.cell_graph()
assert list(sorted(cell_graph)) == [
((0, 0), (1, 0)),
((2, 1), (2, 2)),
((4, 3), (5, 3)),
]
cell_graph = compresstil.cell_graph()
assert list(sorted(cell_graph)) == [
((0, 0), (0, 1)),
((0, 0), (2, 0)),
((0, 1), (1, 1)),
]
tiling = Tiling(typical_redundant_obstructions)
cell_graph = tiling.cell_graph()
assert list(sorted(cell_graph)) == [
((0, 0), (1, 0)),
((1, 0), (2, 0)),
((2, 0), (2, 1)),
]
def test_sort_requirements(typical_redundant_requirements):
assert Tiling.sort_requirements(typical_redundant_requirements) == (
(
GriddedPerm(Perm((0, 1)), ((1, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (3, 1))),
),
(
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 0))),
GriddedPerm(Perm((1, 0)), ((3, 1), (3, 1))),
GriddedPerm(Perm((1, 0)), ((3, 3), (3, 1))),
),
(
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (2, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (2, 4))),
GriddedPerm(Perm((1, 0, 2)), ((0, 0), (1, 0), (2, 3))),
GriddedPerm(Perm((1, 0, 2)), ((0, 1), (1, 0), (2, 3))),
),
(
GriddedPerm(Perm((0, 1, 2)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((0, 1, 2)), ((2, 3), (2, 3), (2, 3))),
GriddedPerm(Perm((1, 0, 2)), ((0, 0), (0, 0), (0, 0))),
),
)
def test_gridded_perms():
tiling = Tiling()
assert len(list(tiling.gridded_perms())) == 1
tiling = Tiling([GriddedPerm(Perm(tuple()), tuple())], [])
assert len(list(tiling.gridded_perms(maxlen=3))) == 0
tiling = Tiling(requirements=[[GriddedPerm(Perm((0,)), [(0, 0)])]])
assert len(list(tiling.gridded_perms(maxlen=3))) == 9
tiling = Tiling(requirements=[[GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)])]])
griddable01 = sorted(list(tiling.gridded_perms(maxlen=3)))
assert griddable01 == sorted(
[
GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)]),
GriddedPerm(Perm((0, 1, 2)), [(0, 0), (0, 0), (1, 0)]),
GriddedPerm(Perm((0, 1, 2)), [(0, 0), (1, 0), (1, 0)]),
GriddedPerm(Perm((0, 2, 1)), [(0, 0), (0, 0), (1, 0)]),
GriddedPerm(Perm((0, 2, 1)), [(0, 0), (1, 0), (1, 0)]),
GriddedPerm(Perm((1, 0, 2)), [(0, 0), (0, 0), (1, 0)]),
GriddedPerm(Perm((1, 0, 2)), [(0, 0), (1, 0), (1, 0)]),
GriddedPerm(Perm((1, 2, 0)), [(0, 0), (1, 0), (1, 0)]),
GriddedPerm(Perm((2, 0, 1)), [(0, 0), (0, 0), (1, 0)]),
]
)
tiling = Tiling(requirements=[[GriddedPerm(Perm((1, 0)), [(0, 0), (1, 0)])]])
griddable10 = sorted(list(tiling.gridded_perms(maxlen=3)))
assert griddable10 == sorted(
[
GriddedPerm(Perm((1, 0)), [(0, 0), (1, 0)]),
GriddedPerm(Perm((0, 2, 1)), [(0, 0), (0, 0), (1, 0)]),
GriddedPerm(Perm((1, 2, 0)), [(0, 0), (0, 0), (1, 0)]),
GriddedPerm(Perm((1, 2, 0)), [(0, 0), (1, 0), (1, 0)]),
GriddedPerm(Perm((2, 1, 0)), [(0, 0), (0, 0), (1, 0)]),
GriddedPerm(Perm((2, 1, 0)), [(0, 0), (1, 0), (1, 0)]),
GriddedPerm(Perm((2, 0, 1)), [(0, 0), (0, 0), (1, 0)]),
GriddedPerm(Perm((2, 0, 1)), [(0, 0), (1, 0), (1, 0)]),
GriddedPerm(Perm((1, 0, 2)), [(0, 0), (1, 0), (1, 0)]),
]
)
tiling = Tiling(requirements=[[GriddedPerm(Perm((0, 1)), [(0, 0), (0, 1)])]])
griddable = sorted(list(tiling.gridded_perms(maxlen=3)))
assert griddable == sorted(
[
GriddedPerm(Perm((0, 1)), [(0, 0), (0, 1)]),
GriddedPerm(Perm((0, 1, 2)), [(0, 0), (0, 0), (0, 1)]),
GriddedPerm(Perm((0, 1, 2)), [(0, 0), (0, 1), (0, 1)]),
GriddedPerm(Perm((0, 2, 1)), [(0, 0), (0, 1), (0, 0)]),
GriddedPerm(Perm((0, 2, 1)), [(0, 0), (0, 1), (0, 1)]),
GriddedPerm(Perm((1, 0, 2)), [(0, 0), (0, 0), (0, 1)]),
GriddedPerm(Perm((1, 0, 2)), [(0, 1), (0, 0), (0, 1)]),
GriddedPerm(Perm((1, 2, 0)), [(0, 0), (0, 1), (0, 0)]),
GriddedPerm(Perm((2, 0, 1)), [(0, 1), (0, 0), (0, 1)]),
]
)
tiling = Tiling(
requirements=[
[GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)])],
[GriddedPerm(Perm((1, 0)), [(0, 0), (1, 0)])],
]
)
griddable = sorted(list(tiling.gridded_perms(maxlen=2)))
assert len(griddable) == 0
griddable = sorted(list(tiling.gridded_perms(maxlen=3)))
assert griddable == sorted(
[
GriddedPerm(Perm((1, 2, 0)), [(0, 0), (1, 0), (1, 0)]),
GriddedPerm(Perm((1, 0, 2)), [(0, 0), (1, 0), (1, 0)]),
GriddedPerm(Perm((0, 2, 1)), [(0, 0), (0, 0), (1, 0)]),
GriddedPerm(Perm((2, 0, 1)), [(0, 0), (0, 0), (1, 0)]),
]
)
tiling = Tiling(
requirements=[
[
GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)]),
GriddedPerm(Perm((1, 0)), [(0, 0), (1, 0)]),
]
]
)
griddable = sorted(list(tiling.gridded_perms(maxlen=3)))
assert griddable == sorted(list(set(griddable01) | set(griddable10)))
tiling = Tiling(
obstructions=[GriddedPerm(Perm((1, 0)), [(0, 0), (1, 0)])],
requirements=[[GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)])]],
)
griddable = sorted(list(tiling.gridded_perms(maxlen=3)))
assert griddable == sorted(list(set(griddable01) - set(griddable10)))
tiling = Tiling(
obstructions=[GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)])],
requirements=[[GriddedPerm(Perm((1, 0)), [(0, 0), (1, 0)])]],
)
griddable = sorted(list(tiling.gridded_perms(maxlen=3)))
assert griddable == sorted(list(set(griddable10) - set(griddable01)))
tiling = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)]),
GriddedPerm(Perm((1, 0)), [(0, 0), (1, 0)]),
],
requirements=[
[GriddedPerm(Perm((0,)), [(0, 0)])],
[GriddedPerm(Perm((0,)), [(1, 0)])],
],
)
assert len(list(tiling.gridded_perms(maxlen=5))) == 0
assert tiling.is_empty()
tiling = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)]),
GriddedPerm(Perm((1, 0)), [(0, 0), (1, 0)]),
]
)
griddable = sorted(list(tiling.gridded_perms(maxlen=2)))
assert griddable == [
GriddedPerm(Perm(), []),
GriddedPerm(Perm((0,)), [(0, 0)]),
GriddedPerm(Perm((0,)), [(1, 0)]),
GriddedPerm(Perm((0, 1)), [(0, 0), (0, 0)]),
GriddedPerm(Perm((0, 1)), [(1, 0), (1, 0)]),
GriddedPerm(Perm((1, 0)), [(0, 0), (0, 0)]),
GriddedPerm(Perm((1, 0)), [(1, 0), (1, 0)]),
]
@pytest.fixture
def christian_til():
return Tiling(
obstructions=[
GriddedPerm(Perm((0, 2, 3, 1)), [(0, 0), (1, 1), (1, 1), (2, 0)]),
GriddedPerm(Perm((0, 1)), [(1, 0), (1, 0)]),
GriddedPerm(Perm((1, 0)), [(1, 0), (1, 0)]),
GriddedPerm(Perm((0, 1)), [(2, 1), (2, 1)]),
GriddedPerm(Perm((1, 0)), [(2, 1), (2, 1)]),
],
requirements=[
[
GriddedPerm(Perm((0, 2, 1)), [(0, 1), (0, 2), (1, 2)]),
GriddedPerm(Perm((1, 0)), [(0, 2), (0, 1)]),
],
[GriddedPerm(Perm((0,)), [(1, 0)])],
[GriddedPerm(Perm((0,)), [(2, 0)])],
[GriddedPerm(Perm((0,)), [(2, 1)])],
],
)
def test_symmetries(christian_til):
rotate90til = Tiling(
obstructions=[
GriddedPerm(Perm((3, 0, 2, 1)), [(0, 2), (0, 0), (1, 1), (1, 1)]),
GriddedPerm(Perm((0, 1)), [(0, 1), (0, 1)]),
GriddedPerm(Perm((1, 0)), [(0, 1), (0, 1)]),
GriddedPerm(Perm((0, 1)), [(1, 0), (1, 0)]),
GriddedPerm(Perm((1, 0)), [(1, 0), (1, 0)]),
],
requirements=[
[
GriddedPerm(Perm((2, 0, 1)), [(1, 2), (2, 1), (2, 2)]),
GriddedPerm(Perm((0, 1)), [(1, 2), (2, 2)]),
],
[GriddedPerm(Perm((0,)), [(0, 0)])],
[GriddedPerm(Perm((0,)), [(0, 1)])],
[GriddedPerm(Perm((0,)), [(1, 0)])],
],
)
assert christian_til.rotate90() == rotate90til
assert rotate90til.rotate90().rotate90().rotate90() == christian_til
assert rotate90til.rotate180().rotate90() == christian_til
assert rotate90til.rotate270() == christian_til
assert rotate90til.rotate90() == christian_til.rotate180()
assert rotate90til.rotate180() == christian_til.rotate270()
rotate270til = Tiling(
obstructions=[
GriddedPerm(Perm((2, 1, 3, 0)), [(1, 1), (1, 1), (2, 2), (2, 0)]),
GriddedPerm(Perm((0, 1)), [(1, 2), (1, 2)]),
GriddedPerm(Perm((1, 0)), [(1, 2), (1, 2)]),
GriddedPerm(Perm((0, 1)), [(2, 1), (2, 1)]),
GriddedPerm(Perm((1, 0)), [(2, 1), (2, 1)]),
],
requirements=[
[
GriddedPerm(Perm((1, 2, 0)), [(0, 0), (0, 1), (1, 0)]),
GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)]),
],
[GriddedPerm(Perm((0,)), [(1, 2)])],
[GriddedPerm(Perm((0,)), [(2, 1)])],
[GriddedPerm(Perm((0,)), [(2, 2)])],
],
)
assert (
christian_til.rotate90().rotate90().rotate90()
== christian_til.rotate270()
== rotate270til
)
assert christian_til.rotate180().rotate180() == christian_til
revtil = Tiling(
obstructions=[
GriddedPerm(Perm((1, 3, 2, 0)), [(0, 0), (1, 1), (1, 1), (2, 0)]),
GriddedPerm(Perm((0, 1)), [(0, 1), (0, 1)]),
GriddedPerm(Perm((1, 0)), [(0, 1), (0, 1)]),
GriddedPerm(Perm((0, 1)), [(1, 0), (1, 0)]),
GriddedPerm(Perm((1, 0)), [(1, 0), (1, 0)]),
],
requirements=[
[
GriddedPerm(Perm((1, 2, 0)), [(1, 2), (2, 2), (2, 1)]),
GriddedPerm(Perm((0, 1)), [(2, 1), (2, 2)]),
],
[GriddedPerm(Perm((0,)), [(0, 0)])],
[GriddedPerm(Perm((0,)), [(0, 1)])],
[GriddedPerm(Perm((0,)), [(1, 0)])],
],
)
assert christian_til.reverse() == revtil
assert christian_til.reverse().reverse() == christian_til
assert christian_til.rotate270().reverse() == christian_til.inverse()
assert christian_til.reverse().rotate270() == christian_til.antidiagonal()
assert christian_til.inverse().inverse() == christian_til
assert christian_til.rotate180().reverse() == christian_til.complement()
assert christian_til.rotate90().reverse() == christian_til.antidiagonal()
def test_all_symmetries():
t = Tiling.from_string("123")
assert len(t.all_symmetries()) == 2
t = Tiling.from_string("1")
assert len(t.all_symmetries()) == 1
t = Tiling.from_string("1243")
assert len(t.all_symmetries()) == 4
t = Tiling.from_string("1342")
assert len(t.all_symmetries()) == 8
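The counts asserted above (2, 1, 4, 8) are the orbit sizes of the underlying patterns under the eight symmetries of the square. As an illustrative, self-contained sketch (independent of the `tilings`/`permuta` API; `symmetry_orbit` is a name invented here), the orbit can be generated from just `reverse`, `complement` and `inverse`:

```python
def symmetry_orbit(patt):
    """Closure of a 0-based pattern (a tuple) under the dihedral group D4,
    generated by reverse, complement and inverse."""
    n = len(patt)

    def reverse(p):
        return tuple(reversed(p))

    def complement(p):
        return tuple(n - 1 - v for v in p)

    def inverse(p):
        inv = [0] * n
        for i, v in enumerate(p):
            inv[v] = i
        return tuple(inv)

    orbit, frontier = {patt}, {patt}
    while frontier:
        # Apply each generator to every newly found pattern until closed.
        images = {g(p) for p in frontier for g in (reverse, complement, inverse)}
        frontier = images - orbit
        orbit |= frontier
    return orbit
```

With this sketch, `len(symmetry_orbit((0, 1, 2))) == 2` and `len(symmetry_orbit((0, 2, 3, 1))) == 8`, matching the counts the test expects from `all_symmetries`.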
def test_is_empty(compresstil, empty_tiling, finite_tiling):
assert not compresstil.is_empty()
assert not finite_tiling.is_empty()
assert empty_tiling.is_empty()
def test_is_finite(compresstil, empty_tiling, finite_tiling):
assert not compresstil.is_finite()
assert finite_tiling.is_finite()
def test_merge(compresstil, finite_tiling, empty_tiling):
assert finite_tiling.merge() == finite_tiling
assert compresstil.merge() == Tiling(
obstructions=compresstil.obstructions,
requirements=(
(
GriddedPerm(Perm((1, 0, 2, 3)), ((0, 0), (0, 0), (0, 0), (1, 1))),
GriddedPerm(Perm((1, 0, 2, 3)), ((0, 0), (0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 0), (0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((2, 0, 3, 1)), ((0, 0), (0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((2, 1, 3, 0)), ((0, 0), (0, 0), (0, 0), (2, 0))),
),
),
)
assert empty_tiling.merge() == Tiling([GriddedPerm.empty_perm()])
def test_point_cells(
compresstil,
finite_tiling,
empty_tiling,
christian_til,
typical_redundant_obstructions,
typical_redundant_requirements,
):
assert Tiling(
typical_redundant_obstructions, typical_redundant_requirements
).point_cells == set([(3, 1)])
assert compresstil.point_cells == set()
assert finite_tiling.point_cells == set()
assert empty_tiling.point_cells == set()
tiling = compresstil.add_single_cell_requirement(Perm((0,)), (1, 1))
assert tiling.point_cells == set([(1, 1)])
tiling = compresstil.add_single_cell_requirement(Perm((0,)), (2, 0))
assert tiling.point_cells == set([(1, 0)])
assert christian_til.point_cells == set([(1, 0), (2, 1)])
def test_positive_cells(compresstil, empty_tiling, finite_tiling, christian_til):
assert compresstil.positive_cells == set([(0, 0)])
assert finite_tiling.positive_cells == set([(0, 0)])
assert empty_tiling.positive_cells == set([(0, 0), (0, 1)])
assert christian_til.positive_cells == set([(1, 0), (0, 2), (0, 1), (2, 0), (2, 1)])
def test_dimensions(compresstil, empty_tiling, finite_tiling, christian_til):
assert empty_tiling.dimensions == (1, 2)
assert finite_tiling.dimensions == (1, 2)
assert compresstil.dimensions == (3, 2)
assert christian_til.dimensions == (3, 3)
def test_add_obstruction_in_all_ways():
initial_tiling = Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((0, 0),)),
GriddedPerm(Perm((0,)), ((0, 1),)),
GriddedPerm(Perm((0,)), ((0, 3),)),
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((1, 1),)),
GriddedPerm(Perm((0,)), ((1, 2),)),
GriddedPerm(Perm((0,)), ((2, 1),)),
GriddedPerm(Perm((0,)), ((2, 2),)),
GriddedPerm(Perm((0,)), ((2, 3),)),
GriddedPerm(Perm((0,)), ((3, 0),)),
GriddedPerm(Perm((0,)), ((3, 2),)),
GriddedPerm(Perm((0, 1)), ((0, 2), (0, 2))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((0, 2), (0, 2))),
GriddedPerm(Perm((1, 0)), ((2, 0), (2, 0))),
GriddedPerm(
Perm((3, 0, 2, 4, 1)), ((1, 3), (1, 3), (1, 3), (1, 3), (1, 3))
),
GriddedPerm(
Perm((3, 0, 2, 4, 1)), ((3, 1), (3, 1), (3, 1), (3, 1), (3, 1))
),
GriddedPerm(
Perm((3, 0, 2, 4, 1)), ((3, 3), (3, 3), (3, 3), (3, 3), (3, 3))
),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 2),)),),
(GriddedPerm(Perm((0,)), ((2, 0),)),),
),
)
final_tiling = Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((0, 0),)),
GriddedPerm(Perm((0,)), ((0, 1),)),
GriddedPerm(Perm((0,)), ((0, 3),)),
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((1, 1),)),
GriddedPerm(Perm((0,)), ((1, 2),)),
GriddedPerm(Perm((0,)), ((2, 1),)),
GriddedPerm(Perm((0,)), ((2, 2),)),
GriddedPerm(Perm((0,)), ((2, 3),)),
GriddedPerm(Perm((0,)), ((3, 0),)),
GriddedPerm(Perm((0,)), ((3, 2),)),
GriddedPerm(Perm((0, 1)), ((0, 2), (0, 2))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((0, 2), (0, 2))),
GriddedPerm(Perm((1, 0)), ((2, 0), (2, 0))),
GriddedPerm(Perm((1, 2, 0)), ((3, 1), (3, 3), (3, 1))),
GriddedPerm(Perm((2, 1, 3, 0)), ((1, 3), (3, 3), (3, 3), (3, 1))),
GriddedPerm(Perm((2, 1, 3, 0)), ((1, 3), (3, 3), (3, 3), (3, 3))),
GriddedPerm(
Perm((3, 0, 2, 4, 1)), ((1, 3), (1, 3), (1, 3), (1, 3), (1, 3))
),
GriddedPerm(
Perm((3, 0, 2, 4, 1)), ((1, 3), (1, 3), (1, 3), (1, 3), (3, 3))
),
GriddedPerm(
Perm((3, 0, 2, 4, 1)), ((1, 3), (1, 3), (1, 3), (3, 3), (3, 3))
),
GriddedPerm(
Perm((3, 0, 2, 4, 1)), ((3, 1), (3, 1), (3, 1), (3, 1), (3, 1))
),
GriddedPerm(
Perm((3, 0, 2, 4, 1)), ((3, 3), (3, 1), (3, 3), (3, 3), (3, 1))
),
GriddedPerm(
Perm((3, 0, 2, 4, 1)), ((3, 3), (3, 1), (3, 3), (3, 3), (3, 3))
),
GriddedPerm(
Perm((3, 0, 2, 4, 1)), ((3, 3), (3, 3), (3, 3), (3, 3), (3, 3))
),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 2),)),),
(GriddedPerm(Perm((0,)), ((2, 0),)),),
),
)
patt = Perm.to_standard((4, 1, 3, 5, 2))
assert initial_tiling.add_obstruction_in_all_ways(patt) == final_tiling
def test_sum_decomposition():
obs = [
GriddedPerm.single_cell(Perm((0, 1)), (0, 0)),
GriddedPerm.single_cell(Perm((0, 1)), (1, 2)),
GriddedPerm.single_cell(Perm((0, 1)), (2, 1)),
GriddedPerm.single_cell(Perm((0, 1)), (3, 2)),
GriddedPerm.single_cell(Perm((0, 1)), (4, 4)),
GriddedPerm.single_cell(Perm((0, 1)), (5, 3)),
]
reqs = []
t = Tiling(obs, reqs)
assert t.sum_decomposition() == [
[(0, 0)],
[(1, 2), (2, 1), (3, 2)],
[(4, 4), (5, 3)],
]
assert t.skew_decomposition() == [[(0, 0), (1, 2), (2, 1), (3, 2), (4, 4), (5, 3)]]
assert len(t.reverse().sum_decomposition()) == 1
assert len(t.reverse().skew_decomposition()) == 3
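On active cells alone, the decomposition asserted above can be reproduced by a greedy cut rule: split after a prefix of the (sorted) cells whenever it lies entirely below and to the left of the remaining suffix. This sketch is a simplification that ignores obstructions spanning several cells; `sum_decomposition_of_cells` is a name invented here, not the library's implementation.

```python
def sum_decomposition_of_cells(cells):
    """Greedy sum decomposition of a set of cells: cut wherever every cell
    so far is strictly below-left of every remaining cell."""
    cells = sorted(cells)  # sort by column, then row
    components, start = [], 0
    for i in range(1, len(cells)):
        left, right = cells[start:i], cells[i:]
        if (max(x for x, _ in left) < min(x for x, _ in right)
                and max(y for _, y in left) < min(y for _, y in right)):
            components.append(left)
            start = i
    components.append(cells[start:])
    return components
```

Applied to the six monotone cells in the test, this yields the same three components `[[(0, 0)], [(1, 2), (2, 1), (3, 2)], [(4, 4), (5, 3)]]`.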
def test_is_empty_cell(isolated_tiling):
assert isolated_tiling.is_empty_cell((0, 1))
assert not isolated_tiling.is_empty_cell((0, 0))
assert not isolated_tiling.is_empty_cell((2, 1))
def test_is_monotone_cell(isolated_tiling):
assert isolated_tiling.is_monotone_cell((0, 0))
assert isolated_tiling.is_monotone_cell((1, 0))
assert not isolated_tiling.is_monotone_cell((2, 1))
t = Tiling.from_string("123")
assert not t.is_monotone_cell((0, 0))
def test_repr(factorable_tiling, empty_tiling):
assert factorable_tiling == eval(repr(factorable_tiling))
assert empty_tiling == eval(repr(empty_tiling))
assert repr(Tiling()) == "Tiling(obstructions=(), requirements=(), assumptions=())"
# ------------------------------------------------------------
# Test for algorithms
# ------------------------------------------------------------
def test_fusion():
t = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((0, 1)), ((0, 0), (1, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
]
)
with pytest.raises(AssertionError):
t.fusion()
with pytest.raises(AssertionError):
t.fusion(row=0, col=1)
with pytest.raises(InvalidOperationError):
t.fusion(row=1)
with pytest.raises(InvalidOperationError):
t.fusion(col=3)
with pytest.raises(InvalidOperationError):
t.fusion(col=1)
assert t.fusion(col=0) == Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
]
)
def test_component_fusion():
t = Tiling(
obstructions=[
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((1, 1),)),
GriddedPerm(Perm((0,)), ((1, 2),)),
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 1))),
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 2))),
GriddedPerm(Perm((0, 1)), ((0, 0), (1, 3))),
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 2))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 1), (0, 1), (0, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 1), (0, 1), (1, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 2), (0, 2), (0, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 2), (0, 2), (1, 3))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 2, 1)), ((0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((0, 2, 1)), ((0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 2, 1)), ((1, 3), (1, 3), (1, 3))),
GriddedPerm(Perm((1, 0, 2)), ((1, 3), (1, 3), (1, 3))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 0), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 1), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 1), (0, 3), (0, 3), (1, 3))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 1), (0, 3), (1, 3), (1, 3))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 2), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 2), (0, 3), (0, 3), (1, 3))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 2), (0, 3), (1, 3), (1, 3))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 3), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 3), (0, 3), (0, 3), (1, 3))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 3), (0, 3), (1, 3), (1, 3))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 0), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 1), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 1), (0, 3), (0, 3), (1, 3))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 1), (0, 3), (1, 3), (1, 3))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 2), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 2), (0, 3), (0, 3), (1, 3))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 2), (0, 3), (1, 3), (1, 3))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 3), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 3), (0, 3), (0, 3), (1, 3))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 3), (0, 3), (1, 3), (1, 3))),
]
)
assert t.component_fusion(row=1) == Tiling(
obstructions=[
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((1, 1),)),
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 1))),
GriddedPerm(Perm((0, 1)), ((0, 0), (1, 2))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 2))),
GriddedPerm(Perm((0, 1, 2)), ((0, 1), (0, 1), (0, 2))),
GriddedPerm(Perm((0, 1, 2)), ((0, 1), (0, 1), (1, 2))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 2, 1)), ((0, 1), (0, 1), (0, 1))),
GriddedPerm(Perm((0, 2, 1)), ((1, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 2)), ((1, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 0), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 1), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 1), (0, 2), (0, 2), (1, 2))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 1), (0, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 2), (0, 2), (0, 2), (1, 2))),
GriddedPerm(Perm((0, 1, 3, 2)), ((0, 2), (0, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 0), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 1), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 1), (0, 2), (0, 2), (1, 2))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 1), (0, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 2), (0, 2), (0, 2), (1, 2))),
GriddedPerm(Perm((0, 2, 1, 3)), ((0, 2), (0, 2), (1, 2), (1, 2))),
]
)
with pytest.raises(AssertionError):
t.fusion()
with pytest.raises(AssertionError):
t.fusion(row=0, col=1)
with pytest.raises(InvalidOperationError):
t.fusion(row=5)
with pytest.raises(InvalidOperationError):
t.fusion(col=3)
with pytest.raises(InvalidOperationError):
t.fusion(col=1)
def test_find_factors(compresstil, factorable_tiling):
factors = compresstil.find_factors()
assert len(factors) == 1
assert factors[0] == compresstil
factors = factorable_tiling.find_factors(interleaving="none")
actual_factors = [
Tiling(
obstructions=[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 2, 1)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((2, 1, 0)), ((2, 1), (2, 1), (2, 1))),
GriddedPerm(Perm((2, 0, 1)), ((2, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((1, 2, 0)), ((3, 3), (3, 3), (3, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (2, 1))),
GriddedPerm(Perm((0, 1, 2, 3)), ((2, 1), (2, 1), (2, 2), (2, 2))),
],
requirements=[
[
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((3, 3), (3, 3))),
]
],
),
Tiling(
obstructions=[
GriddedPerm(Perm((1, 0, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((2, 0, 1)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
GriddedPerm(Perm((1, 0)), ((1, 0), (1, 0))),
],
requirements=[[GriddedPerm(Perm((0,)), ((1, 0),))]],
),
Tiling(obstructions=[GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0)))]),
]
assert len(factors) == len(actual_factors)
assert all(f in factors for f in actual_factors)
mon_int_factors = factorable_tiling.find_factors(interleaving="monotone")
actual_mon_int_factors = [
Tiling(
obstructions=[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 2, 1)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((2, 1, 0)), ((2, 1), (2, 1), (2, 1))),
GriddedPerm(Perm((2, 0, 1)), ((2, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((1, 2, 0)), ((3, 3), (3, 3), (3, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (2, 1))),
GriddedPerm(Perm((0, 1, 2, 3)), ((2, 1), (2, 1), (2, 2), (2, 2))),
],
requirements=[
[
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((3, 3), (3, 3))),
]
],
),
Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
GriddedPerm(Perm((1, 0)), ((1, 0), (1, 0))),
],
requirements=[[GriddedPerm(Perm((0,)), ((1, 0),))]],
),
Tiling(
obstructions=[
GriddedPerm(Perm((1, 0, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((2, 0, 1)), ((0, 0), (0, 0), (0, 0))),
]
),
Tiling(obstructions=[GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0)))]),
]
assert len(mon_int_factors) == len(actual_mon_int_factors)
assert all(f in mon_int_factors for f in actual_mon_int_factors)
int_factors = factorable_tiling.find_factors(interleaving="any")
actual_int_factors = [
Tiling(
obstructions=[
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((2, 1, 0)), ((1, 1), (1, 1), (1, 1))),
GriddedPerm(Perm((2, 0, 1)), ((1, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((1, 2, 0)), ((2, 3), (2, 3), (2, 3))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (1, 1))),
GriddedPerm(Perm((0, 1, 2, 3)), ((1, 1), (1, 1), (1, 2), (1, 2))),
],
requirements=[
[
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((2, 3), (2, 3))),
]
],
),
Tiling(obstructions=[GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 0), (0, 0)))]),
Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
GriddedPerm(Perm((1, 0)), ((1, 0), (1, 0))),
],
requirements=[[GriddedPerm(Perm((0,)), ((1, 0),))]],
),
Tiling(
obstructions=[
GriddedPerm(Perm((1, 0, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((2, 0, 1)), ((0, 0), (0, 0), (0, 0))),
]
),
Tiling(obstructions=[GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0)))]),
]
assert len(int_factors) == len(actual_int_factors)
assert all(f in int_factors for f in actual_int_factors)
with pytest.raises(InvalidOperationError):
compresstil.find_factors(interleaving="magic")
def test_row_and_column_separation():
separable_t = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1, 2)), ((0, 0),) * 3),
GriddedPerm(Perm((0, 1, 2)), ((0, 1),) * 3),
GriddedPerm(Perm((0, 1, 2)), ((0, 2),) * 3),
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 1))),
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 2))),
]
)
assert separable_t.row_and_column_separation() == Tiling(
obstructions=[
GriddedPerm(Perm((0, 1, 2)), ((1, 0),) * 3),
GriddedPerm(Perm((0, 1, 2)), ((0, 1),) * 3),
GriddedPerm(Perm((0, 1, 2)), ((0, 2),) * 3),
]
)
not_sep_t = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1, 2)), ((0, 0),) * 3),
GriddedPerm(Perm((0, 1, 2)), ((0, 1),) * 3),
GriddedPerm(Perm((0, 1, 2)), ((0, 2),) * 3),
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 1))),
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 2))),
]
)
assert not_sep_t.row_and_column_separation() == not_sep_t
need_two_sep_t = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((0, 1),) * 2),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0),) * 3),
GriddedPerm(Perm((0, 1, 2)), ((1, 0),) * 3),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 1), (1, 0))),
],
requirements=[[GriddedPerm(Perm((0,)), ((0, 1),))]],
)
assert need_two_sep_t.row_and_column_separation() == Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((1, 2),) * 2),
GriddedPerm(Perm((0, 1, 2)), ((0, 1),) * 3),
GriddedPerm(Perm((0, 1, 2)), ((2, 0),) * 3),
],
requirements=[[GriddedPerm(Perm((0,)), ((1, 2),))]],
)
def test_obstruction_transitivity():
t1 = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)]),
GriddedPerm(Perm((0, 1)), [(1, 0), (2, 0)]),
],
requirements=[[GriddedPerm(Perm((0,)), [(1, 0)])]],
)
assert t1.obstruction_transitivity() == Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), [(0, 0), (1, 0)]),
GriddedPerm(Perm((0, 1)), [(1, 0), (2, 0)]),
GriddedPerm(Perm((0, 1)), [(0, 0), (2, 0)]),
],
requirements=[[GriddedPerm(Perm((0,)), [(1, 0)])]],
)
# Tiling with no new obstruction
t2 = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), [(0, 0), (0, 1)]),
GriddedPerm(Perm((0, 1)), [(0, 1), (0, 2)]),
],
)
assert t2.obstruction_transitivity() == t2
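The inference behind `obstruction_transitivity` can be sketched independently of the Tiling class (a simplified model with a hypothetical helper name, covering only the increasing pattern 01 across cells): if 01 is forbidden from cell a to cell b and from b to c, and b is required to contain a point, then 01 from a to c is implied. This is exactly why t1 gains the ((0, 0), (2, 0)) obstruction while t2, which has no requirement, is unchanged.

```python
def close_increasing_obstructions(edges, required_cells):
    """Add implied 01-obstructions: forbidden a->b and b->c, with a
    point forced in b, implies forbidden a->c."""
    closed = set(edges)
    changed = True
    while changed:
        changed = False
        for a, b in list(closed):
            if b not in required_cells:
                continue  # without a forced point in b, nothing is implied
            for b2, c in list(closed):
                if b2 == b and (a, c) not in closed:
                    closed.add((a, c))
                    changed = True
    return closed
```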
def test_subobstruction_inferral(obs_inf_til):
assert obs_inf_til.subobstruction_inferral() == Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 2), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (0, 2))),
],
requirements=[[GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0)))]],
)
def test_all_obstruction_inferral(obs_inf_til):
assert obs_inf_til.all_obstruction_inferral(3) == Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 2), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (0, 2))),
],
requirements=[[GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0)))]],
)
def test_empty_cell_inferral():
t = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((1, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1)), ((2, 0), (3, 0))),
GriddedPerm(Perm((0, 1)), ((3, 0), (3, 0))),
GriddedPerm(Perm((1, 0)), ((1, 0), (1, 0))),
GriddedPerm(Perm((1, 0)), ((1, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((1, 0), (3, 0))),
GriddedPerm(Perm((1, 0)), ((2, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((2, 0), (3, 0))),
GriddedPerm(Perm((1, 0)), ((3, 0), (3, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (1, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (3, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (2, 0))),
GriddedPerm(Perm((0, 1, 2)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((0, 1, 2)), ((1, 0), (1, 0), (2, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((0, 0), (0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((0, 0), (0, 0), (0, 0), (1, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((0, 0), (0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((0, 0), (0, 0), (0, 0), (3, 0))),
],
requirements=[[GriddedPerm(Perm((0, 1)), ((1, 0), (2, 0)))]],
)
assert t.empty_cell_inferral() == Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((2, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((1, 0), (1, 0))),
GriddedPerm(Perm((1, 0)), ((1, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (1, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (0, 0), (2, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((0, 1, 2)), ((0, 0), (1, 0), (2, 0))),
GriddedPerm(Perm((0, 1, 2)), ((1, 0), (1, 0), (1, 0))),
GriddedPerm(Perm((0, 1, 2)), ((1, 0), (1, 0), (2, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((0, 0), (0, 0), (0, 0), (0, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((0, 0), (0, 0), (0, 0), (1, 0))),
GriddedPerm(Perm((3, 2, 1, 0)), ((0, 0), (0, 0), (0, 0), (2, 0))),
],
requirements=[[GriddedPerm(Perm((0, 1)), ((1, 0), (2, 0)))]],
)
def test_place_point_in_cell(obs_inf_til):
assert obs_inf_til.place_point_in_cell((0, 1), 0) == Tiling(
obstructions=[
GriddedPerm(Perm((0,)), ((0, 1),)),
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((1, 2),)),
GriddedPerm(Perm((0,)), ((2, 1),)),
GriddedPerm(Perm((0, 1)), ((1, 1), (1, 1))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((1, 1), (1, 1))),
GriddedPerm(Perm((1, 0)), ((2, 0), (2, 0))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 2), (2, 0))),
GriddedPerm(Perm((0, 2, 1)), ((0, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((0, 2, 1)), ((2, 0), (2, 2), (2, 2))),
GriddedPerm(Perm((2, 1, 0)), ((2, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (2, 2), (2, 2), (2, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((2, 2), (2, 2), (2, 2), (2, 2))),
],
requirements=(
(GriddedPerm(Perm((0,)), ((1, 1),)),),
(GriddedPerm(Perm((0,)), ((2, 0),)),),
),
)
def test_place_point_of_gridded_permutation(obs_inf_til):
gp = GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0)))
assert obs_inf_til.place_point_of_gridded_permutation(gp, 1, 2) == Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((0, 1),)),
GriddedPerm(Perm((0,)), ((0, 2),)),
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((1, 2),)),
GriddedPerm(Perm((0,)), ((1, 3),)),
GriddedPerm(Perm((0,)), ((1, 4),)),
GriddedPerm(Perm((0,)), ((2, 0),)),
GriddedPerm(Perm((0,)), ((2, 1),)),
GriddedPerm(Perm((0, 1)), ((0, 3), (0, 3))),
GriddedPerm(Perm((0, 1)), ((0, 3), (2, 3))),
GriddedPerm(Perm((0, 1)), ((1, 1), (1, 1))),
GriddedPerm(Perm((0, 1)), ((2, 3), (2, 3))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 3), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 3), (0, 3))),
GriddedPerm(Perm((1, 0)), ((0, 3), (2, 3))),
GriddedPerm(Perm((1, 0)), ((1, 1), (1, 1))),
GriddedPerm(Perm((1, 0)), ((2, 2), (2, 2))),
GriddedPerm(Perm((1, 0)), ((2, 3), (2, 3))),
GriddedPerm(Perm((1, 0)), ((2, 4), (2, 4))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 4), (0, 3))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 4), (0, 4))),
GriddedPerm(Perm((2, 1, 0)), ((2, 4), (2, 3), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 4), (2, 3), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 4), (2, 4), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 4), (2, 4), (2, 3))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 3), (0, 4), (0, 4), (0, 4))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 3), (0, 4), (0, 4), (2, 4))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 4), (0, 4), (0, 4), (0, 4))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 4), (0, 4), (0, 4), (2, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 4), (0, 0), (0, 4), (2, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 4), (0, 3), (0, 4), (0, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 4), (0, 3), (0, 4), (2, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 4), (0, 4), (0, 4), (0, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 4), (0, 4), (0, 4), (2, 4))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 3),)),),
(GriddedPerm(Perm((0,)), ((1, 1),)),),
),
)
def test_place_row(obs_inf_til):
assert set(obs_inf_til.place_row(2, 1)) == set(
[
Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((0, 3),)),
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((1, 1),)),
GriddedPerm(Perm((0,)), ((1, 2),)),
GriddedPerm(Perm((0,)), ((2, 3),)),
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((0, 1)), ((0, 1), (2, 1))),
GriddedPerm(Perm((0, 1)), ((1, 3), (1, 3))),
GriddedPerm(Perm((0, 1)), ((2, 1), (2, 1))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 1), (2, 1))),
GriddedPerm(Perm((1, 0)), ((1, 3), (1, 3))),
GriddedPerm(Perm((1, 0)), ((2, 0), (2, 0))),
GriddedPerm(Perm((1, 0)), ((2, 1), (2, 1))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (2, 1), (2, 0))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (2, 2), (2, 0))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (2, 2), (2, 1))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (2, 2), (2, 2))),
GriddedPerm(Perm((0, 2, 1)), ((0, 1), (2, 2), (2, 2))),
GriddedPerm(Perm((0, 2, 1)), ((0, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 2)), ((0, 1), (0, 0), (2, 2))),
GriddedPerm(Perm((1, 0, 2)), ((0, 2), (0, 0), (2, 2))),
GriddedPerm(Perm((1, 0, 2)), ((0, 2), (0, 1), (2, 2))),
GriddedPerm(Perm((1, 0, 2)), ((0, 2), (0, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 1), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 1), (2, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (2, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (2, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((2, 0), (2, 2), (2, 1), (2, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((2, 0), (2, 2), (2, 2), (2, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((2, 0), (2, 2), (2, 2), (2, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((2, 0), (2, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((2, 1), (2, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((2, 2), (2, 2), (2, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((2, 1), (2, 0), (2, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((2, 2), (2, 0), (2, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((2, 2), (2, 1), (2, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((2, 2), (2, 2), (2, 2), (2, 2))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((1, 3),)),),
(
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (2, 0))),
GriddedPerm(Perm((1, 0)), ((2, 1), (2, 0))),
),
),
)
]
)
def test_place_col(obs_inf_til):
assert set(obs_inf_til.place_col(0, 2)) == set(
[
Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((0, 0),)),
GriddedPerm(Perm((0,)), ((0, 2),)),
GriddedPerm(Perm((0,)), ((1, 1),)),
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((1, 0), (1, 0))),
GriddedPerm(Perm((0, 2, 1)), ((1, 0), (1, 2), (1, 2))),
GriddedPerm(Perm((2, 1, 0)), ((1, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 2), (1, 2), (1, 2), (1, 2))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 1),)),),
(GriddedPerm(Perm((0,)), ((1, 0),)),),
),
),
Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((0, 1),)),
GriddedPerm(Perm((0,)), ((0, 2),)),
GriddedPerm(Perm((0,)), ((0, 3),)),
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((0, 1)), ((1, 2), (1, 2))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((1, 1), (1, 1))),
GriddedPerm(Perm((1, 0)), ((1, 2), (1, 2))),
GriddedPerm(Perm((2, 1, 0)), ((1, 3), (1, 2), (1, 1))),
GriddedPerm(Perm((2, 1, 0)), ((1, 3), (1, 3), (1, 1))),
GriddedPerm(Perm((2, 1, 0)), ((1, 3), (1, 3), (1, 2))),
GriddedPerm(Perm((2, 1, 0)), ((1, 3), (1, 3), (1, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 2), (1, 1), (1, 3), (1, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 3), (1, 1), (1, 3), (1, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 3), (1, 2), (1, 3), (1, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 3), (1, 3), (1, 3), (1, 3))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 0),)),),
(GriddedPerm(Perm((1, 0)), ((1, 2), (1, 1))),),
),
),
Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((0, 0),)),
GriddedPerm(Perm((0,)), ((0, 1),)),
GriddedPerm(Perm((0,)), ((0, 2),)),
GriddedPerm(Perm((0,)), ((0, 4),)),
GriddedPerm(Perm((0,)), ((1, 3),)),
GriddedPerm(Perm((0, 1)), ((0, 3), (0, 3))),
GriddedPerm(Perm((0, 1)), ((1, 1), (1, 1))),
GriddedPerm(Perm((1, 0)), ((0, 3), (0, 3))),
GriddedPerm(Perm((1, 0)), ((1, 0), (1, 0))),
GriddedPerm(Perm((1, 0)), ((1, 1), (1, 1))),
GriddedPerm(Perm((0, 2, 1)), ((1, 0), (1, 4), (1, 4))),
GriddedPerm(Perm((0, 2, 1)), ((1, 1), (1, 4), (1, 4))),
GriddedPerm(Perm((0, 2, 1)), ((1, 2), (1, 4), (1, 4))),
GriddedPerm(Perm((2, 1, 0)), ((1, 4), (1, 4), (1, 4))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 0), (1, 2), (1, 1), (1, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 0), (1, 2), (1, 2), (1, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 0), (1, 2), (1, 2), (1, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 0), (1, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 0), (1, 4), (1, 1), (1, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 0), (1, 4), (1, 2), (1, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 0), (1, 4), (1, 2), (1, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 0), (1, 4), (1, 2), (1, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 1), (1, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 1), (1, 4), (1, 2), (1, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 2), (1, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((1, 2), (1, 4), (1, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 1), (1, 0), (1, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 1), (1, 0), (1, 4), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 2), (1, 0), (1, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 2), (1, 0), (1, 4), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 2), (1, 1), (1, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 2), (1, 1), (1, 4), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 2), (1, 2), (1, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 2), (1, 2), (1, 4), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((1, 4), (1, 4), (1, 4), (1, 4))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 3),)),),
(GriddedPerm(Perm((1, 0)), ((1, 1), (1, 0))),),
),
),
]
)
def test_partial_place_point_in_cell(obs_inf_til):
assert obs_inf_til.partial_place_point_in_cell((0, 0), 0) == Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((1, 1),)),
GriddedPerm(Perm((0,)), ((1, 2),)),
GriddedPerm(Perm((0,)), ((2, 0),)),
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((0, 1)), ((0, 1), (2, 1))),
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
GriddedPerm(Perm((0, 1)), ((2, 1), (2, 1))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 0), (1, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 1), (2, 1))),
GriddedPerm(Perm((1, 0)), ((1, 0), (1, 0))),
GriddedPerm(Perm((1, 0)), ((2, 1), (2, 1))),
GriddedPerm(Perm((1, 0)), ((2, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 1), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 1), (1, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (1, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (2, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (2, 2), (2, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (2, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 2), (2, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (2, 2))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 1),)),),
(GriddedPerm(Perm((0,)), ((1, 0),)),),
),
)
def test_partial_place_point_of_gridded_permutation(obs_inf_til):
gp = GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0)))
placed = obs_inf_til.partial_place_point_of_gridded_permutation(gp, 1, 1)
assert placed == Tiling(
obstructions=(
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((0, 1)), ((0, 3), (0, 3))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 2), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 2), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 2), (0, 2))),
GriddedPerm(Perm((1, 0)), ((0, 3), (0, 2))),
GriddedPerm(Perm((1, 0)), ((0, 3), (0, 3))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 4), (0, 3), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 4), (0, 3), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 4), (0, 4), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 4), (0, 4), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 4), (0, 4), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 4), (0, 4), (0, 3))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 4), (0, 4), (0, 4))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 4), (0, 4), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 4), (0, 4), (0, 3))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 4), (0, 4), (0, 4))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 4), (0, 4), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 4), (0, 4), (0, 3))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 4), (0, 4), (0, 4))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 3), (0, 4), (0, 4), (0, 4))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 4), (0, 4), (0, 4), (0, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 3), (0, 0), (0, 4), (0, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 3), (0, 1), (0, 4), (0, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 4), (0, 0), (0, 4), (0, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 4), (0, 1), (0, 4), (0, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 4), (0, 2), (0, 4), (0, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 4), (0, 3), (0, 4), (0, 4))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 4), (0, 4), (0, 4), (0, 4))),
),
requirements=((GriddedPerm(Perm((1, 0)), ((0, 3), (0, 1))),),),
)
def test_partial_place_row(obs_inf_til):
assert set(obs_inf_til.partial_place_row(2, 3)) == set(
[
Tiling(
obstructions=(
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((0, 1)), ((0, 2), (0, 2))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 1), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 3), (0, 1), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 3), (0, 2), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 3), (0, 2), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 3), (0, 3), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 3), (0, 3), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 3), (0, 3), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 3), (0, 3), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 3), (0, 3), (0, 3), (0, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 3), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 3), (0, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 3), (0, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 3), (0, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 3), (0, 0), (0, 3), (0, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 3), (0, 1), (0, 3), (0, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 3), (0, 2), (0, 3), (0, 3))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 3), (0, 3), (0, 3), (0, 3))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 2),)),),
(GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0))),),
),
)
]
)
def test_partial_place_col(obs_inf_til):
assert set(obs_inf_til.partial_place_col(0, 0)) == set(
[
Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((1, 2),)),
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((0, 1)), ((0, 1), (1, 1))),
GriddedPerm(Perm((0, 1)), ((1, 1), (1, 1))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((0, 1), (1, 1))),
GriddedPerm(Perm((1, 0)), ((1, 1), (1, 1))),
GriddedPerm(Perm((0, 2, 1)), ((0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 1), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (0, 2))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((1, 1),)),),
(GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0))),),
),
),
Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((1, 1),)),
GriddedPerm(Perm((0,)), ((1, 2),)),
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 0), (1, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((1, 0), (1, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 1), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 1), (1, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (1, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (0, 2))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((0, 1),)),),
(GriddedPerm(Perm((0,)), ((1, 0),)),),
),
),
Tiling(
obstructions=(
GriddedPerm(Perm((0,)), ((1, 0),)),
GriddedPerm(Perm((0,)), ((1, 1),)),
GriddedPerm(Perm((0, 1)), ((0, 1), (0, 1))),
GriddedPerm(Perm((0, 1)), ((1, 2), (1, 2))),
GriddedPerm(Perm((1, 0)), ((0, 0), (0, 0))),
GriddedPerm(Perm((1, 0)), ((0, 1), (0, 1))),
GriddedPerm(Perm((1, 0)), ((1, 2), (1, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 1), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 0))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 1))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 0), (0, 2), (0, 2), (1, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 1), (0, 2), (0, 2), (1, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((0, 3, 2, 1)), ((0, 2), (0, 2), (0, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 1), (0, 0), (0, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 0), (0, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 1), (0, 2), (1, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (0, 2))),
GriddedPerm(Perm((1, 0, 3, 2)), ((0, 2), (0, 2), (0, 2), (1, 2))),
),
requirements=(
(GriddedPerm(Perm((0,)), ((1, 2),)),),
(GriddedPerm(Perm((1, 0)), ((0, 1), (0, 0))),),
),
),
]
)
def test_empty_obstruction():
t = Tiling((GriddedPerm.empty_perm(),))
assert t.forward_cell_map == {}
assert t.obstructions == (GriddedPerm.empty_perm(),)
def test_point_obstruction():
t = Tiling((GriddedPerm(Perm((0,)), ((0, 0),)),))
assert t.forward_cell_map == {}
assert t.obstructions == (GriddedPerm(Perm((0,)), ((0, 0),)),)
class TestGetGenf:
"""
Group all the tests regarding getting the generating function for a tiling.
"""
def test_empty_tiling(self):
t = Tiling(
[GriddedPerm(Perm((0, 1)), [(0, 0), (0, 0)])],
[[GriddedPerm(Perm((0, 1)), [(0, 0), (0, 0)])]],
)
assert t.get_genf() == sympy.sympify("0")
def test_monotone_cell(self):
t = Tiling([GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0)))])
assert sympy.simplify(t.get_genf() - sympy.sympify("1/(1-x)")) == 0
def test_with_req(self):
t = Tiling(
[GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0)))],
[[GriddedPerm(Perm((0,)), ((0, 0),))]],
)
assert sympy.simplify(t.get_genf() - sympy.sympify("x/(1-x)")) == 0
def test_adjacent_monotone(self):
t = Tiling(
[
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((0, 1)), ((1, 0), (1, 0))),
]
)
assert sympy.simplify(t.get_genf() - sympy.sympify("1/(1-2*x)")) == 0
def test_with_list_req(self):
t = Tiling(
[
GriddedPerm(Perm((0, 1)), ((0, 0), (0, 0))),
GriddedPerm(Perm((0, 1)), ((1, 1), (1, 1))),
],
[
[GriddedPerm(Perm((0,)), ((1, 1),))],
[GriddedPerm(Perm((0,)), ((0, 0),))],
],
)
assert sympy.simplify(t.get_genf() - sympy.sympify("(x/(1-x))**2")) == 0
def test_locally_factorable(self):
t = Tiling(
obstructions=[
GriddedPerm(Perm((0, 1)), ((0, 0), (1, 1))),
GriddedPerm(Perm((0, 1)), ((0, 0),) * 2),
GriddedPerm(Perm((0, 1)), ((0, 1),) * 2),
GriddedPerm(Perm((0, 1)), ((1, 1),) * 2),
]
)
assert (
sympy.simplify(t.get_genf() - sympy.sympify("1 / (2*x**2 - 3*x + 1)")) == 0
)
def test_not_enumerable(self):
t = Tiling.from_string("1324")
with pytest.raises(NotImplementedError):
t.get_genf()
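The closed forms asserted in `TestGetGenf` can be cross-checked by expanding rational generating functions into counting sequences with plain power-series division (a self-contained sketch, independent of both sympy and the Tiling API; the rational functions are the ones the tests assert, not an independent derivation):

```python
def series_coeffs(num, den, n_terms):
    """First n_terms coefficients of num(x)/den(x); num and den are
    coefficient lists, constant term first, den[0] nonzero."""
    coeffs = []
    padded = list(num) + [0] * n_terms
    for n in range(n_terms):
        acc = padded[n]
        for k in range(1, min(n, len(den) - 1) + 1):
            acc -= den[k] * coeffs[n - k]
        coeffs.append(acc / den[0])
    return coeffs


# 1/(1 - 2*x), as in test_adjacent_monotone: 2**n gridded permutations
assert series_coeffs([1], [1, -2], 5) == [1, 2, 4, 8, 16]
# (x/(1 - x))**2 = x**2/(1 - 2*x + x**2), as in test_with_list_req: n - 1
assert series_coeffs([0, 0, 1], [1, -2, 1], 6) == [0, 0, 1, 2, 3, 4]
```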
import os
import os.path
import pytest
import testandconquer.scheduler
from testandconquer.model import Failure, Location, ReportItem, SuiteItem, Tag
from tests.IT import run_test, assert_outcomes
from tests.mock.client import MockClient
from tests.mock.settings import MockSettings
from tests.mock.scheduler import MockScheduler
def test_function_pass(testdir):
test_file = 'fixtures/test_function_pass.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_pass', 1)),
]
assert scheduler.report_items == [
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_pass', 1), 'passed'),
]
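`assert_outcomes` is imported from the test package's helpers; its bookkeeping presumably amounts to tallying pytest's per-test outcomes, which can be sketched without pytest (hypothetical helper name, not the real implementation):

```python
def tally_outcomes(report_items):
    """Count outcomes from (test_name, outcome) pairs,
    e.g. [('test_pass', 'passed')] -> {'passed': 1}."""
    counts = {}
    for _, outcome in report_items:
        counts[outcome] = counts.get(outcome, 0) + 1
    return counts
```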
def test_function_fail(testdir):
test_file = 'fixtures/test_function_fail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, failed=1)
assert scheduler.report_items == [
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_fail', 1), 'failed',
Failure('AssertionError', 'assert (2 + 2) == 22')),
]
def test_function_skip(testdir):
test_file = 'fixtures/test_function_skip.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, skipped=1)
assert scheduler.report_items == [
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_skip', 4), 'skipped'),
]
def test_function_xfail(testdir):
test_file = 'fixtures/test_function_xfail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=0)
assert scheduler.report_items == [
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_xfail', 4), 'skipped',
Failure('AssertionError', 'assert (1 + 2) == 12')),
]
def test_function_setup(testdir):
test_file = 'fixtures/test_function_setup.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('setup', Location(test_file, module_for(test_file), None, 'setup_function', 1), scope='function'),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test', 5)),
]
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), None, 'setup_function', 1), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test', 5), 'passed'),
]
def test_function_setup_fail(testdir):
test_file = 'fixtures/test_function_setup_fail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, error=1)
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), None, 'setup_function', 1), 'failed',
Failure('Exception', 'setup failed')),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test', 5), 'failed'),
]
def test_function_teardown(testdir):
test_file = 'fixtures/test_function_teardown.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('teardown', Location(test_file, module_for(test_file), None, 'teardown_function', 1), scope='function'),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test', 5)),
]
assert scheduler.report_items == [
ReportItem('teardown', Location(test_file, module_for(test_file), None, 'teardown_function', 1), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test', 5), 'passed'),
]
def test_function_teardown_fail(testdir):
test_file = 'fixtures/test_function_teardown_fail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, error=1, passed=1)
assert scheduler.report_items == [
ReportItem('teardown', Location(test_file, module_for(test_file), None, 'teardown_function', 1), 'failed',
Failure('Exception', 'teardown failed')),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test', 5), 'passed'),
]
def test_function_parameterized(testdir):
test_file = 'fixtures/test_function_param.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=3)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_param[2+4-6]', 4)),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_param[3+5-8]', 4)),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_param[6*9-54]', 4)),
]
assert scheduler.report_items == [
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_param[2+4-6]', 4), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_param[3+5-8]', 4), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_param[6*9-54]', 4), 'passed'),
]
def test_function_tag(testdir):
test_file = 'fixtures/test_function_tag.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_pass', 4), tags=[Tag('test', False)]),
]
assert scheduler.report_items == [
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_pass', 4), 'passed'),
]
def test_module_setup(testdir):
test_file = 'fixtures/test_module_setup.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=2)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('setup', Location(test_file, module_for(test_file), None, 'setup_module', 4), scope='module'),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test1', 9)),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test2', 14)),
]
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), None, 'setup_module', 4), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test1', 9), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test2', 14), 'passed'),
]
def test_module_setup_fail(testdir):
test_file = 'fixtures/test_module_setup_fail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, error=1)
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), None, 'setup_module', 1), 'failed',
Failure('Exception', 'setup failed')),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test', 5), 'failed'),
]
def test_module_teardown(testdir):
test_file = 'fixtures/test_module_teardown.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('teardown', Location(test_file, module_for(test_file), None, 'teardown_module', 4), scope='module'),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test', 9)),
]
assert scheduler.report_items == [
ReportItem('teardown', Location(test_file, module_for(test_file), None, 'teardown_module', 4), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test', 9), 'passed'),
]
def test_module_teardown_fail(testdir):
test_file = 'fixtures/test_module_teardown_fail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, error=1, passed=1)
assert scheduler.report_items == [
ReportItem('teardown', Location(test_file, module_for(test_file), None, 'teardown_module', 1), 'failed',
Failure('Exception', 'teardown failed')),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test', 5), 'passed'),
]
def test_fixture(testdir):
test_file = 'fixtures/test_fixture.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture', 4)),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 9), deps=[
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture', 4)),
]),
]
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), None, 'fixture', 4), 'passed'),
ReportItem('teardown', Location(test_file, module_for(test_file), None, 'fixture', 4), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 9), 'passed'),
]
def test_fixtures(testdir):
test_file = 'fixtures/test_fixtures.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture1', 4)),
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture2', 9)),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixtures', 14), deps=[
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture1', 4)),
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture2', 9)),
]),
]
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), None, 'fixture1', 4), 'passed'),
ReportItem('setup', Location(test_file, module_for(test_file), None, 'fixture2', 9), 'passed'),
ReportItem('teardown', Location(test_file, module_for(test_file), None, 'fixture1', 4), 'passed'),
ReportItem('teardown', Location(test_file, module_for(test_file), None, 'fixture2', 9), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixtures', 14), 'passed'),
]
def test_fixture_nested(testdir):
test_file = 'fixtures/test_fixture_nested.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture1', 4)),
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture2', 9)),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 14), deps=[
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture1', 4)),
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture2', 9)),
]),
]
# note that nested fixtures are evaluated sequentially, one _after_ the other
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), None, 'fixture1', 4), 'passed'),
ReportItem('setup', Location(test_file, module_for(test_file), None, 'fixture2', 9), 'passed'),
ReportItem('teardown', Location(test_file, module_for(test_file), None, 'fixture1', 4), 'passed'),
ReportItem('teardown', Location(test_file, module_for(test_file), None, 'fixture2', 9), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 14), 'passed'),
]
def test_fixture_session(testdir):
test_file = 'fixtures/test_fixture_session.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('file', Location('fixtures/conftest.py'), size=42),
SuiteItem('file', Location(test_file), size=42),
SuiteItem('fixture', Location('fixtures/conftest.py', 'conftest', None, 'fixture_session', 4)),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 1), deps=[
SuiteItem('fixture', Location('fixtures/conftest.py', 'conftest', None, 'fixture_session', 4)),
]),
]
assert scheduler.report_items == [
ReportItem('setup', Location('fixtures/conftest.py', 'conftest', None, 'fixture_session', 4), 'passed'),
ReportItem('teardown', Location('fixtures/conftest.py', 'conftest', None, 'fixture_session', 4), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 1), 'passed'),
]
def test_fixture_missing(testdir):
test_file = 'fixtures/test_fixture_missing.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, error=1)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_with_missing_fixture', 1)),
]
assert scheduler.report_items == [
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_with_missing_fixture', 1), 'failed'),
]
def test_fixture_import(testdir):
test_file = 'fixtures/test_fixture_import.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('file', Location('fixtures/fixture.py'), size=42),
SuiteItem('file', Location(test_file), size=42),
SuiteItem('fixture', Location('fixtures/fixture.py', 'fixture', None, 'fixture_import', 4)),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 5), deps=[
SuiteItem('fixture', Location('fixtures/fixture.py', 'fixture', None, 'fixture_import', 4)),
]),
]
assert scheduler.report_items == [
ReportItem('setup', Location('fixtures/fixture.py', 'fixture', None, 'fixture_import', 4), 'passed'),
ReportItem('teardown', Location('fixtures/fixture.py', 'fixture', None, 'fixture_import', 4), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 5), 'passed'),
]
def test_fixture_fail(testdir):
test_file = 'fixtures/test_fixture_fail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, error=1)
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture', 4)),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 9), deps=[
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture', 4)),
]),
]
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), None, 'fixture', 4), 'error',
Failure('Exception', 'setup failed')),
ReportItem('teardown', Location(test_file, module_for(test_file), None, 'fixture', 4), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 9), 'failed'),
]
def test_fixture_tag(testdir):
test_file = 'fixtures/test_fixture_tag.py'
(_, scheduler) = run_test(testdir, [test_file])
assert scheduler.suite_items == [
SuiteItem('file', Location(test_file), size=42),
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture', 4), tags=[Tag('my_group', True)]),
SuiteItem('test', Location(test_file, module_for(test_file), None, 'test_with_fixture', 10), deps=[
SuiteItem('fixture', Location(test_file, module_for(test_file), None, 'fixture', 4), tags=[Tag('my_group', True)]),
]),
]
def test_multiple_test_files(testdir):
(result, scheduler) = run_test(testdir, ['fixtures/test_function_fail.py', 'fixtures/test_function_pass.py', 'fixtures/test_function_skip.py'])
assert_outcomes(result, failed=1, passed=1, skipped=1)
assert len(scheduler.report_items) == 3
def test_class(testdir):
test_file = 'fixtures/test_class.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('class', Location(test_file, module_for(test_file), 'TestObject', None, 1)),
SuiteItem('file', Location(test_file), size=42),
SuiteItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test', 3)),
]
assert scheduler.report_items == [
ReportItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test', 3), 'passed'),
]
def test_class_tags(testdir):
test_file = 'fixtures/test_class_tag.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('class', Location(test_file, module_for(test_file), 'TestObject', None, 5), tags=[Tag('my_group', False)]),
SuiteItem('file', Location(test_file), size=42),
SuiteItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test', 7)),
]
def test_class_inheritance(testdir):
(result, scheduler) = run_test(testdir, ['fixtures/test_class_inheritance_1.py', 'fixtures/test_class_inheritance_2.py'])
assert_outcomes(result, passed=3)
assert scheduler.suite_items == [
SuiteItem('class', Location('fixtures/test_class_inheritance_1.py', 'test_class_inheritance_1', 'TestObject1', None, 1)),
SuiteItem('class', Location('fixtures/test_class_inheritance_2.py', 'test_class_inheritance_2', 'TestObject2', None, 4)),
SuiteItem('file', Location('fixtures/test_class_inheritance_1.py'), size=42),
SuiteItem('file', Location('fixtures/test_class_inheritance_2.py'), size=42),
SuiteItem('setup', Location('fixtures/test_class_inheritance_1.py', 'test_class_inheritance_1', 'TestObject1', 'setup_class', 3), scope='class'),
SuiteItem('setup', Location('fixtures/test_class_inheritance_2.py', 'test_class_inheritance_2', 'TestObject2', 'setup_class', 3), scope='class'),
SuiteItem('teardown', Location('fixtures/test_class_inheritance_1.py', 'test_class_inheritance_1', 'TestObject1', 'teardown_class', 7), scope='class'),
SuiteItem('teardown', Location('fixtures/test_class_inheritance_2.py', 'test_class_inheritance_2', 'TestObject2', 'teardown_class', 7), scope='class'),
SuiteItem('test', Location('fixtures/test_class_inheritance_1.py', 'test_class_inheritance_1', 'TestObject1', 'test1', 11)),
SuiteItem('test', Location('fixtures/test_class_inheritance_2.py', 'test_class_inheritance_2', 'TestObject2', 'test1', 11)),
SuiteItem('test', Location('fixtures/test_class_inheritance_2.py', 'test_class_inheritance_2', 'TestObject2', 'test2', 6)),
]
assert scheduler.report_items == [
ReportItem('setup', Location('fixtures/test_class_inheritance_1.py', 'test_class_inheritance_1', 'TestObject1', 'setup_class', 3), 'passed'),
ReportItem('setup', Location('fixtures/test_class_inheritance_2.py', 'test_class_inheritance_2', 'TestObject2', 'setup_class', 3), 'passed'),
ReportItem('teardown', Location('fixtures/test_class_inheritance_1.py', 'test_class_inheritance_1', 'TestObject1', 'teardown_class', 7), 'passed'),
ReportItem('teardown', Location('fixtures/test_class_inheritance_2.py', 'test_class_inheritance_2', 'TestObject2', 'teardown_class', 7), 'passed'),
ReportItem('test', Location('fixtures/test_class_inheritance_1.py', 'test_class_inheritance_1', 'TestObject1', 'test1', 11), 'passed'),
ReportItem('test', Location('fixtures/test_class_inheritance_2.py', 'test_class_inheritance_2', 'TestObject2', 'test1', 11), 'passed'),
ReportItem('test', Location('fixtures/test_class_inheritance_2.py', 'test_class_inheritance_2', 'TestObject2', 'test2', 6), 'passed'),
]
def test_class_setup(testdir):
test_file = 'fixtures/test_class_setup.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=2)
assert scheduler.suite_items == [
SuiteItem('class', Location(test_file, module_for(test_file), 'TestObject', None, 4)),
SuiteItem('file', Location(test_file), size=42),
SuiteItem('setup', Location(test_file, module_for(test_file), 'TestObject', 'setup_class', 6), scope='class'),
SuiteItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test1', 11)),
SuiteItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test2', 15)),
]
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), 'TestObject', 'setup_class', 6), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test1', 11), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test2', 15), 'passed'),
]
def test_class_setup_tag(testdir):
test_file = 'fixtures/test_class_setup_tag.py'
(_, scheduler) = run_test(testdir, [test_file])
assert scheduler.suite_items == [
SuiteItem('class', Location(test_file, module_for(test_file), 'TestObject', None, 4)),
SuiteItem('file', Location(test_file), size=42),
SuiteItem('setup', Location(test_file, module_for(test_file), 'TestObject', 'setup_class', 6), scope='class', tags=[Tag('my_group', False)]),
]
def test_class_param(testdir):
test_file = 'fixtures/test_class_param.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=2)
assert scheduler.suite_items == [
SuiteItem('class', Location(test_file, module_for(test_file), 'TestObject', None, 4)),
SuiteItem('file', Location(test_file), size=42),
SuiteItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test_param[2+4-6]', 6)),
SuiteItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test_param[3+5-8]', 6)),
]
def test_class_decorator(testdir):
test_file = 'fixtures/test_class_decorator.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.suite_items == [
SuiteItem('class', Location(test_file, module_for(test_file), 'TestObject', None, 4)),
SuiteItem('file', Location(test_file), size=42),
SuiteItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test', 6)),
]
def test_class_nested(testdir):
test_file = 'fixtures/test_class_nested.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=2)
assert scheduler.suite_items == [
SuiteItem('class', Location(test_file, module_for(test_file), 'TestOuter', None, 1)),
SuiteItem('class', Location(test_file, module_for(test_file), 'TestOuter.TestInner', None, 3)),
SuiteItem('file', Location(test_file), size=42),
SuiteItem('setup', Location(test_file, module_for(test_file), 'TestOuter', 'setup_class', 22), scope='class'),
SuiteItem('setup', Location(test_file, module_for(test_file), 'TestOuter', 'setup_method', 30), scope='method'),
SuiteItem('setup', Location(test_file, module_for(test_file), 'TestOuter.TestInner', 'setup_class', 5), scope='class'),
SuiteItem('setup', Location(test_file, module_for(test_file), 'TestOuter.TestInner', 'setup_method', 13), scope='method'),
SuiteItem('teardown', Location(test_file, module_for(test_file), 'TestOuter', 'teardown_class', 26), scope='class'),
SuiteItem('teardown', Location(test_file, module_for(test_file), 'TestOuter', 'teardown_method', 33), scope='method'),
SuiteItem('teardown', Location(test_file, module_for(test_file), 'TestOuter.TestInner', 'teardown_class', 9), scope='class'),
SuiteItem('teardown', Location(test_file, module_for(test_file), 'TestOuter.TestInner', 'teardown_method', 16), scope='method'),
SuiteItem('test', Location(test_file, module_for(test_file), 'TestOuter', 'test', 36)),
SuiteItem('test', Location(test_file, module_for(test_file), 'TestOuter.TestInner', 'test', 19)),
]
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), 'TestOuter', 'setup_class', 22), 'passed'),
ReportItem('setup', Location(test_file, module_for(test_file), 'TestOuter', 'setup_method', 30), 'passed'),
ReportItem('setup', Location(test_file, module_for(test_file), 'TestOuter.TestInner', 'setup_class', 5), 'passed'),
ReportItem('setup', Location(test_file, module_for(test_file), 'TestOuter.TestInner', 'setup_method', 13), 'passed'),
ReportItem('teardown', Location(test_file, module_for(test_file), 'TestOuter', 'teardown_class', 26), 'passed'),
ReportItem('teardown', Location(test_file, module_for(test_file), 'TestOuter', 'teardown_method', 33), 'passed'),
ReportItem('teardown', Location(test_file, module_for(test_file), 'TestOuter.TestInner', 'teardown_class', 9), 'passed'),
ReportItem('teardown', Location(test_file, module_for(test_file), 'TestOuter.TestInner', 'teardown_method', 16), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), 'TestOuter', 'test', 36), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), 'TestOuter.TestInner', 'test', 19), 'passed'),
]
def test_class_setup_fail(testdir):
test_file = 'fixtures/test_class_setup_fail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, error=1)
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), 'TestObject', 'setup_class', 3), 'failed',
Failure('Exception', 'setup failed')),
ReportItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test', 7), 'failed'),
]
def test_class_method_setup_fail(testdir):
test_file = 'fixtures/test_class_method_setup_fail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, error=1)
assert scheduler.suite_items == [
SuiteItem('class', Location(test_file, module_for(test_file), 'TestObject', None, 1)),
SuiteItem('file', Location(test_file), size=42),
SuiteItem('setup', Location(test_file, module_for(test_file), 'TestObject', 'setup_method', 3), scope='method'),
SuiteItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test', 6)),
]
assert scheduler.report_items == [
ReportItem('setup', Location(test_file, module_for(test_file), 'TestObject', 'setup_method', 3), 'failed',
Failure('Exception', 'setup failed')),
ReportItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test', 6), 'failed'),
]
def test_class_teardown(testdir):
test_file = 'fixtures/test_class_teardown.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, passed=1)
assert scheduler.report_items == [
ReportItem('teardown', Location(test_file, module_for(test_file), 'TestObject', 'teardown_class', 3), 'passed'),
ReportItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test', 7), 'passed'),
]
def test_class_teardown_fail(testdir):
test_file = 'fixtures/test_class_teardown_fail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, error=1, passed=1)
assert scheduler.report_items == [
ReportItem('teardown', Location(test_file, module_for(test_file), 'TestObject', 'teardown_class', 3), 'failed',
Failure('Exception', 'teardown failed')),
ReportItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test', 7), 'passed'),
]
def test_class_method_teardown_fail(testdir):
test_file = 'fixtures/test_class_method_teardown_fail.py'
(result, scheduler) = run_test(testdir, [test_file])
assert_outcomes(result, error=1, passed=1)
assert scheduler.report_items == [
ReportItem('teardown', Location(test_file, module_for(test_file), 'TestObject', 'teardown_method', 3), 'failed',
Failure('Exception', 'teardown failed')),
ReportItem('test', Location(test_file, module_for(test_file), 'TestObject', 'test', 6), 'passed'),
]
def test_package(testdir):
init_file = 'fixtures/package/__init__.py'
test_file = 'fixtures/package/test_pass.py'
(result, scheduler) = run_test(testdir, [init_file, test_file])
assert_outcomes(result, passed=1)
def test_collect_only_mode(testdir):
test_file = 'fixtures/test_class.py'
(result, scheduler) = run_test(testdir, [test_file], ['--conquer', '--collect-only'])
assert_outcomes(result)
assert len(testandconquer.plugin.suite_items) == 3
assert scheduler is None
def test_disabled_plugin(testdir):
test_file = 'fixtures/test_class.py'
(result, scheduler) = run_test(testdir, [test_file], [])
assert_outcomes(result, passed=1)
assert testandconquer.plugin.suite_items == []
assert scheduler is None
def test_settings(testdir):
run_test(testdir, ['fixtures/test_class.py'])
settings = testandconquer.plugin.schedulers[0].settings
assert settings.client_workers == 1
assert settings.runner_name == 'pytest'
assert settings.runner_plugins == [('pytest-asyncio', '0.10.0'), ('pytest-conquer', '1.0.0'), ('pytest-cov', '2.9.0'), ('pytest-mock', '1.11.2')]
assert settings.runner_root == os.getcwd()
assert settings.runner_version == pytest.__version__
# ================================ HELPERS ================================
def module_for(file):
return file.replace('fixtures/', '').replace('.py', '')
@pytest.fixture(autouse=True)
def mock_client():
# NOTE: patching has no effect if the plugin has already imported the original class
previous_client = testandconquer.client.Client
testandconquer.client.Client = MockClient
yield
testandconquer.client.Client = previous_client
@pytest.fixture(autouse=True)
def mock_schedule():
# NOTE: patching has no effect if the plugin has already imported the original class
previous_scheduler = testandconquer.scheduler.Scheduler
testandconquer.scheduler.Scheduler = MockScheduler
yield
testandconquer.scheduler.Scheduler = previous_scheduler
@pytest.fixture(autouse=True)
def mock_settings():
# NOTE: patching has no effect if the plugin has already imported the original class
previous = testandconquer.settings.Settings
testandconquer.settings.Settings = MockSettings
yield
testandconquer.settings.Settings = previous
# polly/lib/External/isl/interface/isl.py (Apache-2.0)
isl_dlname='libisl.so.23'
import os
from ctypes import *
from ctypes.util import find_library
isl_dyld_library_path = os.environ.get('ISL_DYLD_LIBRARY_PATH')
if isl_dyld_library_path is not None:
os.environ['DYLD_LIBRARY_PATH'] = isl_dyld_library_path
try:
isl = cdll.LoadLibrary(isl_dlname)
except:
isl = cdll.LoadLibrary(find_library("isl"))
libc = cdll.LoadLibrary(find_library("c"))
class Error(Exception):
pass
class Context:
defaultInstance = None
def __init__(self):
ptr = isl.isl_ctx_alloc()
self.ptr = ptr
def __del__(self):
isl.isl_ctx_free(self)
def from_param(self):
return c_void_p(self.ptr)
@staticmethod
def getDefaultInstance():
if Context.defaultInstance == None:
Context.defaultInstance = Context()
return Context.defaultInstance
isl.isl_ctx_alloc.restype = c_void_p
isl.isl_ctx_free.argtypes = [Context]
class union_pw_multi_aff(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is multi_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_pw_multi_aff_from_multi_aff(isl.isl_multi_aff_copy(args[0].ptr))
return
if len(args) == 1 and args[0].__class__ is pw_multi_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_pw_multi_aff_from_pw_multi_aff(isl.isl_pw_multi_aff_copy(args[0].ptr))
return
if len(args) == 1 and args[0].__class__ is union_pw_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_pw_multi_aff_from_union_pw_aff(isl.isl_union_pw_aff_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_pw_multi_aff_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_union_pw_multi_aff_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ptr = isl.isl_union_pw_multi_aff_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.union_pw_multi_aff("""%s""")' % s
else:
return 'isl.union_pw_multi_aff("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_multi_aff:
arg1 = union_pw_multi_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_add(isl.isl_union_pw_multi_aff_copy(arg0.ptr), isl.isl_union_pw_multi_aff_copy(arg1.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def apply(*args):
if len(args) == 2 and args[1].__class__ is union_pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_union_pw_multi_aff_apply_union_pw_multi_aff(isl.isl_union_pw_multi_aff_copy(args[0].ptr), isl.isl_union_pw_multi_aff_copy(args[1].ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def as_multi_union_pw_aff(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_as_multi_union_pw_aff(isl.isl_union_pw_multi_aff_copy(arg0.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def as_pw_multi_aff(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_as_pw_multi_aff(isl.isl_union_pw_multi_aff_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def as_union_map(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_as_union_map(isl.isl_union_pw_multi_aff_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def coalesce(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_coalesce(isl.isl_union_pw_multi_aff_copy(arg0.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def domain(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_domain(isl.isl_union_pw_multi_aff_copy(arg0.ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
@staticmethod
def empty(*args):
if len(args) == 0:
ctx = Context.getDefaultInstance()
res = isl.isl_union_pw_multi_aff_empty_ctx(ctx)
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def extract_pw_multi_aff(arg0, arg1):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is space:
arg1 = space(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_extract_pw_multi_aff(arg0.ptr, isl.isl_space_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def flat_range_product(arg0, arg1):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_multi_aff:
arg1 = union_pw_multi_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_flat_range_product(isl.isl_union_pw_multi_aff_copy(arg0.ptr), isl.isl_union_pw_multi_aff_copy(arg1.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def gist(arg0, arg1):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_gist(isl.isl_union_pw_multi_aff_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def intersect_domain(*args):
if len(args) == 2 and args[1].__class__ is space:
ctx = args[0].ctx
res = isl.isl_union_pw_multi_aff_intersect_domain_space(isl.isl_union_pw_multi_aff_copy(args[0].ptr), isl.isl_space_copy(args[1].ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is union_set:
ctx = args[0].ctx
res = isl.isl_union_pw_multi_aff_intersect_domain_union_set(isl.isl_union_pw_multi_aff_copy(args[0].ptr), isl.isl_union_set_copy(args[1].ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def intersect_domain_wrapped_domain(arg0, arg1):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_intersect_domain_wrapped_domain(isl.isl_union_pw_multi_aff_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def intersect_domain_wrapped_range(arg0, arg1):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_intersect_domain_wrapped_range(isl.isl_union_pw_multi_aff_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def intersect_params(arg0, arg1):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_intersect_params(isl.isl_union_pw_multi_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def involves_locals(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_involves_locals(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def isa_pw_multi_aff(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_isa_pw_multi_aff(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def plain_is_empty(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_plain_is_empty(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def preimage_domain_wrapped_domain(*args):
if len(args) == 2 and args[1].__class__ is union_pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_union_pw_multi_aff_preimage_domain_wrapped_domain_union_pw_multi_aff(isl.isl_union_pw_multi_aff_copy(args[0].ptr), isl.isl_union_pw_multi_aff_copy(args[1].ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def pullback(*args):
if len(args) == 2 and args[1].__class__ is union_pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_union_pw_multi_aff_pullback_union_pw_multi_aff(isl.isl_union_pw_multi_aff_copy(args[0].ptr), isl.isl_union_pw_multi_aff_copy(args[1].ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def pw_multi_aff_list(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_get_pw_multi_aff_list(arg0.ptr)
obj = pw_multi_aff_list(ctx=ctx, ptr=res)
return obj
def get_pw_multi_aff_list(arg0):
return arg0.pw_multi_aff_list()
def range_factor_domain(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_range_factor_domain(isl.isl_union_pw_multi_aff_copy(arg0.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def range_factor_range(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_range_factor_range(isl.isl_union_pw_multi_aff_copy(arg0.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def range_product(arg0, arg1):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_multi_aff:
arg1 = union_pw_multi_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_range_product(isl.isl_union_pw_multi_aff_copy(arg0.ptr), isl.isl_union_pw_multi_aff_copy(arg1.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def space(arg0):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
def sub(arg0, arg1):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_multi_aff:
arg1 = union_pw_multi_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_sub(isl.isl_union_pw_multi_aff_copy(arg0.ptr), isl.isl_union_pw_multi_aff_copy(arg1.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def subtract_domain(*args):
if len(args) == 2 and args[1].__class__ is space:
ctx = args[0].ctx
res = isl.isl_union_pw_multi_aff_subtract_domain_space(isl.isl_union_pw_multi_aff_copy(args[0].ptr), isl.isl_space_copy(args[1].ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is union_set:
ctx = args[0].ctx
res = isl.isl_union_pw_multi_aff_subtract_domain_union_set(isl.isl_union_pw_multi_aff_copy(args[0].ptr), isl.isl_union_set_copy(args[1].ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def union_add(arg0, arg1):
try:
if not arg0.__class__ is union_pw_multi_aff:
arg0 = union_pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_multi_aff:
arg1 = union_pw_multi_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_multi_aff_union_add(isl.isl_union_pw_multi_aff_copy(arg0.ptr), isl.isl_union_pw_multi_aff_copy(arg1.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
isl.isl_union_pw_multi_aff_from_multi_aff.restype = c_void_p
isl.isl_union_pw_multi_aff_from_multi_aff.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_from_pw_multi_aff.restype = c_void_p
isl.isl_union_pw_multi_aff_from_pw_multi_aff.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_from_union_pw_aff.restype = c_void_p
isl.isl_union_pw_multi_aff_from_union_pw_aff.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_read_from_str.restype = c_void_p
isl.isl_union_pw_multi_aff_read_from_str.argtypes = [Context, c_char_p]
isl.isl_union_pw_multi_aff_add.restype = c_void_p
isl.isl_union_pw_multi_aff_add.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_apply_union_pw_multi_aff.restype = c_void_p
isl.isl_union_pw_multi_aff_apply_union_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_as_multi_union_pw_aff.restype = c_void_p
isl.isl_union_pw_multi_aff_as_multi_union_pw_aff.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_as_pw_multi_aff.restype = c_void_p
isl.isl_union_pw_multi_aff_as_pw_multi_aff.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_as_union_map.restype = c_void_p
isl.isl_union_pw_multi_aff_as_union_map.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_coalesce.restype = c_void_p
isl.isl_union_pw_multi_aff_coalesce.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_domain.restype = c_void_p
isl.isl_union_pw_multi_aff_domain.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_empty_ctx.restype = c_void_p
isl.isl_union_pw_multi_aff_empty_ctx.argtypes = [Context]
isl.isl_union_pw_multi_aff_extract_pw_multi_aff.restype = c_void_p
isl.isl_union_pw_multi_aff_extract_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_flat_range_product.restype = c_void_p
isl.isl_union_pw_multi_aff_flat_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_gist.restype = c_void_p
isl.isl_union_pw_multi_aff_gist.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_intersect_domain_space.restype = c_void_p
isl.isl_union_pw_multi_aff_intersect_domain_space.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_intersect_domain_union_set.restype = c_void_p
isl.isl_union_pw_multi_aff_intersect_domain_union_set.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_intersect_domain_wrapped_domain.restype = c_void_p
isl.isl_union_pw_multi_aff_intersect_domain_wrapped_domain.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_intersect_domain_wrapped_range.restype = c_void_p
isl.isl_union_pw_multi_aff_intersect_domain_wrapped_range.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_intersect_params.restype = c_void_p
isl.isl_union_pw_multi_aff_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_involves_locals.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_isa_pw_multi_aff.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_plain_is_empty.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_preimage_domain_wrapped_domain_union_pw_multi_aff.restype = c_void_p
isl.isl_union_pw_multi_aff_preimage_domain_wrapped_domain_union_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_pullback_union_pw_multi_aff.restype = c_void_p
isl.isl_union_pw_multi_aff_pullback_union_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_get_pw_multi_aff_list.restype = c_void_p
isl.isl_union_pw_multi_aff_get_pw_multi_aff_list.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_range_factor_domain.restype = c_void_p
isl.isl_union_pw_multi_aff_range_factor_domain.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_range_factor_range.restype = c_void_p
isl.isl_union_pw_multi_aff_range_factor_range.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_range_product.restype = c_void_p
isl.isl_union_pw_multi_aff_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_get_space.restype = c_void_p
isl.isl_union_pw_multi_aff_get_space.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_sub.restype = c_void_p
isl.isl_union_pw_multi_aff_sub.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_subtract_domain_space.restype = c_void_p
isl.isl_union_pw_multi_aff_subtract_domain_space.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_subtract_domain_union_set.restype = c_void_p
isl.isl_union_pw_multi_aff_subtract_domain_union_set.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_union_add.restype = c_void_p
isl.isl_union_pw_multi_aff_union_add.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_multi_aff_copy.restype = c_void_p
isl.isl_union_pw_multi_aff_copy.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_free.restype = c_void_p
isl.isl_union_pw_multi_aff_free.argtypes = [c_void_p]
isl.isl_union_pw_multi_aff_to_str.restype = POINTER(c_char)
isl.isl_union_pw_multi_aff_to_str.argtypes = [c_void_p]
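# Usage sketch (comment only, so it has no effect on import): a minimal,
# hedged example of driving the union_pw_multi_aff wrapper above. It assumes
# this module is importable as `isl` with libisl loaded; the set/map strings
# are illustrative, not part of the generated API.
#
#     import isl
#     # A union of per-statement affine functions, parsed via read_from_str.
#     upma = isl.union_pw_multi_aff("{ S[i] -> [i, 0]; T[j] -> [j, 1] }")
#     # Restrict to the instances of S that actually occur; intersect_domain
#     # dispatches on the argument class (space or union_set).
#     upma = upma.intersect_domain(isl.union_set("{ S[i] : 0 <= i < 10 }"))
#     # View the result as a binary relation.
#     print(upma.as_union_map())
#
# Note the ownership convention visible throughout: methods pass
# isl_*_copy(ptr) for __isl_take arguments, so the Python objects stay valid
# after the call and __del__ can free their pointers exactly once.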
class multi_union_pw_aff(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is multi_pw_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_union_pw_aff_from_multi_pw_aff(isl.isl_multi_pw_aff_copy(args[0].ptr))
return
if len(args) == 1 and args[0].__class__ is union_pw_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_union_pw_aff_from_union_pw_aff(isl.isl_union_pw_aff_copy(args[0].ptr))
return
if len(args) == 2 and args[0].__class__ is space and args[1].__class__ is union_pw_aff_list:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_union_pw_aff_from_union_pw_aff_list(isl.isl_space_copy(args[0].ptr), isl.isl_union_pw_aff_list_copy(args[1].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_union_pw_aff_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_multi_union_pw_aff_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ptr = isl.isl_multi_union_pw_aff_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.multi_union_pw_aff("""%s""")' % s
else:
return 'isl.multi_union_pw_aff("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_union_pw_aff:
arg1 = multi_union_pw_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_add(isl.isl_multi_union_pw_aff_copy(arg0.ptr), isl.isl_multi_union_pw_aff_copy(arg1.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_get_at(arg0.ptr, arg1)
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def bind(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_bind(isl.isl_multi_union_pw_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
def coalesce(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_coalesce(isl.isl_multi_union_pw_aff_copy(arg0.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def domain(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_domain(isl.isl_multi_union_pw_aff_copy(arg0.ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
def flat_range_product(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_union_pw_aff:
arg1 = multi_union_pw_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_flat_range_product(isl.isl_multi_union_pw_aff_copy(arg0.ptr), isl.isl_multi_union_pw_aff_copy(arg1.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def gist(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_gist(isl.isl_multi_union_pw_aff_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def has_range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_has_range_tuple_id(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def intersect_domain(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_intersect_domain(isl.isl_multi_union_pw_aff_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def intersect_params(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_intersect_params(isl.isl_multi_union_pw_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def involves_nan(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_involves_nan(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def list(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_get_list(arg0.ptr)
obj = union_pw_aff_list(ctx=ctx, ptr=res)
return obj
def get_list(arg0):
return arg0.list()
def neg(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_neg(isl.isl_multi_union_pw_aff_copy(arg0.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def plain_is_equal(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_union_pw_aff:
arg1 = multi_union_pw_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_plain_is_equal(arg0.ptr, arg1.ptr)
if res < 0:
raise Error
return bool(res)
def pullback(*args):
if len(args) == 2 and args[1].__class__ is union_pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_multi_union_pw_aff_pullback_union_pw_multi_aff(isl.isl_multi_union_pw_aff_copy(args[0].ptr), isl.isl_union_pw_multi_aff_copy(args[1].ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def range_product(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_union_pw_aff:
arg1 = multi_union_pw_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_range_product(isl.isl_multi_union_pw_aff_copy(arg0.ptr), isl.isl_multi_union_pw_aff_copy(arg1.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_get_range_tuple_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_range_tuple_id(arg0):
return arg0.range_tuple_id()
def reset_range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_reset_range_tuple_id(isl.isl_multi_union_pw_aff_copy(arg0.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def scale(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_union_pw_aff_scale_multi_val(isl.isl_multi_union_pw_aff_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_union_pw_aff_scale_val(isl.isl_multi_union_pw_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def scale_down(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_union_pw_aff_scale_down_multi_val(isl.isl_multi_union_pw_aff_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_union_pw_aff_scale_down_val(isl.isl_multi_union_pw_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def set_at(arg0, arg1, arg2):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg2.__class__ is union_pw_aff:
arg2 = union_pw_aff(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_set_at(isl.isl_multi_union_pw_aff_copy(arg0.ptr), arg1, isl.isl_union_pw_aff_copy(arg2.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def set_range_tuple(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_union_pw_aff_set_range_tuple_id(isl.isl_multi_union_pw_aff_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def size(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_size(arg0.ptr)
if res < 0:
raise Error
return int(res)
def space(arg0):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
def sub(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_union_pw_aff:
arg1 = multi_union_pw_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_sub(isl.isl_multi_union_pw_aff_copy(arg0.ptr), isl.isl_multi_union_pw_aff_copy(arg1.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def union_add(arg0, arg1):
try:
if not arg0.__class__ is multi_union_pw_aff:
arg0 = multi_union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_union_pw_aff:
arg1 = multi_union_pw_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_union_add(isl.isl_multi_union_pw_aff_copy(arg0.ptr), isl.isl_multi_union_pw_aff_copy(arg1.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
@staticmethod
def zero(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_union_pw_aff_zero(isl.isl_space_copy(arg0.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
isl.isl_multi_union_pw_aff_from_multi_pw_aff.restype = c_void_p
isl.isl_multi_union_pw_aff_from_multi_pw_aff.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_from_union_pw_aff.restype = c_void_p
isl.isl_multi_union_pw_aff_from_union_pw_aff.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_from_union_pw_aff_list.restype = c_void_p
isl.isl_multi_union_pw_aff_from_union_pw_aff_list.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_read_from_str.restype = c_void_p
isl.isl_multi_union_pw_aff_read_from_str.argtypes = [Context, c_char_p]
isl.isl_multi_union_pw_aff_add.restype = c_void_p
isl.isl_multi_union_pw_aff_add.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_get_at.restype = c_void_p
isl.isl_multi_union_pw_aff_get_at.argtypes = [c_void_p, c_int]
isl.isl_multi_union_pw_aff_bind.restype = c_void_p
isl.isl_multi_union_pw_aff_bind.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_coalesce.restype = c_void_p
isl.isl_multi_union_pw_aff_coalesce.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_domain.restype = c_void_p
isl.isl_multi_union_pw_aff_domain.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_flat_range_product.restype = c_void_p
isl.isl_multi_union_pw_aff_flat_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_gist.restype = c_void_p
isl.isl_multi_union_pw_aff_gist.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_has_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_intersect_domain.restype = c_void_p
isl.isl_multi_union_pw_aff_intersect_domain.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_intersect_params.restype = c_void_p
isl.isl_multi_union_pw_aff_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_involves_nan.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_get_list.restype = c_void_p
isl.isl_multi_union_pw_aff_get_list.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_neg.restype = c_void_p
isl.isl_multi_union_pw_aff_neg.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_plain_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_pullback_union_pw_multi_aff.restype = c_void_p
isl.isl_multi_union_pw_aff_pullback_union_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_range_product.restype = c_void_p
isl.isl_multi_union_pw_aff_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_get_range_tuple_id.restype = c_void_p
isl.isl_multi_union_pw_aff_get_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_reset_range_tuple_id.restype = c_void_p
isl.isl_multi_union_pw_aff_reset_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_scale_multi_val.restype = c_void_p
isl.isl_multi_union_pw_aff_scale_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_scale_val.restype = c_void_p
isl.isl_multi_union_pw_aff_scale_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_scale_down_multi_val.restype = c_void_p
isl.isl_multi_union_pw_aff_scale_down_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_scale_down_val.restype = c_void_p
isl.isl_multi_union_pw_aff_scale_down_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_set_at.restype = c_void_p
isl.isl_multi_union_pw_aff_set_at.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_multi_union_pw_aff_set_range_tuple_id.restype = c_void_p
isl.isl_multi_union_pw_aff_set_range_tuple_id.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_size.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_get_space.restype = c_void_p
isl.isl_multi_union_pw_aff_get_space.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_sub.restype = c_void_p
isl.isl_multi_union_pw_aff_sub.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_union_add.restype = c_void_p
isl.isl_multi_union_pw_aff_union_add.argtypes = [c_void_p, c_void_p]
isl.isl_multi_union_pw_aff_zero.restype = c_void_p
isl.isl_multi_union_pw_aff_zero.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_copy.restype = c_void_p
isl.isl_multi_union_pw_aff_copy.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_free.restype = c_void_p
isl.isl_multi_union_pw_aff_free.argtypes = [c_void_p]
isl.isl_multi_union_pw_aff_to_str.restype = POINTER(c_char)
isl.isl_multi_union_pw_aff_to_str.argtypes = [c_void_p]
class union_pw_aff(union_pw_multi_aff, multi_union_pw_aff):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_pw_aff_from_aff(isl.isl_aff_copy(args[0].ptr))
return
if len(args) == 1 and args[0].__class__ is pw_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_pw_aff_from_pw_aff(isl.isl_pw_aff_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_pw_aff_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_union_pw_aff_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
ptr = isl.isl_union_pw_aff_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.union_pw_aff("""%s""")' % s
else:
return 'isl.union_pw_aff("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_aff:
arg1 = union_pw_aff(arg1)
except:
return union_pw_multi_aff(arg0).add(arg1)
ctx = arg0.ctx
res = isl.isl_union_pw_aff_add(isl.isl_union_pw_aff_copy(arg0.ptr), isl.isl_union_pw_aff_copy(arg1.ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
def bind(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_union_pw_aff_bind_id(isl.isl_union_pw_aff_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
raise Error
def coalesce(arg0):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_coalesce(isl.isl_union_pw_aff_copy(arg0.ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
def domain(arg0):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_domain(isl.isl_union_pw_aff_copy(arg0.ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
def gist(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
return union_pw_multi_aff(arg0).gist(arg1)
ctx = arg0.ctx
res = isl.isl_union_pw_aff_gist(isl.isl_union_pw_aff_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
def intersect_domain(*args):
if len(args) == 2 and args[1].__class__ is space:
ctx = args[0].ctx
res = isl.isl_union_pw_aff_intersect_domain_space(isl.isl_union_pw_aff_copy(args[0].ptr), isl.isl_space_copy(args[1].ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is union_set:
ctx = args[0].ctx
res = isl.isl_union_pw_aff_intersect_domain_union_set(isl.isl_union_pw_aff_copy(args[0].ptr), isl.isl_union_set_copy(args[1].ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def intersect_domain_wrapped_domain(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
return union_pw_multi_aff(arg0).intersect_domain_wrapped_domain(arg1)
ctx = arg0.ctx
res = isl.isl_union_pw_aff_intersect_domain_wrapped_domain(isl.isl_union_pw_aff_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
def intersect_domain_wrapped_range(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
return union_pw_multi_aff(arg0).intersect_domain_wrapped_range(arg1)
ctx = arg0.ctx
res = isl.isl_union_pw_aff_intersect_domain_wrapped_range(isl.isl_union_pw_aff_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
def intersect_params(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_pw_multi_aff(arg0).intersect_params(arg1)
ctx = arg0.ctx
res = isl.isl_union_pw_aff_intersect_params(isl.isl_union_pw_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
def pullback(*args):
if len(args) == 2 and args[1].__class__ is union_pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_union_pw_aff_pullback_union_pw_multi_aff(isl.isl_union_pw_aff_copy(args[0].ptr), isl.isl_union_pw_multi_aff_copy(args[1].ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def space(arg0):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
def sub(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_aff:
arg1 = union_pw_aff(arg1)
except:
return union_pw_multi_aff(arg0).sub(arg1)
ctx = arg0.ctx
res = isl.isl_union_pw_aff_sub(isl.isl_union_pw_aff_copy(arg0.ptr), isl.isl_union_pw_aff_copy(arg1.ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
def subtract_domain(*args):
if len(args) == 2 and args[1].__class__ is space:
ctx = args[0].ctx
res = isl.isl_union_pw_aff_subtract_domain_space(isl.isl_union_pw_aff_copy(args[0].ptr), isl.isl_space_copy(args[1].ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is union_set:
ctx = args[0].ctx
res = isl.isl_union_pw_aff_subtract_domain_union_set(isl.isl_union_pw_aff_copy(args[0].ptr), isl.isl_union_set_copy(args[1].ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def to_list(arg0):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_to_list(isl.isl_union_pw_aff_copy(arg0.ptr))
obj = union_pw_aff_list(ctx=ctx, ptr=res)
return obj
def union_add(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff:
arg0 = union_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_aff:
arg1 = union_pw_aff(arg1)
except:
return union_pw_multi_aff(arg0).union_add(arg1)
ctx = arg0.ctx
res = isl.isl_union_pw_aff_union_add(isl.isl_union_pw_aff_copy(arg0.ptr), isl.isl_union_pw_aff_copy(arg1.ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
isl.isl_union_pw_aff_from_aff.restype = c_void_p
isl.isl_union_pw_aff_from_aff.argtypes = [c_void_p]
isl.isl_union_pw_aff_from_pw_aff.restype = c_void_p
isl.isl_union_pw_aff_from_pw_aff.argtypes = [c_void_p]
isl.isl_union_pw_aff_read_from_str.restype = c_void_p
isl.isl_union_pw_aff_read_from_str.argtypes = [Context, c_char_p]
isl.isl_union_pw_aff_add.restype = c_void_p
isl.isl_union_pw_aff_add.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_bind_id.restype = c_void_p
isl.isl_union_pw_aff_bind_id.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_coalesce.restype = c_void_p
isl.isl_union_pw_aff_coalesce.argtypes = [c_void_p]
isl.isl_union_pw_aff_domain.restype = c_void_p
isl.isl_union_pw_aff_domain.argtypes = [c_void_p]
isl.isl_union_pw_aff_gist.restype = c_void_p
isl.isl_union_pw_aff_gist.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_intersect_domain_space.restype = c_void_p
isl.isl_union_pw_aff_intersect_domain_space.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_intersect_domain_union_set.restype = c_void_p
isl.isl_union_pw_aff_intersect_domain_union_set.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_intersect_domain_wrapped_domain.restype = c_void_p
isl.isl_union_pw_aff_intersect_domain_wrapped_domain.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_intersect_domain_wrapped_range.restype = c_void_p
isl.isl_union_pw_aff_intersect_domain_wrapped_range.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_intersect_params.restype = c_void_p
isl.isl_union_pw_aff_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_pullback_union_pw_multi_aff.restype = c_void_p
isl.isl_union_pw_aff_pullback_union_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_get_space.restype = c_void_p
isl.isl_union_pw_aff_get_space.argtypes = [c_void_p]
isl.isl_union_pw_aff_sub.restype = c_void_p
isl.isl_union_pw_aff_sub.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_subtract_domain_space.restype = c_void_p
isl.isl_union_pw_aff_subtract_domain_space.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_subtract_domain_union_set.restype = c_void_p
isl.isl_union_pw_aff_subtract_domain_union_set.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_to_list.restype = c_void_p
isl.isl_union_pw_aff_to_list.argtypes = [c_void_p]
isl.isl_union_pw_aff_union_add.restype = c_void_p
isl.isl_union_pw_aff_union_add.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_copy.restype = c_void_p
isl.isl_union_pw_aff_copy.argtypes = [c_void_p]
isl.isl_union_pw_aff_free.restype = c_void_p
isl.isl_union_pw_aff_free.argtypes = [c_void_p]
isl.isl_union_pw_aff_to_str.restype = POINTER(c_char)
isl.isl_union_pw_aff_to_str.argtypes = [c_void_p]
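# Illustrative usage sketch (not part of the generated bindings): the
# union_pw_aff wrapper above can be built from an isl string and combined
# through the methods it exposes. Assuming the usual isl syntax for union
# piecewise affine expressions, something like the following is expected
# to work:
#
#     upa = union_pw_aff("{ A[i] -> [(i)]; B[j] -> [(2j)] }")
#     one = union_pw_aff("{ A[i] -> [(1)]; B[j] -> [(1)] }")
#     print(upa.add(one))     # element-wise sum over the shared domain
#     print(upa.domain())     # union_set where the expression is defined
#
# Each arithmetic method copies the underlying isl pointer before handing
# it to the C library, so the Python-level objects remain valid afterwards.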
class multi_pw_aff(multi_union_pw_aff):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_pw_aff_from_aff(isl.isl_aff_copy(args[0].ptr))
return
if len(args) == 1 and args[0].__class__ is multi_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_pw_aff_from_multi_aff(isl.isl_multi_aff_copy(args[0].ptr))
return
if len(args) == 1 and args[0].__class__ is pw_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_pw_aff_from_pw_aff(isl.isl_pw_aff_copy(args[0].ptr))
return
if len(args) == 2 and args[0].__class__ is space and args[1].__class__ is pw_aff_list:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_pw_aff_from_pw_aff_list(isl.isl_space_copy(args[0].ptr), isl.isl_pw_aff_list_copy(args[1].ptr))
return
if len(args) == 1 and args[0].__class__ is pw_multi_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_pw_aff_from_pw_multi_aff(isl.isl_pw_multi_aff_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_pw_aff_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_multi_pw_aff_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ptr = isl.isl_multi_pw_aff_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.multi_pw_aff("""%s""")' % s
else:
return 'isl.multi_pw_aff("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_pw_aff:
arg1 = multi_pw_aff(arg1)
except:
return multi_union_pw_aff(arg0).add(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_add(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_pw_aff_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def add_constant(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_add_constant_multi_val(isl.isl_multi_pw_aff_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_add_constant_val(isl.isl_multi_pw_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def as_map(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_as_map(isl.isl_multi_pw_aff_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def as_multi_aff(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_as_multi_aff(isl.isl_multi_pw_aff_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def as_set(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_as_set(isl.isl_multi_pw_aff_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_get_at(arg0.ptr, arg1)
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def bind(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return multi_union_pw_aff(arg0).bind(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_bind(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def bind_domain(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return multi_union_pw_aff(arg0).bind_domain(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_bind_domain(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def bind_domain_wrapped_domain(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return multi_union_pw_aff(arg0).bind_domain_wrapped_domain(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_bind_domain_wrapped_domain(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def coalesce(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_coalesce(isl.isl_multi_pw_aff_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def domain(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_domain(isl.isl_multi_pw_aff_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def flat_range_product(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_pw_aff:
arg1 = multi_pw_aff(arg1)
except:
return multi_union_pw_aff(arg0).flat_range_product(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_flat_range_product(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_pw_aff_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def gist(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return multi_union_pw_aff(arg0).gist(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_gist(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def has_range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_has_range_tuple_id(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def identity(*args):
if len(args) == 1:
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_identity_multi_pw_aff(isl.isl_multi_pw_aff_copy(args[0].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
@staticmethod
def identity_on_domain(*args):
if len(args) == 1 and args[0].__class__ is space:
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_identity_on_domain_space(isl.isl_space_copy(args[0].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def insert_domain(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is space:
arg1 = space(arg1)
except:
return multi_union_pw_aff(arg0).insert_domain(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_insert_domain(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_space_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def intersect_domain(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return multi_union_pw_aff(arg0).intersect_domain(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_intersect_domain(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def intersect_params(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return multi_union_pw_aff(arg0).intersect_params(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_intersect_params(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def involves_nan(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_involves_nan(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def involves_param(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_involves_param_id(args[0].ptr, args[1].ptr)
if res < 0:
            raise Error
return bool(res)
if len(args) == 2 and args[1].__class__ is id_list:
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_involves_param_id_list(args[0].ptr, args[1].ptr)
if res < 0:
            raise Error
return bool(res)
raise Error
def isa_multi_aff(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_isa_multi_aff(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def list(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_get_list(arg0.ptr)
obj = pw_aff_list(ctx=ctx, ptr=res)
return obj
def get_list(arg0):
return arg0.list()
def max(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_pw_aff:
arg1 = multi_pw_aff(arg1)
except:
return multi_union_pw_aff(arg0).max(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_max(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_pw_aff_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def max_multi_val(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_max_multi_val(isl.isl_multi_pw_aff_copy(arg0.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def min(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_pw_aff:
arg1 = multi_pw_aff(arg1)
except:
return multi_union_pw_aff(arg0).min(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_min(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_pw_aff_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def min_multi_val(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_min_multi_val(isl.isl_multi_pw_aff_copy(arg0.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def neg(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_neg(isl.isl_multi_pw_aff_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def plain_is_equal(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_pw_aff:
arg1 = multi_pw_aff(arg1)
except:
return multi_union_pw_aff(arg0).plain_is_equal(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_plain_is_equal(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def product(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_pw_aff:
arg1 = multi_pw_aff(arg1)
except:
return multi_union_pw_aff(arg0).product(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_product(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_pw_aff_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def pullback(*args):
if len(args) == 2 and args[1].__class__ is multi_aff:
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_pullback_multi_aff(isl.isl_multi_pw_aff_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_pullback_multi_pw_aff(isl.isl_multi_pw_aff_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_pullback_pw_multi_aff(isl.isl_multi_pw_aff_copy(args[0].ptr), isl.isl_pw_multi_aff_copy(args[1].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def range_product(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_pw_aff:
arg1 = multi_pw_aff(arg1)
except:
return multi_union_pw_aff(arg0).range_product(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_range_product(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_pw_aff_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_get_range_tuple_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_range_tuple_id(arg0):
return arg0.range_tuple_id()
def reset_range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_reset_range_tuple_id(isl.isl_multi_pw_aff_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def scale(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_scale_multi_val(isl.isl_multi_pw_aff_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_scale_val(isl.isl_multi_pw_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def scale_down(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_scale_down_multi_val(isl.isl_multi_pw_aff_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_scale_down_val(isl.isl_multi_pw_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def set_at(arg0, arg1, arg2):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg2.__class__ is pw_aff:
arg2 = pw_aff(arg2)
except:
return multi_union_pw_aff(arg0).set_at(arg1, arg2)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_set_at(isl.isl_multi_pw_aff_copy(arg0.ptr), arg1, isl.isl_pw_aff_copy(arg2.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def set_range_tuple(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_pw_aff_set_range_tuple_id(isl.isl_multi_pw_aff_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def size(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_size(arg0.ptr)
if res < 0:
            raise Error
return int(res)
def space(arg0):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
def sub(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_pw_aff:
arg1 = multi_pw_aff(arg1)
except:
return multi_union_pw_aff(arg0).sub(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_sub(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_pw_aff_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def unbind_params_insert_domain(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return multi_union_pw_aff(arg0).unbind_params_insert_domain(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_unbind_params_insert_domain(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def union_add(arg0, arg1):
try:
if not arg0.__class__ is multi_pw_aff:
arg0 = multi_pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_pw_aff:
arg1 = multi_pw_aff(arg1)
except:
return multi_union_pw_aff(arg0).union_add(arg1)
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_union_add(isl.isl_multi_pw_aff_copy(arg0.ptr), isl.isl_multi_pw_aff_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
@staticmethod
def zero(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_pw_aff_zero(isl.isl_space_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
isl.isl_multi_pw_aff_from_aff.restype = c_void_p
isl.isl_multi_pw_aff_from_aff.argtypes = [c_void_p]
isl.isl_multi_pw_aff_from_multi_aff.restype = c_void_p
isl.isl_multi_pw_aff_from_multi_aff.argtypes = [c_void_p]
isl.isl_multi_pw_aff_from_pw_aff.restype = c_void_p
isl.isl_multi_pw_aff_from_pw_aff.argtypes = [c_void_p]
isl.isl_multi_pw_aff_from_pw_aff_list.restype = c_void_p
isl.isl_multi_pw_aff_from_pw_aff_list.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_from_pw_multi_aff.restype = c_void_p
isl.isl_multi_pw_aff_from_pw_multi_aff.argtypes = [c_void_p]
isl.isl_multi_pw_aff_read_from_str.restype = c_void_p
isl.isl_multi_pw_aff_read_from_str.argtypes = [Context, c_char_p]
isl.isl_multi_pw_aff_add.restype = c_void_p
isl.isl_multi_pw_aff_add.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_add_constant_multi_val.restype = c_void_p
isl.isl_multi_pw_aff_add_constant_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_add_constant_val.restype = c_void_p
isl.isl_multi_pw_aff_add_constant_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_as_map.restype = c_void_p
isl.isl_multi_pw_aff_as_map.argtypes = [c_void_p]
isl.isl_multi_pw_aff_as_multi_aff.restype = c_void_p
isl.isl_multi_pw_aff_as_multi_aff.argtypes = [c_void_p]
isl.isl_multi_pw_aff_as_set.restype = c_void_p
isl.isl_multi_pw_aff_as_set.argtypes = [c_void_p]
isl.isl_multi_pw_aff_get_at.restype = c_void_p
isl.isl_multi_pw_aff_get_at.argtypes = [c_void_p, c_int]
isl.isl_multi_pw_aff_bind.restype = c_void_p
isl.isl_multi_pw_aff_bind.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_bind_domain.restype = c_void_p
isl.isl_multi_pw_aff_bind_domain.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_bind_domain_wrapped_domain.restype = c_void_p
isl.isl_multi_pw_aff_bind_domain_wrapped_domain.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_coalesce.restype = c_void_p
isl.isl_multi_pw_aff_coalesce.argtypes = [c_void_p]
isl.isl_multi_pw_aff_domain.restype = c_void_p
isl.isl_multi_pw_aff_domain.argtypes = [c_void_p]
isl.isl_multi_pw_aff_flat_range_product.restype = c_void_p
isl.isl_multi_pw_aff_flat_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_gist.restype = c_void_p
isl.isl_multi_pw_aff_gist.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_has_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_pw_aff_identity_multi_pw_aff.restype = c_void_p
isl.isl_multi_pw_aff_identity_multi_pw_aff.argtypes = [c_void_p]
isl.isl_multi_pw_aff_identity_on_domain_space.restype = c_void_p
isl.isl_multi_pw_aff_identity_on_domain_space.argtypes = [c_void_p]
isl.isl_multi_pw_aff_insert_domain.restype = c_void_p
isl.isl_multi_pw_aff_insert_domain.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_intersect_domain.restype = c_void_p
isl.isl_multi_pw_aff_intersect_domain.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_intersect_params.restype = c_void_p
isl.isl_multi_pw_aff_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_involves_nan.argtypes = [c_void_p]
isl.isl_multi_pw_aff_involves_param_id.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_involves_param_id_list.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_isa_multi_aff.argtypes = [c_void_p]
isl.isl_multi_pw_aff_get_list.restype = c_void_p
isl.isl_multi_pw_aff_get_list.argtypes = [c_void_p]
isl.isl_multi_pw_aff_max.restype = c_void_p
isl.isl_multi_pw_aff_max.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_max_multi_val.restype = c_void_p
isl.isl_multi_pw_aff_max_multi_val.argtypes = [c_void_p]
isl.isl_multi_pw_aff_min.restype = c_void_p
isl.isl_multi_pw_aff_min.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_min_multi_val.restype = c_void_p
isl.isl_multi_pw_aff_min_multi_val.argtypes = [c_void_p]
isl.isl_multi_pw_aff_neg.restype = c_void_p
isl.isl_multi_pw_aff_neg.argtypes = [c_void_p]
isl.isl_multi_pw_aff_plain_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_product.restype = c_void_p
isl.isl_multi_pw_aff_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_pullback_multi_aff.restype = c_void_p
isl.isl_multi_pw_aff_pullback_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_pullback_multi_pw_aff.restype = c_void_p
isl.isl_multi_pw_aff_pullback_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_pullback_pw_multi_aff.restype = c_void_p
isl.isl_multi_pw_aff_pullback_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_range_product.restype = c_void_p
isl.isl_multi_pw_aff_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_get_range_tuple_id.restype = c_void_p
isl.isl_multi_pw_aff_get_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_pw_aff_reset_range_tuple_id.restype = c_void_p
isl.isl_multi_pw_aff_reset_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_pw_aff_scale_multi_val.restype = c_void_p
isl.isl_multi_pw_aff_scale_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_scale_val.restype = c_void_p
isl.isl_multi_pw_aff_scale_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_scale_down_multi_val.restype = c_void_p
isl.isl_multi_pw_aff_scale_down_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_scale_down_val.restype = c_void_p
isl.isl_multi_pw_aff_scale_down_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_set_at.restype = c_void_p
isl.isl_multi_pw_aff_set_at.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_multi_pw_aff_set_range_tuple_id.restype = c_void_p
isl.isl_multi_pw_aff_set_range_tuple_id.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_size.argtypes = [c_void_p]
isl.isl_multi_pw_aff_get_space.restype = c_void_p
isl.isl_multi_pw_aff_get_space.argtypes = [c_void_p]
isl.isl_multi_pw_aff_sub.restype = c_void_p
isl.isl_multi_pw_aff_sub.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_unbind_params_insert_domain.restype = c_void_p
isl.isl_multi_pw_aff_unbind_params_insert_domain.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_union_add.restype = c_void_p
isl.isl_multi_pw_aff_union_add.argtypes = [c_void_p, c_void_p]
isl.isl_multi_pw_aff_zero.restype = c_void_p
isl.isl_multi_pw_aff_zero.argtypes = [c_void_p]
isl.isl_multi_pw_aff_copy.restype = c_void_p
isl.isl_multi_pw_aff_copy.argtypes = [c_void_p]
isl.isl_multi_pw_aff_free.restype = c_void_p
isl.isl_multi_pw_aff_free.argtypes = [c_void_p]
isl.isl_multi_pw_aff_to_str.restype = POINTER(c_char)
isl.isl_multi_pw_aff_to_str.argtypes = [c_void_p]
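# Illustrative usage sketch (not part of the generated bindings): a
# multi_pw_aff is a fixed-length tuple of piecewise affine expressions.
# Assuming the usual isl string syntax, something along these lines is
# expected to work:
#
#     mpa = multi_pw_aff("{ [i] -> [(i), (2i)] }")
#     print(mpa.size())       # number of tuple elements
#     print(mpa.at(0))        # first element as a pw_aff
#     print(mpa.add(mpa))     # element-wise addition
#
# Conversion helpers such as as_map() and as_multi_aff() bridge to the
# corresponding map/multi_aff representations when those exist.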
class pw_multi_aff(union_pw_multi_aff, multi_pw_aff):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is multi_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_multi_aff_from_multi_aff(isl.isl_multi_aff_copy(args[0].ptr))
return
if len(args) == 1 and args[0].__class__ is pw_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_multi_aff_from_pw_aff(isl.isl_pw_aff_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_multi_aff_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_pw_multi_aff_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ptr = isl.isl_pw_multi_aff_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.pw_multi_aff("""%s""")' % s
else:
return 'isl.pw_multi_aff("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_multi_aff:
arg1 = pw_multi_aff(arg1)
except:
return union_pw_multi_aff(arg0).add(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_add(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_pw_multi_aff_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def add_constant(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_pw_multi_aff_add_constant_multi_val(isl.isl_pw_multi_aff_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_pw_multi_aff_add_constant_val(isl.isl_pw_multi_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def as_map(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_as_map(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def as_multi_aff(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_as_multi_aff(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def as_set(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_as_set(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_get_at(arg0.ptr, arg1)
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def bind_domain(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return union_pw_multi_aff(arg0).bind_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_bind_domain(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def bind_domain_wrapped_domain(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return union_pw_multi_aff(arg0).bind_domain_wrapped_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_bind_domain_wrapped_domain(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def coalesce(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_coalesce(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def domain(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_domain(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
@staticmethod
def domain_map(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_domain_map(isl.isl_space_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def flat_range_product(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_multi_aff:
arg1 = pw_multi_aff(arg1)
except:
return union_pw_multi_aff(arg0).flat_range_product(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_flat_range_product(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_pw_multi_aff_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def foreach_piece(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1, cb_arg2):
cb_arg0 = set(ctx=arg0.ctx, ptr=(cb_arg0))
cb_arg1 = multi_aff(ctx=arg0.ctx, ptr=(cb_arg1))
try:
arg1(cb_arg0, cb_arg1)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_foreach_piece(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
raise Error
def gist(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_pw_multi_aff(arg0).gist(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_gist(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def has_range_tuple_id(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_has_range_tuple_id(arg0.ptr)
if res < 0:
raise Error
return bool(res)
@staticmethod
def identity_on_domain(*args):
if len(args) == 1 and args[0].__class__ is space:
ctx = args[0].ctx
res = isl.isl_pw_multi_aff_identity_on_domain_space(isl.isl_space_copy(args[0].ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def insert_domain(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is space:
arg1 = space(arg1)
except:
return union_pw_multi_aff(arg0).insert_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_insert_domain(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_space_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def intersect_domain(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_pw_multi_aff(arg0).intersect_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_intersect_domain(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def intersect_params(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_pw_multi_aff(arg0).intersect_params(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_intersect_params(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def involves_locals(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_involves_locals(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def isa_multi_aff(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_isa_multi_aff(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def max_multi_val(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_max_multi_val(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def min_multi_val(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_min_multi_val(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
@staticmethod
def multi_val_on_domain(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
return union_pw_multi_aff(arg0).multi_val_on_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_multi_val_on_domain(isl.isl_set_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def n_piece(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_n_piece(arg0.ptr)
if res < 0:
raise Error
return int(res)
def preimage_domain_wrapped_domain(*args):
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_pw_multi_aff_preimage_domain_wrapped_domain_pw_multi_aff(isl.isl_pw_multi_aff_copy(args[0].ptr), isl.isl_pw_multi_aff_copy(args[1].ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def product(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_multi_aff:
arg1 = pw_multi_aff(arg1)
except:
return union_pw_multi_aff(arg0).product(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_product(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_pw_multi_aff_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def pullback(*args):
if len(args) == 2 and args[1].__class__ is multi_aff:
ctx = args[0].ctx
res = isl.isl_pw_multi_aff_pullback_multi_aff(isl.isl_pw_multi_aff_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_pw_multi_aff_pullback_pw_multi_aff(isl.isl_pw_multi_aff_copy(args[0].ptr), isl.isl_pw_multi_aff_copy(args[1].ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def range_factor_domain(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_range_factor_domain(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def range_factor_range(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_range_factor_range(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
@staticmethod
def range_map(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_range_map(isl.isl_space_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def range_product(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_multi_aff:
arg1 = pw_multi_aff(arg1)
except:
return union_pw_multi_aff(arg0).range_product(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_range_product(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_pw_multi_aff_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def range_tuple_id(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_get_range_tuple_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_range_tuple_id(arg0):
return arg0.range_tuple_id()
def scale(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_pw_multi_aff_scale_val(isl.isl_pw_multi_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def scale_down(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_pw_multi_aff_scale_down_val(isl.isl_pw_multi_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def set_range_tuple(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_pw_multi_aff_set_range_tuple_id(isl.isl_pw_multi_aff_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def space(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
def sub(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_multi_aff:
arg1 = pw_multi_aff(arg1)
except:
return union_pw_multi_aff(arg0).sub(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_sub(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_pw_multi_aff_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def subtract_domain(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_pw_multi_aff(arg0).subtract_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_subtract_domain(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def to_list(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_to_list(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = pw_multi_aff_list(ctx=ctx, ptr=res)
return obj
def to_multi_pw_aff(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_to_multi_pw_aff(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def to_union_pw_multi_aff(arg0):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_to_union_pw_multi_aff(isl.isl_pw_multi_aff_copy(arg0.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def union_add(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff:
arg0 = pw_multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_multi_aff:
arg1 = pw_multi_aff(arg1)
except:
return union_pw_multi_aff(arg0).union_add(arg1)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_union_add(isl.isl_pw_multi_aff_copy(arg0.ptr), isl.isl_pw_multi_aff_copy(arg1.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
@staticmethod
def zero(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_zero(isl.isl_space_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
isl.isl_pw_multi_aff_from_multi_aff.restype = c_void_p
isl.isl_pw_multi_aff_from_multi_aff.argtypes = [c_void_p]
isl.isl_pw_multi_aff_from_pw_aff.restype = c_void_p
isl.isl_pw_multi_aff_from_pw_aff.argtypes = [c_void_p]
isl.isl_pw_multi_aff_read_from_str.restype = c_void_p
isl.isl_pw_multi_aff_read_from_str.argtypes = [Context, c_char_p]
isl.isl_pw_multi_aff_add.restype = c_void_p
isl.isl_pw_multi_aff_add.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_add_constant_multi_val.restype = c_void_p
isl.isl_pw_multi_aff_add_constant_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_add_constant_val.restype = c_void_p
isl.isl_pw_multi_aff_add_constant_val.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_as_map.restype = c_void_p
isl.isl_pw_multi_aff_as_map.argtypes = [c_void_p]
isl.isl_pw_multi_aff_as_multi_aff.restype = c_void_p
isl.isl_pw_multi_aff_as_multi_aff.argtypes = [c_void_p]
isl.isl_pw_multi_aff_as_set.restype = c_void_p
isl.isl_pw_multi_aff_as_set.argtypes = [c_void_p]
isl.isl_pw_multi_aff_get_at.restype = c_void_p
isl.isl_pw_multi_aff_get_at.argtypes = [c_void_p, c_int]
isl.isl_pw_multi_aff_bind_domain.restype = c_void_p
isl.isl_pw_multi_aff_bind_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_bind_domain_wrapped_domain.restype = c_void_p
isl.isl_pw_multi_aff_bind_domain_wrapped_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_coalesce.restype = c_void_p
isl.isl_pw_multi_aff_coalesce.argtypes = [c_void_p]
isl.isl_pw_multi_aff_domain.restype = c_void_p
isl.isl_pw_multi_aff_domain.argtypes = [c_void_p]
isl.isl_pw_multi_aff_domain_map.restype = c_void_p
isl.isl_pw_multi_aff_domain_map.argtypes = [c_void_p]
isl.isl_pw_multi_aff_flat_range_product.restype = c_void_p
isl.isl_pw_multi_aff_flat_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_foreach_piece.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_pw_multi_aff_gist.restype = c_void_p
isl.isl_pw_multi_aff_gist.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_has_range_tuple_id.argtypes = [c_void_p]
isl.isl_pw_multi_aff_identity_on_domain_space.restype = c_void_p
isl.isl_pw_multi_aff_identity_on_domain_space.argtypes = [c_void_p]
isl.isl_pw_multi_aff_insert_domain.restype = c_void_p
isl.isl_pw_multi_aff_insert_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_intersect_domain.restype = c_void_p
isl.isl_pw_multi_aff_intersect_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_intersect_params.restype = c_void_p
isl.isl_pw_multi_aff_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_involves_locals.argtypes = [c_void_p]
isl.isl_pw_multi_aff_isa_multi_aff.argtypes = [c_void_p]
isl.isl_pw_multi_aff_max_multi_val.restype = c_void_p
isl.isl_pw_multi_aff_max_multi_val.argtypes = [c_void_p]
isl.isl_pw_multi_aff_min_multi_val.restype = c_void_p
isl.isl_pw_multi_aff_min_multi_val.argtypes = [c_void_p]
isl.isl_pw_multi_aff_multi_val_on_domain.restype = c_void_p
isl.isl_pw_multi_aff_multi_val_on_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_n_piece.argtypes = [c_void_p]
isl.isl_pw_multi_aff_preimage_domain_wrapped_domain_pw_multi_aff.restype = c_void_p
isl.isl_pw_multi_aff_preimage_domain_wrapped_domain_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_product.restype = c_void_p
isl.isl_pw_multi_aff_product.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_pullback_multi_aff.restype = c_void_p
isl.isl_pw_multi_aff_pullback_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_pullback_pw_multi_aff.restype = c_void_p
isl.isl_pw_multi_aff_pullback_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_range_factor_domain.restype = c_void_p
isl.isl_pw_multi_aff_range_factor_domain.argtypes = [c_void_p]
isl.isl_pw_multi_aff_range_factor_range.restype = c_void_p
isl.isl_pw_multi_aff_range_factor_range.argtypes = [c_void_p]
isl.isl_pw_multi_aff_range_map.restype = c_void_p
isl.isl_pw_multi_aff_range_map.argtypes = [c_void_p]
isl.isl_pw_multi_aff_range_product.restype = c_void_p
isl.isl_pw_multi_aff_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_get_range_tuple_id.restype = c_void_p
isl.isl_pw_multi_aff_get_range_tuple_id.argtypes = [c_void_p]
isl.isl_pw_multi_aff_scale_val.restype = c_void_p
isl.isl_pw_multi_aff_scale_val.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_scale_down_val.restype = c_void_p
isl.isl_pw_multi_aff_scale_down_val.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_set_range_tuple_id.restype = c_void_p
isl.isl_pw_multi_aff_set_range_tuple_id.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_get_space.restype = c_void_p
isl.isl_pw_multi_aff_get_space.argtypes = [c_void_p]
isl.isl_pw_multi_aff_sub.restype = c_void_p
isl.isl_pw_multi_aff_sub.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_subtract_domain.restype = c_void_p
isl.isl_pw_multi_aff_subtract_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_to_list.restype = c_void_p
isl.isl_pw_multi_aff_to_list.argtypes = [c_void_p]
isl.isl_pw_multi_aff_to_multi_pw_aff.restype = c_void_p
isl.isl_pw_multi_aff_to_multi_pw_aff.argtypes = [c_void_p]
isl.isl_pw_multi_aff_to_union_pw_multi_aff.restype = c_void_p
isl.isl_pw_multi_aff_to_union_pw_multi_aff.argtypes = [c_void_p]
isl.isl_pw_multi_aff_union_add.restype = c_void_p
isl.isl_pw_multi_aff_union_add.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_zero.restype = c_void_p
isl.isl_pw_multi_aff_zero.argtypes = [c_void_p]
isl.isl_pw_multi_aff_copy.restype = c_void_p
isl.isl_pw_multi_aff_copy.argtypes = [c_void_p]
isl.isl_pw_multi_aff_free.restype = c_void_p
isl.isl_pw_multi_aff_free.argtypes = [c_void_p]
isl.isl_pw_multi_aff_to_str.restype = POINTER(c_char)
isl.isl_pw_multi_aff_to_str.argtypes = [c_void_p]
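# Usage sketch (illustrative only; assumes the isl shared library loaded
# elsewhere in this module and a default Context): a pw_multi_aff can be
# built from its string form and inspected through the methods above.
#
#   pma = pw_multi_aff("{ [i] -> [i, 2*i] : 0 <= i <= 10 }")
#   pma.n_piece()    # number of pieces in the piecewise expression
#   pma.domain()     # the set over which the expression is defined
#   pma.as_map()     # the same mapping, viewed as an isl map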
class pw_aff(union_pw_aff, pw_multi_aff, multi_pw_aff):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_aff_from_aff(isl.isl_aff_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_aff_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_pw_aff_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ptr = isl.isl_pw_aff_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.pw_aff("""%s""")' % s
else:
return 'isl.pw_aff("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).add(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_add(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def add_constant(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_pw_aff_add_constant_val(isl.isl_pw_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def as_aff(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_as_aff(isl.isl_pw_aff_copy(arg0.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
def as_map(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_as_map(isl.isl_pw_aff_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def bind(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_pw_aff_bind_id(isl.isl_pw_aff_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = set(ctx=ctx, ptr=res)
return obj
raise Error
def bind_domain(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return union_pw_aff(arg0).bind_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_bind_domain(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def bind_domain_wrapped_domain(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return union_pw_aff(arg0).bind_domain_wrapped_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_bind_domain_wrapped_domain(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def ceil(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_ceil(isl.isl_pw_aff_copy(arg0.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def coalesce(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_coalesce(isl.isl_pw_aff_copy(arg0.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def cond(arg0, arg1, arg2):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).cond(arg1, arg2)
try:
if not arg2.__class__ is pw_aff:
arg2 = pw_aff(arg2)
except:
return union_pw_aff(arg0).cond(arg1, arg2)
ctx = arg0.ctx
res = isl.isl_pw_aff_cond(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr), isl.isl_pw_aff_copy(arg2.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def div(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).div(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_div(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def domain(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_domain(isl.isl_pw_aff_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def eq_set(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).eq_set(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_eq_set(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def eval(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is point:
arg1 = point(arg1)
except:
return union_pw_aff(arg0).eval(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_eval(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_point_copy(arg1.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def floor(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_floor(isl.isl_pw_aff_copy(arg0.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def ge_set(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).ge_set(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_ge_set(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def gist(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_pw_aff(arg0).gist(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_gist(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def gt_set(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).gt_set(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_gt_set(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def insert_domain(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is space:
arg1 = space(arg1)
except:
return union_pw_aff(arg0).insert_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_insert_domain(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_space_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def intersect_domain(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_pw_aff(arg0).intersect_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_intersect_domain(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def intersect_params(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_pw_aff(arg0).intersect_params(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_intersect_params(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def isa_aff(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_isa_aff(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def le_set(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).le_set(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_le_set(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def lt_set(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).lt_set(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_lt_set(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def max(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).max(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_max(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def min(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).min(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_min(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def mod(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_pw_aff_mod_val(isl.isl_pw_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def mul(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).mul(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_mul(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def ne_set(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).ne_set(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_ne_set(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def neg(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_neg(isl.isl_pw_aff_copy(arg0.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
@staticmethod
def param_on_domain(*args):
if len(args) == 2 and args[0].__class__ is set and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_pw_aff_param_on_domain_id(isl.isl_set_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def pullback(*args):
if len(args) == 2 and args[1].__class__ is multi_aff:
ctx = args[0].ctx
res = isl.isl_pw_aff_pullback_multi_aff(isl.isl_pw_aff_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_pw_aff_pullback_multi_pw_aff(isl.isl_pw_aff_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_pw_aff_pullback_pw_multi_aff(isl.isl_pw_aff_copy(args[0].ptr), isl.isl_pw_multi_aff_copy(args[1].ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def scale(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_pw_aff_scale_val(isl.isl_pw_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def scale_down(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_pw_aff_scale_down_val(isl.isl_pw_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
raise Error
def sub(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).sub(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_sub(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def subtract_domain(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_pw_aff(arg0).subtract_domain(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_subtract_domain(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def tdiv_q(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).tdiv_q(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_tdiv_q(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def tdiv_r(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).tdiv_r(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_tdiv_r(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def to_list(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_to_list(isl.isl_pw_aff_copy(arg0.ptr))
obj = pw_aff_list(ctx=ctx, ptr=res)
return obj
def to_union_pw_aff(arg0):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_to_union_pw_aff(isl.isl_pw_aff_copy(arg0.ptr))
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
def union_add(arg0, arg1):
try:
if not arg0.__class__ is pw_aff:
arg0 = pw_aff(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
return union_pw_aff(arg0).union_add(arg1)
ctx = arg0.ctx
res = isl.isl_pw_aff_union_add(isl.isl_pw_aff_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
isl.isl_pw_aff_from_aff.restype = c_void_p
isl.isl_pw_aff_from_aff.argtypes = [c_void_p]
isl.isl_pw_aff_read_from_str.restype = c_void_p
isl.isl_pw_aff_read_from_str.argtypes = [Context, c_char_p]
isl.isl_pw_aff_add.restype = c_void_p
isl.isl_pw_aff_add.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_add_constant_val.restype = c_void_p
isl.isl_pw_aff_add_constant_val.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_as_aff.restype = c_void_p
isl.isl_pw_aff_as_aff.argtypes = [c_void_p]
isl.isl_pw_aff_as_map.restype = c_void_p
isl.isl_pw_aff_as_map.argtypes = [c_void_p]
isl.isl_pw_aff_bind_id.restype = c_void_p
isl.isl_pw_aff_bind_id.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_bind_domain.restype = c_void_p
isl.isl_pw_aff_bind_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_bind_domain_wrapped_domain.restype = c_void_p
isl.isl_pw_aff_bind_domain_wrapped_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_ceil.restype = c_void_p
isl.isl_pw_aff_ceil.argtypes = [c_void_p]
isl.isl_pw_aff_coalesce.restype = c_void_p
isl.isl_pw_aff_coalesce.argtypes = [c_void_p]
isl.isl_pw_aff_cond.restype = c_void_p
isl.isl_pw_aff_cond.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_pw_aff_div.restype = c_void_p
isl.isl_pw_aff_div.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_domain.restype = c_void_p
isl.isl_pw_aff_domain.argtypes = [c_void_p]
isl.isl_pw_aff_eq_set.restype = c_void_p
isl.isl_pw_aff_eq_set.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_eval.restype = c_void_p
isl.isl_pw_aff_eval.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_floor.restype = c_void_p
isl.isl_pw_aff_floor.argtypes = [c_void_p]
isl.isl_pw_aff_ge_set.restype = c_void_p
isl.isl_pw_aff_ge_set.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_gist.restype = c_void_p
isl.isl_pw_aff_gist.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_gt_set.restype = c_void_p
isl.isl_pw_aff_gt_set.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_insert_domain.restype = c_void_p
isl.isl_pw_aff_insert_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_intersect_domain.restype = c_void_p
isl.isl_pw_aff_intersect_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_intersect_params.restype = c_void_p
isl.isl_pw_aff_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_isa_aff.argtypes = [c_void_p]
isl.isl_pw_aff_le_set.restype = c_void_p
isl.isl_pw_aff_le_set.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_lt_set.restype = c_void_p
isl.isl_pw_aff_lt_set.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_max.restype = c_void_p
isl.isl_pw_aff_max.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_min.restype = c_void_p
isl.isl_pw_aff_min.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_mod_val.restype = c_void_p
isl.isl_pw_aff_mod_val.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_mul.restype = c_void_p
isl.isl_pw_aff_mul.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_ne_set.restype = c_void_p
isl.isl_pw_aff_ne_set.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_neg.restype = c_void_p
isl.isl_pw_aff_neg.argtypes = [c_void_p]
isl.isl_pw_aff_param_on_domain_id.restype = c_void_p
isl.isl_pw_aff_param_on_domain_id.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_pullback_multi_aff.restype = c_void_p
isl.isl_pw_aff_pullback_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_pullback_multi_pw_aff.restype = c_void_p
isl.isl_pw_aff_pullback_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_pullback_pw_multi_aff.restype = c_void_p
isl.isl_pw_aff_pullback_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_scale_val.restype = c_void_p
isl.isl_pw_aff_scale_val.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_scale_down_val.restype = c_void_p
isl.isl_pw_aff_scale_down_val.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_sub.restype = c_void_p
isl.isl_pw_aff_sub.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_subtract_domain.restype = c_void_p
isl.isl_pw_aff_subtract_domain.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_tdiv_q.restype = c_void_p
isl.isl_pw_aff_tdiv_q.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_tdiv_r.restype = c_void_p
isl.isl_pw_aff_tdiv_r.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_to_list.restype = c_void_p
isl.isl_pw_aff_to_list.argtypes = [c_void_p]
isl.isl_pw_aff_to_union_pw_aff.restype = c_void_p
isl.isl_pw_aff_to_union_pw_aff.argtypes = [c_void_p]
isl.isl_pw_aff_union_add.restype = c_void_p
isl.isl_pw_aff_union_add.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_copy.restype = c_void_p
isl.isl_pw_aff_copy.argtypes = [c_void_p]
isl.isl_pw_aff_free.restype = c_void_p
isl.isl_pw_aff_free.argtypes = [c_void_p]
isl.isl_pw_aff_to_str.restype = POINTER(c_char)
isl.isl_pw_aff_to_str.argtypes = [c_void_p]
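The pw_aff methods above all follow one generated pattern: try to coerce the second argument to the method's own class, and on failure delegate to the more general type (e.g. `sub` falls back to `union_pw_aff(arg0).sub(arg1)`). A minimal pure-Python sketch of that dispatch pattern, with hypothetical stand-in classes (`PwAff` and `UnionPwAff` here are illustrative, not the real bindings):

```python
# Sketch of the coercion-and-fallback dispatch used by the generated methods.
# The class names are hypothetical stand-ins, not the real isl wrappers.

class UnionPwAff:
    def __init__(self, value):
        # Accept either a raw number or another wrapper's value.
        self.value = getattr(value, "value", value)

    def sub(self, other):
        other = other if isinstance(other, UnionPwAff) else UnionPwAff(other)
        return UnionPwAff(self.value - other.value)

class PwAff:
    def __init__(self, value):
        if not isinstance(value, (int, float)):
            raise TypeError("cannot coerce to PwAff")
        self.value = value

    def sub(self, other):
        # First try to coerce the argument to this class ...
        try:
            if not isinstance(other, PwAff):
                other = PwAff(other)
        except TypeError:
            # ... and on failure delegate to the more general type,
            # mirroring `return union_pw_aff(arg0).sub(arg1)` above.
            return UnionPwAff(self).sub(other)
        return PwAff(self.value - other.value)

a = PwAff(10)
print(a.sub(3).value)               # same-type path -> 7
print(a.sub(UnionPwAff(4)).value)   # fallback path  -> 6
```

The fallback keeps the binary operations closed over the type hierarchy: mixing a specific and a general operand silently promotes to the general class instead of failing.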
class multi_aff(pw_multi_aff, multi_pw_aff):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_aff_from_aff(isl.isl_aff_copy(args[0].ptr))
return
if len(args) == 2 and args[0].__class__ is space and args[1].__class__ is aff_list:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_aff_from_aff_list(isl.isl_space_copy(args[0].ptr), isl.isl_aff_list_copy(args[1].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_aff_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_multi_aff_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ptr = isl.isl_multi_aff_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.multi_aff("""%s""")' % s
else:
return 'isl.multi_aff("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_aff:
arg1 = multi_aff(arg1)
except:
return pw_multi_aff(arg0).add(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_add(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_multi_aff_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def add_constant(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_aff_add_constant_multi_val(isl.isl_multi_aff_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_aff_add_constant_val(isl.isl_multi_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def as_map(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_as_map(isl.isl_multi_aff_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def as_set(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_as_set(isl.isl_multi_aff_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_get_at(arg0.ptr, arg1)
obj = aff(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def bind(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return pw_multi_aff(arg0).bind(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_bind(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def bind_domain(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return pw_multi_aff(arg0).bind_domain(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_bind_domain(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def bind_domain_wrapped_domain(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return pw_multi_aff(arg0).bind_domain_wrapped_domain(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_bind_domain_wrapped_domain(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def constant_multi_val(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_get_constant_multi_val(arg0.ptr)
obj = multi_val(ctx=ctx, ptr=res)
return obj
def get_constant_multi_val(arg0):
return arg0.constant_multi_val()
@staticmethod
def domain_map(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_domain_map(isl.isl_space_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def flat_range_product(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_aff:
arg1 = multi_aff(arg1)
except:
return pw_multi_aff(arg0).flat_range_product(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_flat_range_product(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_multi_aff_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def floor(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_floor(isl.isl_multi_aff_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def gist(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return pw_multi_aff(arg0).gist(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_gist(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def has_range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_has_range_tuple_id(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def identity(*args):
if len(args) == 1:
ctx = args[0].ctx
res = isl.isl_multi_aff_identity_multi_aff(isl.isl_multi_aff_copy(args[0].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
@staticmethod
def identity_on_domain(*args):
if len(args) == 1 and args[0].__class__ is space:
ctx = args[0].ctx
res = isl.isl_multi_aff_identity_on_domain_space(isl.isl_space_copy(args[0].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def insert_domain(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is space:
arg1 = space(arg1)
except:
return pw_multi_aff(arg0).insert_domain(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_insert_domain(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_space_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def involves_locals(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_involves_locals(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def involves_nan(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_involves_nan(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def list(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_get_list(arg0.ptr)
obj = aff_list(ctx=ctx, ptr=res)
return obj
def get_list(arg0):
return arg0.list()
@staticmethod
def multi_val_on_domain(*args):
if len(args) == 2 and args[0].__class__ is space and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_aff_multi_val_on_domain_space(isl.isl_space_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def neg(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_neg(isl.isl_multi_aff_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def plain_is_equal(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_aff:
arg1 = multi_aff(arg1)
except:
return pw_multi_aff(arg0).plain_is_equal(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_plain_is_equal(arg0.ptr, arg1.ptr)
if res < 0:
raise Error
return bool(res)
def product(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_aff:
arg1 = multi_aff(arg1)
except:
return pw_multi_aff(arg0).product(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_product(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_multi_aff_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def pullback(*args):
if len(args) == 2 and args[1].__class__ is multi_aff:
ctx = args[0].ctx
res = isl.isl_multi_aff_pullback_multi_aff(isl.isl_multi_aff_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
@staticmethod
def range_map(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_range_map(isl.isl_space_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def range_product(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_aff:
arg1 = multi_aff(arg1)
except:
return pw_multi_aff(arg0).range_product(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_range_product(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_multi_aff_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_get_range_tuple_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_range_tuple_id(arg0):
return arg0.range_tuple_id()
def reset_range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_reset_range_tuple_id(isl.isl_multi_aff_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def scale(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_aff_scale_multi_val(isl.isl_multi_aff_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_aff_scale_val(isl.isl_multi_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def scale_down(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_aff_scale_down_multi_val(isl.isl_multi_aff_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_aff_scale_down_val(isl.isl_multi_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def set_at(arg0, arg1, arg2):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg2.__class__ is aff:
arg2 = aff(arg2)
except:
return pw_multi_aff(arg0).set_at(arg1, arg2)
ctx = arg0.ctx
res = isl.isl_multi_aff_set_at(isl.isl_multi_aff_copy(arg0.ptr), arg1, isl.isl_aff_copy(arg2.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def set_range_tuple(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_aff_set_range_tuple_id(isl.isl_multi_aff_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def size(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_size(arg0.ptr)
if res < 0:
raise Error
return int(res)
def space(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
def sub(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_aff:
arg1 = multi_aff(arg1)
except:
return pw_multi_aff(arg0).sub(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_sub(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_multi_aff_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def to_multi_pw_aff(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_to_multi_pw_aff(isl.isl_multi_aff_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def to_multi_union_pw_aff(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_to_multi_union_pw_aff(isl.isl_multi_aff_copy(arg0.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def to_pw_multi_aff(arg0):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_to_pw_multi_aff(isl.isl_multi_aff_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def unbind_params_insert_domain(arg0, arg1):
try:
if not arg0.__class__ is multi_aff:
arg0 = multi_aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return pw_multi_aff(arg0).unbind_params_insert_domain(arg1)
ctx = arg0.ctx
res = isl.isl_multi_aff_unbind_params_insert_domain(isl.isl_multi_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
@staticmethod
def zero(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_aff_zero(isl.isl_space_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
isl.isl_multi_aff_from_aff.restype = c_void_p
isl.isl_multi_aff_from_aff.argtypes = [c_void_p]
isl.isl_multi_aff_from_aff_list.restype = c_void_p
isl.isl_multi_aff_from_aff_list.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_read_from_str.restype = c_void_p
isl.isl_multi_aff_read_from_str.argtypes = [Context, c_char_p]
isl.isl_multi_aff_add.restype = c_void_p
isl.isl_multi_aff_add.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_add_constant_multi_val.restype = c_void_p
isl.isl_multi_aff_add_constant_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_add_constant_val.restype = c_void_p
isl.isl_multi_aff_add_constant_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_as_map.restype = c_void_p
isl.isl_multi_aff_as_map.argtypes = [c_void_p]
isl.isl_multi_aff_as_set.restype = c_void_p
isl.isl_multi_aff_as_set.argtypes = [c_void_p]
isl.isl_multi_aff_get_at.restype = c_void_p
isl.isl_multi_aff_get_at.argtypes = [c_void_p, c_int]
isl.isl_multi_aff_bind.restype = c_void_p
isl.isl_multi_aff_bind.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_bind_domain.restype = c_void_p
isl.isl_multi_aff_bind_domain.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_bind_domain_wrapped_domain.restype = c_void_p
isl.isl_multi_aff_bind_domain_wrapped_domain.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_get_constant_multi_val.restype = c_void_p
isl.isl_multi_aff_get_constant_multi_val.argtypes = [c_void_p]
isl.isl_multi_aff_domain_map.restype = c_void_p
isl.isl_multi_aff_domain_map.argtypes = [c_void_p]
isl.isl_multi_aff_flat_range_product.restype = c_void_p
isl.isl_multi_aff_flat_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_floor.restype = c_void_p
isl.isl_multi_aff_floor.argtypes = [c_void_p]
isl.isl_multi_aff_gist.restype = c_void_p
isl.isl_multi_aff_gist.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_has_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_aff_identity_multi_aff.restype = c_void_p
isl.isl_multi_aff_identity_multi_aff.argtypes = [c_void_p]
isl.isl_multi_aff_identity_on_domain_space.restype = c_void_p
isl.isl_multi_aff_identity_on_domain_space.argtypes = [c_void_p]
isl.isl_multi_aff_insert_domain.restype = c_void_p
isl.isl_multi_aff_insert_domain.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_involves_locals.argtypes = [c_void_p]
isl.isl_multi_aff_involves_nan.argtypes = [c_void_p]
isl.isl_multi_aff_get_list.restype = c_void_p
isl.isl_multi_aff_get_list.argtypes = [c_void_p]
isl.isl_multi_aff_multi_val_on_domain_space.restype = c_void_p
isl.isl_multi_aff_multi_val_on_domain_space.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_neg.restype = c_void_p
isl.isl_multi_aff_neg.argtypes = [c_void_p]
isl.isl_multi_aff_plain_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_product.restype = c_void_p
isl.isl_multi_aff_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_pullback_multi_aff.restype = c_void_p
isl.isl_multi_aff_pullback_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_range_map.restype = c_void_p
isl.isl_multi_aff_range_map.argtypes = [c_void_p]
isl.isl_multi_aff_range_product.restype = c_void_p
isl.isl_multi_aff_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_get_range_tuple_id.restype = c_void_p
isl.isl_multi_aff_get_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_aff_reset_range_tuple_id.restype = c_void_p
isl.isl_multi_aff_reset_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_aff_scale_multi_val.restype = c_void_p
isl.isl_multi_aff_scale_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_scale_val.restype = c_void_p
isl.isl_multi_aff_scale_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_scale_down_multi_val.restype = c_void_p
isl.isl_multi_aff_scale_down_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_scale_down_val.restype = c_void_p
isl.isl_multi_aff_scale_down_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_set_at.restype = c_void_p
isl.isl_multi_aff_set_at.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_multi_aff_set_range_tuple_id.restype = c_void_p
isl.isl_multi_aff_set_range_tuple_id.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_size.argtypes = [c_void_p]
isl.isl_multi_aff_get_space.restype = c_void_p
isl.isl_multi_aff_get_space.argtypes = [c_void_p]
isl.isl_multi_aff_sub.restype = c_void_p
isl.isl_multi_aff_sub.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_to_multi_pw_aff.restype = c_void_p
isl.isl_multi_aff_to_multi_pw_aff.argtypes = [c_void_p]
isl.isl_multi_aff_to_multi_union_pw_aff.restype = c_void_p
isl.isl_multi_aff_to_multi_union_pw_aff.argtypes = [c_void_p]
isl.isl_multi_aff_to_pw_multi_aff.restype = c_void_p
isl.isl_multi_aff_to_pw_multi_aff.argtypes = [c_void_p]
isl.isl_multi_aff_unbind_params_insert_domain.restype = c_void_p
isl.isl_multi_aff_unbind_params_insert_domain.argtypes = [c_void_p, c_void_p]
isl.isl_multi_aff_zero.restype = c_void_p
isl.isl_multi_aff_zero.argtypes = [c_void_p]
isl.isl_multi_aff_copy.restype = c_void_p
isl.isl_multi_aff_copy.argtypes = [c_void_p]
isl.isl_multi_aff_free.restype = c_void_p
isl.isl_multi_aff_free.argtypes = [c_void_p]
isl.isl_multi_aff_to_str.restype = POINTER(c_char)
isl.isl_multi_aff_to_str.argtypes = [c_void_p]
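The `isl_*_to_str` functions above are declared with `restype = POINTER(c_char)` rather than `c_char_p`, so ctypes hands back the raw pointer instead of auto-converting it to `bytes`; the `__str__` methods can then copy the text out with `cast(...)` and release the C buffer with `libc.free`. A self-contained sketch of that pattern, demonstrated with libc's `strdup` (assumes a Unix C library; isl itself is not required):

```python
# Sketch of the C-string return pattern used by the generated __str__ methods:
# declare the return type as POINTER(c_char) (raw pointer, no auto-decode),
# copy the bytes out, then free() the buffer ourselves.
import ctypes
from ctypes import POINTER, c_char, c_char_p, c_void_p, cast

libc = ctypes.CDLL(None)                # already-loaded C library (Unix assumption)
libc.strdup.restype = POINTER(c_char)   # raw pointer so we keep ownership
libc.strdup.argtypes = [c_char_p]
libc.free.argtypes = [c_void_p]

ptr = libc.strdup(b"[n] -> { [i] : 0 <= i < n }")
res = cast(ptr, c_char_p).value.decode("ascii")  # copy the text into Python
libc.free(ptr)                                   # then release the C buffer
print(res)
```

Had `strdup` been declared with `restype = c_char_p`, ctypes would convert the result to `bytes` immediately and the original pointer would be lost, leaking the buffer; the `POINTER(c_char)` declaration is what makes the explicit `free` possible.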
class aff(pw_aff, multi_aff):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_aff_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_aff_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
ptr = isl.isl_aff_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.aff("""%s""")' % s
else:
return 'isl.aff("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return pw_aff(arg0).add(arg1)
ctx = arg0.ctx
res = isl.isl_aff_add(isl.isl_aff_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
def add_constant(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_aff_add_constant_val(isl.isl_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
raise Error
def bind(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_aff_bind_id(isl.isl_aff_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
raise Error
def ceil(arg0):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_ceil(isl.isl_aff_copy(arg0.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
def constant_val(arg0):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_get_constant_val(arg0.ptr)
obj = val(ctx=ctx, ptr=res)
return obj
def get_constant_val(arg0):
return arg0.constant_val()
def div(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return pw_aff(arg0).div(arg1)
ctx = arg0.ctx
res = isl.isl_aff_div(isl.isl_aff_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
def eq_set(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return pw_aff(arg0).eq_set(arg1)
ctx = arg0.ctx
res = isl.isl_aff_eq_set(isl.isl_aff_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def eval(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is point:
arg1 = point(arg1)
except:
return pw_aff(arg0).eval(arg1)
ctx = arg0.ctx
res = isl.isl_aff_eval(isl.isl_aff_copy(arg0.ptr), isl.isl_point_copy(arg1.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def floor(arg0):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_floor(isl.isl_aff_copy(arg0.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
def ge_set(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return pw_aff(arg0).ge_set(arg1)
ctx = arg0.ctx
res = isl.isl_aff_ge_set(isl.isl_aff_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def gist(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return pw_aff(arg0).gist(arg1)
ctx = arg0.ctx
res = isl.isl_aff_gist(isl.isl_aff_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
def gt_set(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return pw_aff(arg0).gt_set(arg1)
ctx = arg0.ctx
res = isl.isl_aff_gt_set(isl.isl_aff_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def is_cst(arg0):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_is_cst(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def le_set(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return pw_aff(arg0).le_set(arg1)
ctx = arg0.ctx
res = isl.isl_aff_le_set(isl.isl_aff_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def lt_set(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return pw_aff(arg0).lt_set(arg1)
ctx = arg0.ctx
res = isl.isl_aff_lt_set(isl.isl_aff_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def mod(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_aff_mod_val(isl.isl_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
raise Error
def mul(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return pw_aff(arg0).mul(arg1)
ctx = arg0.ctx
res = isl.isl_aff_mul(isl.isl_aff_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
def ne_set(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return pw_aff(arg0).ne_set(arg1)
ctx = arg0.ctx
res = isl.isl_aff_ne_set(isl.isl_aff_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def neg(arg0):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_neg(isl.isl_aff_copy(arg0.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
def pullback(*args):
if len(args) == 2 and args[1].__class__ is multi_aff:
ctx = args[0].ctx
res = isl.isl_aff_pullback_multi_aff(isl.isl_aff_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
raise Error
def scale(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_aff_scale_val(isl.isl_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
raise Error
def scale_down(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_aff_scale_down_val(isl.isl_aff_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
raise Error
def sub(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return pw_aff(arg0).sub(arg1)
ctx = arg0.ctx
res = isl.isl_aff_sub(isl.isl_aff_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
def to_list(arg0):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_to_list(isl.isl_aff_copy(arg0.ptr))
obj = aff_list(ctx=ctx, ptr=res)
return obj
def unbind_params_insert_domain(arg0, arg1):
try:
if not arg0.__class__ is aff:
arg0 = aff(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return pw_aff(arg0).unbind_params_insert_domain(arg1)
ctx = arg0.ctx
res = isl.isl_aff_unbind_params_insert_domain(isl.isl_aff_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
@staticmethod
def zero_on_domain(*args):
if len(args) == 1 and args[0].__class__ is space:
ctx = args[0].ctx
res = isl.isl_aff_zero_on_domain_space(isl.isl_space_copy(args[0].ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
raise Error
isl.isl_aff_read_from_str.restype = c_void_p
isl.isl_aff_read_from_str.argtypes = [Context, c_char_p]
isl.isl_aff_add.restype = c_void_p
isl.isl_aff_add.argtypes = [c_void_p, c_void_p]
isl.isl_aff_add_constant_val.restype = c_void_p
isl.isl_aff_add_constant_val.argtypes = [c_void_p, c_void_p]
isl.isl_aff_bind_id.restype = c_void_p
isl.isl_aff_bind_id.argtypes = [c_void_p, c_void_p]
isl.isl_aff_ceil.restype = c_void_p
isl.isl_aff_ceil.argtypes = [c_void_p]
isl.isl_aff_get_constant_val.restype = c_void_p
isl.isl_aff_get_constant_val.argtypes = [c_void_p]
isl.isl_aff_div.restype = c_void_p
isl.isl_aff_div.argtypes = [c_void_p, c_void_p]
isl.isl_aff_eq_set.restype = c_void_p
isl.isl_aff_eq_set.argtypes = [c_void_p, c_void_p]
isl.isl_aff_eval.restype = c_void_p
isl.isl_aff_eval.argtypes = [c_void_p, c_void_p]
isl.isl_aff_floor.restype = c_void_p
isl.isl_aff_floor.argtypes = [c_void_p]
isl.isl_aff_ge_set.restype = c_void_p
isl.isl_aff_ge_set.argtypes = [c_void_p, c_void_p]
isl.isl_aff_gist.restype = c_void_p
isl.isl_aff_gist.argtypes = [c_void_p, c_void_p]
isl.isl_aff_gt_set.restype = c_void_p
isl.isl_aff_gt_set.argtypes = [c_void_p, c_void_p]
isl.isl_aff_is_cst.argtypes = [c_void_p]
isl.isl_aff_le_set.restype = c_void_p
isl.isl_aff_le_set.argtypes = [c_void_p, c_void_p]
isl.isl_aff_lt_set.restype = c_void_p
isl.isl_aff_lt_set.argtypes = [c_void_p, c_void_p]
isl.isl_aff_mod_val.restype = c_void_p
isl.isl_aff_mod_val.argtypes = [c_void_p, c_void_p]
isl.isl_aff_mul.restype = c_void_p
isl.isl_aff_mul.argtypes = [c_void_p, c_void_p]
isl.isl_aff_ne_set.restype = c_void_p
isl.isl_aff_ne_set.argtypes = [c_void_p, c_void_p]
isl.isl_aff_neg.restype = c_void_p
isl.isl_aff_neg.argtypes = [c_void_p]
isl.isl_aff_pullback_multi_aff.restype = c_void_p
isl.isl_aff_pullback_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_aff_scale_val.restype = c_void_p
isl.isl_aff_scale_val.argtypes = [c_void_p, c_void_p]
isl.isl_aff_scale_down_val.restype = c_void_p
isl.isl_aff_scale_down_val.argtypes = [c_void_p, c_void_p]
isl.isl_aff_sub.restype = c_void_p
isl.isl_aff_sub.argtypes = [c_void_p, c_void_p]
isl.isl_aff_to_list.restype = c_void_p
isl.isl_aff_to_list.argtypes = [c_void_p]
isl.isl_aff_unbind_params_insert_domain.restype = c_void_p
isl.isl_aff_unbind_params_insert_domain.argtypes = [c_void_p, c_void_p]
isl.isl_aff_zero_on_domain_space.restype = c_void_p
isl.isl_aff_zero_on_domain_space.argtypes = [c_void_p]
isl.isl_aff_copy.restype = c_void_p
isl.isl_aff_copy.argtypes = [c_void_p]
isl.isl_aff_free.restype = c_void_p
isl.isl_aff_free.argtypes = [c_void_p]
isl.isl_aff_to_str.restype = POINTER(c_char)
isl.isl_aff_to_str.argtypes = [c_void_p]
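# Hedged aside (not part of the generated bindings): the restype/argtypes
# declarations above exist because ctypes assumes a foreign function returns
# a C int, which would truncate the 64-bit isl_aff* handles returned by
# functions such as isl_aff_copy. A minimal self-contained sketch of the same
# pattern using libc's strdup/free; the names _libc_demo, _p and demo_value
# are illustrative and used nowhere else in this module.

```python
from ctypes import CDLL, c_char_p, c_void_p, cast
from ctypes.util import find_library

_libc_demo = CDLL(find_library("c") or None)   # dlopen libc (Unix)
_libc_demo.strdup.restype = c_void_p           # keep the full pointer width
_libc_demo.strdup.argtypes = [c_char_p]
_libc_demo.free.argtypes = [c_void_p]

_p = _libc_demo.strdup(b"isl")                 # C-owned copy of the bytes
demo_value = cast(_p, c_char_p).value.decode('ascii')
_libc_demo.free(_p)                            # caller must free strdup's result
```

# The cast-then-free dance is the same one __str__ performs above with
# isl_aff_to_str and libc.free.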
class aff_list(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_aff_list_alloc(self.ctx, args[0])
return
if len(args) == 1 and args[0].__class__ is aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_aff_list_from_aff(isl.isl_aff_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_aff_list_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_aff_list_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is aff_list:
arg0 = aff_list(arg0)
except:
raise
ptr = isl.isl_aff_list_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.aff_list("""%s""")' % s
else:
return 'isl.aff_list("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is aff_list:
arg0 = aff_list(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_list_add(isl.isl_aff_list_copy(arg0.ptr), isl.isl_aff_copy(arg1.ptr))
obj = aff_list(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is aff_list:
arg0 = aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_list_get_at(arg0.ptr, arg1)
obj = aff(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def clear(arg0):
try:
if not arg0.__class__ is aff_list:
arg0 = aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_list_clear(isl.isl_aff_list_copy(arg0.ptr))
obj = aff_list(ctx=ctx, ptr=res)
return obj
def concat(arg0, arg1):
try:
if not arg0.__class__ is aff_list:
arg0 = aff_list(arg0)
except:
raise
try:
if not arg1.__class__ is aff_list:
arg1 = aff_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_list_concat(isl.isl_aff_list_copy(arg0.ptr), isl.isl_aff_list_copy(arg1.ptr))
obj = aff_list(ctx=ctx, ptr=res)
return obj
def drop(arg0, arg1, arg2):
try:
if not arg0.__class__ is aff_list:
arg0 = aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_list_drop(isl.isl_aff_list_copy(arg0.ptr), arg1, arg2)
obj = aff_list(ctx=ctx, ptr=res)
return obj
def foreach(arg0, arg1):
try:
if not arg0.__class__ is aff_list:
arg0 = aff_list(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = aff(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_aff_list_foreach(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
raise Error
def insert(arg0, arg1, arg2):
try:
if not arg0.__class__ is aff_list:
arg0 = aff_list(arg0)
except:
raise
try:
if not arg2.__class__ is aff:
arg2 = aff(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_list_insert(isl.isl_aff_list_copy(arg0.ptr), arg1, isl.isl_aff_copy(arg2.ptr))
obj = aff_list(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is aff_list:
arg0 = aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_aff_list_size(arg0.ptr)
if res < 0:
raise Error
return int(res)
isl.isl_aff_list_alloc.restype = c_void_p
isl.isl_aff_list_alloc.argtypes = [Context, c_int]
isl.isl_aff_list_from_aff.restype = c_void_p
isl.isl_aff_list_from_aff.argtypes = [c_void_p]
isl.isl_aff_list_read_from_str.restype = c_void_p
isl.isl_aff_list_read_from_str.argtypes = [Context, c_char_p]
isl.isl_aff_list_add.restype = c_void_p
isl.isl_aff_list_add.argtypes = [c_void_p, c_void_p]
isl.isl_aff_list_get_at.restype = c_void_p
isl.isl_aff_list_get_at.argtypes = [c_void_p, c_int]
isl.isl_aff_list_clear.restype = c_void_p
isl.isl_aff_list_clear.argtypes = [c_void_p]
isl.isl_aff_list_concat.restype = c_void_p
isl.isl_aff_list_concat.argtypes = [c_void_p, c_void_p]
isl.isl_aff_list_drop.restype = c_void_p
isl.isl_aff_list_drop.argtypes = [c_void_p, c_int, c_int]
isl.isl_aff_list_foreach.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_aff_list_insert.restype = c_void_p
isl.isl_aff_list_insert.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_aff_list_size.argtypes = [c_void_p]
isl.isl_aff_list_copy.restype = c_void_p
isl.isl_aff_list_copy.argtypes = [c_void_p]
isl.isl_aff_list_free.restype = c_void_p
isl.isl_aff_list_free.argtypes = [c_void_p]
isl.isl_aff_list_to_str.restype = POINTER(c_char)
isl.isl_aff_list_to_str.argtypes = [c_void_p]
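# Hedged aside: aff_list.foreach above wraps a Python callable in a CFUNCTYPE
# trampoline and smuggles any Python exception out through a one-element list
# (exc_info), because an exception must never propagate across the C frame.
# The same pattern, self-contained, with libc's qsort; CMPFUNC, _libc_cb,
# exc_box and sorted_nums are illustrative names not used elsewhere here.

```python
from ctypes import CDLL, CFUNCTYPE, POINTER, c_int, c_size_t, sizeof
from ctypes.util import find_library

_libc_cb = CDLL(find_library("c") or None)
CMPFUNC = CFUNCTYPE(c_int, POINTER(c_int), POINTER(c_int))
_libc_cb.qsort.argtypes = [POINTER(c_int), c_size_t, c_size_t, CMPFUNC]

nums = (c_int * 4)(3, 1, 4, 1)
exc_box = [None]                      # same trick as exc_info in foreach

@CMPFUNC
def _cmp(a, b):
    try:
        return a[0] - b[0]
    except BaseException as e:        # never let an exception cross into C
        exc_box[0] = e
        return 0

_libc_cb.qsort(nums, len(nums), sizeof(c_int), _cmp)
if exc_box[0] is not None:            # re-raise on the Python side
    raise exc_box[0]
sorted_nums = list(nums)
```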
class ast_build(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 0:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_ast_build_alloc(self.ctx)
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_build_free(self.ptr)
def copy_callbacks(self, obj):
if hasattr(obj, 'at_each_domain'):
self.at_each_domain = obj.at_each_domain
def set_at_each_domain(arg0, arg1):
try:
if not arg0.__class__ is ast_build:
arg0 = ast_build(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_void_p, c_void_p, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1, cb_arg2):
cb_arg0 = ast_node(ctx=arg0.ctx, ptr=(cb_arg0))
cb_arg1 = ast_build(ctx=arg0.ctx, ptr=isl.isl_ast_build_copy(cb_arg1))
try:
res = arg1(cb_arg0, cb_arg1)
except BaseException as e:
exc_info[0] = e
return None
return isl.isl_ast_node_copy(res.ptr)
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_ast_build_set_at_each_domain(isl.isl_ast_build_copy(arg0.ptr), cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if hasattr(arg0, 'at_each_domain') and arg0.at_each_domain['exc_info'] != None:
prev_exc = arg0.at_each_domain['exc_info'][0]
arg0.at_each_domain['exc_info'][0] = None
if prev_exc is not None:
raise prev_exc
obj = ast_build(ctx=ctx, ptr=res)
obj.copy_callbacks(arg0)
obj.at_each_domain = { 'func': cb, 'exc_info': exc_info }
return obj
def access_from(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_ast_build_access_from_multi_pw_aff(args[0].ptr, isl.isl_multi_pw_aff_copy(args[1].ptr))
if hasattr(args[0], 'at_each_domain') and args[0].at_each_domain['exc_info'] != None:
exc_info = args[0].at_each_domain['exc_info'][0]
args[0].at_each_domain['exc_info'][0] = None
if exc_info is not None:
raise exc_info
obj = ast_expr(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_ast_build_access_from_pw_multi_aff(args[0].ptr, isl.isl_pw_multi_aff_copy(args[1].ptr))
if hasattr(args[0], 'at_each_domain') and args[0].at_each_domain['exc_info'] != None:
exc_info = args[0].at_each_domain['exc_info'][0]
args[0].at_each_domain['exc_info'][0] = None
if exc_info is not None:
raise exc_info
obj = ast_expr(ctx=ctx, ptr=res)
return obj
raise Error
def call_from(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_ast_build_call_from_multi_pw_aff(args[0].ptr, isl.isl_multi_pw_aff_copy(args[1].ptr))
if hasattr(args[0], 'at_each_domain') and args[0].at_each_domain['exc_info'] != None:
exc_info = args[0].at_each_domain['exc_info'][0]
args[0].at_each_domain['exc_info'][0] = None
if exc_info is not None:
raise exc_info
obj = ast_expr(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_ast_build_call_from_pw_multi_aff(args[0].ptr, isl.isl_pw_multi_aff_copy(args[1].ptr))
if hasattr(args[0], 'at_each_domain') and args[0].at_each_domain['exc_info'] != None:
exc_info = args[0].at_each_domain['exc_info'][0]
args[0].at_each_domain['exc_info'][0] = None
if exc_info is not None:
raise exc_info
obj = ast_expr(ctx=ctx, ptr=res)
return obj
raise Error
def expr_from(*args):
if len(args) == 2 and args[1].__class__ is pw_aff:
ctx = args[0].ctx
res = isl.isl_ast_build_expr_from_pw_aff(args[0].ptr, isl.isl_pw_aff_copy(args[1].ptr))
if hasattr(args[0], 'at_each_domain') and args[0].at_each_domain['exc_info'] != None:
exc_info = args[0].at_each_domain['exc_info'][0]
args[0].at_each_domain['exc_info'][0] = None
if exc_info is not None:
raise exc_info
obj = ast_expr(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is set:
ctx = args[0].ctx
res = isl.isl_ast_build_expr_from_set(args[0].ptr, isl.isl_set_copy(args[1].ptr))
if hasattr(args[0], 'at_each_domain') and args[0].at_each_domain['exc_info'] != None:
exc_info = args[0].at_each_domain['exc_info'][0]
args[0].at_each_domain['exc_info'][0] = None
if exc_info is not None:
raise exc_info
obj = ast_expr(ctx=ctx, ptr=res)
return obj
raise Error
@staticmethod
def from_context(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_build_from_context(isl.isl_set_copy(arg0.ptr))
obj = ast_build(ctx=ctx, ptr=res)
return obj
def node_from(*args):
if len(args) == 2 and args[1].__class__ is schedule:
ctx = args[0].ctx
res = isl.isl_ast_build_node_from_schedule(args[0].ptr, isl.isl_schedule_copy(args[1].ptr))
if hasattr(args[0], 'at_each_domain') and args[0].at_each_domain['exc_info'] != None:
exc_info = args[0].at_each_domain['exc_info'][0]
args[0].at_each_domain['exc_info'][0] = None
if exc_info is not None:
raise exc_info
obj = ast_node(ctx=ctx, ptr=res)
return obj
raise Error
def node_from_schedule_map(arg0, arg1):
try:
if not arg0.__class__ is ast_build:
arg0 = ast_build(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_build_node_from_schedule_map(arg0.ptr, isl.isl_union_map_copy(arg1.ptr))
if hasattr(arg0, 'at_each_domain') and arg0.at_each_domain['exc_info'] != None:
exc_info = arg0.at_each_domain['exc_info'][0]
arg0.at_each_domain['exc_info'][0] = None
if exc_info is not None:
raise exc_info
obj = ast_node(ctx=ctx, ptr=res)
return obj
def schedule(arg0):
try:
if not arg0.__class__ is ast_build:
arg0 = ast_build(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_build_get_schedule(arg0.ptr)
if hasattr(arg0, 'at_each_domain') and arg0.at_each_domain['exc_info'] != None:
exc_info = arg0.at_each_domain['exc_info'][0]
arg0.at_each_domain['exc_info'][0] = None
if exc_info is not None:
raise exc_info
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_schedule(arg0):
return arg0.schedule()
isl.isl_ast_build_alloc.restype = c_void_p
isl.isl_ast_build_alloc.argtypes = [Context]
isl.isl_ast_build_set_at_each_domain.restype = c_void_p
isl.isl_ast_build_set_at_each_domain.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_ast_build_access_from_multi_pw_aff.restype = c_void_p
isl.isl_ast_build_access_from_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_ast_build_access_from_pw_multi_aff.restype = c_void_p
isl.isl_ast_build_access_from_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_ast_build_call_from_multi_pw_aff.restype = c_void_p
isl.isl_ast_build_call_from_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_ast_build_call_from_pw_multi_aff.restype = c_void_p
isl.isl_ast_build_call_from_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_ast_build_expr_from_pw_aff.restype = c_void_p
isl.isl_ast_build_expr_from_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_ast_build_expr_from_set.restype = c_void_p
isl.isl_ast_build_expr_from_set.argtypes = [c_void_p, c_void_p]
isl.isl_ast_build_from_context.restype = c_void_p
isl.isl_ast_build_from_context.argtypes = [c_void_p]
isl.isl_ast_build_node_from_schedule.restype = c_void_p
isl.isl_ast_build_node_from_schedule.argtypes = [c_void_p, c_void_p]
isl.isl_ast_build_node_from_schedule_map.restype = c_void_p
isl.isl_ast_build_node_from_schedule_map.argtypes = [c_void_p, c_void_p]
isl.isl_ast_build_get_schedule.restype = c_void_p
isl.isl_ast_build_get_schedule.argtypes = [c_void_p]
isl.isl_ast_build_copy.restype = c_void_p
isl.isl_ast_build_copy.argtypes = [c_void_p]
isl.isl_ast_build_free.restype = c_void_p
isl.isl_ast_build_free.argtypes = [c_void_p]
class ast_expr(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and isinstance(args[0], ast_expr_op):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_id):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_int):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
if "ptr" in keywords:
type = isl.isl_ast_expr_get_type(keywords["ptr"])
if type == 0:
return ast_expr_op(**keywords)
if type == 1:
return ast_expr_id(**keywords)
if type == 2:
return ast_expr_int(**keywords)
raise Error
return super(ast_expr, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr:
arg0 = ast_expr(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr("""%s""")' % s
else:
return 'isl.ast_expr("%s")' % s
def to_C_str(arg0):
try:
if not arg0.__class__ is ast_expr:
arg0 = ast_expr(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_expr_to_C_str(arg0.ptr)
if not res:
raise Error
string = cast(res, c_char_p).value.decode('ascii')
libc.free(res)
return string
isl.isl_ast_expr_to_C_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_C_str.argtypes = [c_void_p]
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
isl.isl_ast_expr_get_type.argtypes = [c_void_p]
class ast_expr_id(ast_expr):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_id, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_id:
arg0 = ast_expr_id(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_id("""%s""")' % s
else:
return 'isl.ast_expr_id("%s")' % s
def id(arg0):
try:
if not arg0.__class__ is ast_expr:
arg0 = ast_expr(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_expr_id_get_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_id(arg0):
return arg0.id()
isl.isl_ast_expr_id_get_id.restype = c_void_p
isl.isl_ast_expr_id_get_id.argtypes = [c_void_p]
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_int(ast_expr):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_int, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_int:
arg0 = ast_expr_int(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_int("""%s""")' % s
else:
return 'isl.ast_expr_int("%s")' % s
def val(arg0):
try:
if not arg0.__class__ is ast_expr:
arg0 = ast_expr(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_expr_int_get_val(arg0.ptr)
obj = val(ctx=ctx, ptr=res)
return obj
def get_val(arg0):
return arg0.val()
isl.isl_ast_expr_int_get_val.restype = c_void_p
isl.isl_ast_expr_int_get_val.argtypes = [c_void_p]
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op(ast_expr):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_and):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_and_then):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_or):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_or_else):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_max):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_min):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_minus):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_add):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_sub):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_mul):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_div):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_fdiv_q):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_pdiv_q):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_pdiv_r):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_zdiv_r):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_cond):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_select):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_eq):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_le):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_lt):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_ge):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_gt):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_call):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_access):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_member):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_expr_op_address_of):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_expr_copy(args[0].ptr)
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
if "ptr" in keywords:
type = isl.isl_ast_expr_op_get_type(keywords["ptr"])
if type == 0:
return ast_expr_op_and(**keywords)
if type == 1:
return ast_expr_op_and_then(**keywords)
if type == 2:
return ast_expr_op_or(**keywords)
if type == 3:
return ast_expr_op_or_else(**keywords)
if type == 4:
return ast_expr_op_max(**keywords)
if type == 5:
return ast_expr_op_min(**keywords)
if type == 6:
return ast_expr_op_minus(**keywords)
if type == 7:
return ast_expr_op_add(**keywords)
if type == 8:
return ast_expr_op_sub(**keywords)
if type == 9:
return ast_expr_op_mul(**keywords)
if type == 10:
return ast_expr_op_div(**keywords)
if type == 11:
return ast_expr_op_fdiv_q(**keywords)
if type == 12:
return ast_expr_op_pdiv_q(**keywords)
if type == 13:
return ast_expr_op_pdiv_r(**keywords)
if type == 14:
return ast_expr_op_zdiv_r(**keywords)
if type == 15:
return ast_expr_op_cond(**keywords)
if type == 16:
return ast_expr_op_select(**keywords)
if type == 17:
return ast_expr_op_eq(**keywords)
if type == 18:
return ast_expr_op_le(**keywords)
if type == 19:
return ast_expr_op_lt(**keywords)
if type == 20:
return ast_expr_op_ge(**keywords)
if type == 21:
return ast_expr_op_gt(**keywords)
if type == 22:
return ast_expr_op_call(**keywords)
if type == 23:
return ast_expr_op_access(**keywords)
if type == 24:
return ast_expr_op_member(**keywords)
if type == 25:
return ast_expr_op_address_of(**keywords)
raise Error
return super(ast_expr_op, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op:
arg0 = ast_expr_op(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op("""%s""")' % s
else:
return 'isl.ast_expr_op("%s")' % s
def arg(arg0, arg1):
try:
if not arg0.__class__ is ast_expr:
arg0 = ast_expr(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_expr_op_get_arg(arg0.ptr, arg1)
obj = ast_expr(ctx=ctx, ptr=res)
return obj
def get_arg(arg0, arg1):
return arg0.arg(arg1)
def n_arg(arg0):
try:
if not arg0.__class__ is ast_expr:
arg0 = ast_expr(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_expr_op_get_n_arg(arg0.ptr)
if res < 0:
raise Error
return int(res)
def get_n_arg(arg0):
return arg0.n_arg()
isl.isl_ast_expr_op_get_arg.restype = c_void_p
isl.isl_ast_expr_op_get_arg.argtypes = [c_void_p, c_int]
isl.isl_ast_expr_op_get_n_arg.argtypes = [c_void_p]
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
isl.isl_ast_expr_op_get_type.argtypes = [c_void_p]
class ast_expr_op_access(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_access, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_access:
arg0 = ast_expr_op_access(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_access("""%s""")' % s
else:
return 'isl.ast_expr_op_access("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_add(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_add, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_add:
arg0 = ast_expr_op_add(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_add("""%s""")' % s
else:
return 'isl.ast_expr_op_add("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_address_of(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_address_of, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_address_of:
arg0 = ast_expr_op_address_of(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_address_of("""%s""")' % s
else:
return 'isl.ast_expr_op_address_of("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_and(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_and, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_and:
arg0 = ast_expr_op_and(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_and("""%s""")' % s
else:
return 'isl.ast_expr_op_and("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_and_then(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_and_then, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_and_then:
arg0 = ast_expr_op_and_then(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_and_then("""%s""")' % s
else:
return 'isl.ast_expr_op_and_then("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_call(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_call, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_call:
arg0 = ast_expr_op_call(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_call("""%s""")' % s
else:
return 'isl.ast_expr_op_call("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_cond(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_cond, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_cond:
arg0 = ast_expr_op_cond(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_cond("""%s""")' % s
else:
return 'isl.ast_expr_op_cond("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_div(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_div, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_div:
arg0 = ast_expr_op_div(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_div("""%s""")' % s
else:
return 'isl.ast_expr_op_div("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_eq(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_eq, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_eq:
arg0 = ast_expr_op_eq(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_eq("""%s""")' % s
else:
return 'isl.ast_expr_op_eq("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_fdiv_q(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_fdiv_q, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_fdiv_q:
arg0 = ast_expr_op_fdiv_q(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_fdiv_q("""%s""")' % s
else:
return 'isl.ast_expr_op_fdiv_q("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_ge(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_ge, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_ge:
arg0 = ast_expr_op_ge(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_ge("""%s""")' % s
else:
return 'isl.ast_expr_op_ge("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_gt(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_gt, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_gt:
arg0 = ast_expr_op_gt(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_gt("""%s""")' % s
else:
return 'isl.ast_expr_op_gt("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_le(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_le, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_le:
arg0 = ast_expr_op_le(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_le("""%s""")' % s
else:
return 'isl.ast_expr_op_le("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_lt(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_lt, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_lt:
arg0 = ast_expr_op_lt(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_lt("""%s""")' % s
else:
return 'isl.ast_expr_op_lt("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_max(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_max, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_max:
arg0 = ast_expr_op_max(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_max("""%s""")' % s
else:
return 'isl.ast_expr_op_max("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_member(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_member, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_member:
arg0 = ast_expr_op_member(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_member("""%s""")' % s
else:
return 'isl.ast_expr_op_member("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_min(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_min, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_min:
arg0 = ast_expr_op_min(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_min("""%s""")' % s
else:
return 'isl.ast_expr_op_min("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_minus(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_minus, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_minus:
arg0 = ast_expr_op_minus(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_minus("""%s""")' % s
else:
return 'isl.ast_expr_op_minus("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_mul(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_mul, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_mul:
arg0 = ast_expr_op_mul(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_mul("""%s""")' % s
else:
return 'isl.ast_expr_op_mul("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_or(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_or, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_or:
arg0 = ast_expr_op_or(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_or("""%s""")' % s
else:
return 'isl.ast_expr_op_or("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_or_else(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_or_else, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_or_else:
arg0 = ast_expr_op_or_else(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_or_else("""%s""")' % s
else:
return 'isl.ast_expr_op_or_else("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_pdiv_q(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_pdiv_q, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_pdiv_q:
arg0 = ast_expr_op_pdiv_q(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_pdiv_q("""%s""")' % s
else:
return 'isl.ast_expr_op_pdiv_q("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_pdiv_r(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_pdiv_r, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_pdiv_r:
arg0 = ast_expr_op_pdiv_r(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_pdiv_r("""%s""")' % s
else:
return 'isl.ast_expr_op_pdiv_r("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_select(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_select, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_select:
arg0 = ast_expr_op_select(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_select("""%s""")' % s
else:
return 'isl.ast_expr_op_select("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_sub(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_sub, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_sub:
arg0 = ast_expr_op_sub(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_sub("""%s""")' % s
else:
return 'isl.ast_expr_op_sub("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_expr_op_zdiv_r(ast_expr_op):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_expr_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_expr_op_zdiv_r, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_expr_op_zdiv_r:
arg0 = ast_expr_op_zdiv_r(arg0)
except:
raise
ptr = isl.isl_ast_expr_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_expr_op_zdiv_r("""%s""")' % s
else:
return 'isl.ast_expr_op_zdiv_r("%s")' % s
isl.isl_ast_expr_copy.restype = c_void_p
isl.isl_ast_expr_copy.argtypes = [c_void_p]
isl.isl_ast_expr_free.restype = c_void_p
isl.isl_ast_expr_free.argtypes = [c_void_p]
isl.isl_ast_expr_to_str.restype = POINTER(c_char)
isl.isl_ast_expr_to_str.argtypes = [c_void_p]
class ast_node(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and isinstance(args[0], ast_node_for):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_node_if):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_node_block):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_node_mark):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], ast_node_user):
self.ctx = args[0].ctx
self.ptr = isl.isl_ast_node_copy(args[0].ptr)
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_node_free(self.ptr)
def __new__(cls, *args, **keywords):
if "ptr" in keywords:
type = isl.isl_ast_node_get_type(keywords["ptr"])
if type == 1:
return ast_node_for(**keywords)
if type == 2:
return ast_node_if(**keywords)
if type == 3:
return ast_node_block(**keywords)
if type == 4:
return ast_node_mark(**keywords)
if type == 5:
return ast_node_user(**keywords)
            raise Error
return super(ast_node, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ptr = isl.isl_ast_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_node("""%s""")' % s
else:
return 'isl.ast_node("%s")' % s
def to_C_str(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_to_C_str(arg0.ptr)
        if not res:
            raise Error
string = cast(res, c_char_p).value.decode('ascii')
libc.free(res)
return string
def to_list(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_to_list(isl.isl_ast_node_copy(arg0.ptr))
obj = ast_node_list(ctx=ctx, ptr=res)
return obj
isl.isl_ast_node_to_C_str.restype = POINTER(c_char)
isl.isl_ast_node_to_C_str.argtypes = [c_void_p]
isl.isl_ast_node_to_list.restype = c_void_p
isl.isl_ast_node_to_list.argtypes = [c_void_p]
isl.isl_ast_node_copy.restype = c_void_p
isl.isl_ast_node_copy.argtypes = [c_void_p]
isl.isl_ast_node_free.restype = c_void_p
isl.isl_ast_node_free.argtypes = [c_void_p]
isl.isl_ast_node_to_str.restype = POINTER(c_char)
isl.isl_ast_node_to_str.argtypes = [c_void_p]
isl.isl_ast_node_get_type.argtypes = [c_void_p]
class ast_node_block(ast_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_node_block, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_node_block:
arg0 = ast_node_block(arg0)
except:
raise
ptr = isl.isl_ast_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_node_block("""%s""")' % s
else:
return 'isl.ast_node_block("%s")' % s
def children(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_block_get_children(arg0.ptr)
obj = ast_node_list(ctx=ctx, ptr=res)
return obj
def get_children(arg0):
return arg0.children()
isl.isl_ast_node_block_get_children.restype = c_void_p
isl.isl_ast_node_block_get_children.argtypes = [c_void_p]
isl.isl_ast_node_copy.restype = c_void_p
isl.isl_ast_node_copy.argtypes = [c_void_p]
isl.isl_ast_node_free.restype = c_void_p
isl.isl_ast_node_free.argtypes = [c_void_p]
isl.isl_ast_node_to_str.restype = POINTER(c_char)
isl.isl_ast_node_to_str.argtypes = [c_void_p]
class ast_node_for(ast_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_node_for, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_node_for:
arg0 = ast_node_for(arg0)
except:
raise
ptr = isl.isl_ast_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_node_for("""%s""")' % s
else:
return 'isl.ast_node_for("%s")' % s
def body(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_for_get_body(arg0.ptr)
obj = ast_node(ctx=ctx, ptr=res)
return obj
def get_body(arg0):
return arg0.body()
def cond(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_for_get_cond(arg0.ptr)
obj = ast_expr(ctx=ctx, ptr=res)
return obj
def get_cond(arg0):
return arg0.cond()
def inc(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_for_get_inc(arg0.ptr)
obj = ast_expr(ctx=ctx, ptr=res)
return obj
def get_inc(arg0):
return arg0.inc()
def init(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_for_get_init(arg0.ptr)
obj = ast_expr(ctx=ctx, ptr=res)
return obj
def get_init(arg0):
return arg0.init()
def is_degenerate(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_for_is_degenerate(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def iterator(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_for_get_iterator(arg0.ptr)
obj = ast_expr(ctx=ctx, ptr=res)
return obj
def get_iterator(arg0):
return arg0.iterator()
isl.isl_ast_node_for_get_body.restype = c_void_p
isl.isl_ast_node_for_get_body.argtypes = [c_void_p]
isl.isl_ast_node_for_get_cond.restype = c_void_p
isl.isl_ast_node_for_get_cond.argtypes = [c_void_p]
isl.isl_ast_node_for_get_inc.restype = c_void_p
isl.isl_ast_node_for_get_inc.argtypes = [c_void_p]
isl.isl_ast_node_for_get_init.restype = c_void_p
isl.isl_ast_node_for_get_init.argtypes = [c_void_p]
isl.isl_ast_node_for_is_degenerate.argtypes = [c_void_p]
isl.isl_ast_node_for_get_iterator.restype = c_void_p
isl.isl_ast_node_for_get_iterator.argtypes = [c_void_p]
isl.isl_ast_node_copy.restype = c_void_p
isl.isl_ast_node_copy.argtypes = [c_void_p]
isl.isl_ast_node_free.restype = c_void_p
isl.isl_ast_node_free.argtypes = [c_void_p]
isl.isl_ast_node_to_str.restype = POINTER(c_char)
isl.isl_ast_node_to_str.argtypes = [c_void_p]
class ast_node_if(ast_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_node_if, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_node_if:
arg0 = ast_node_if(arg0)
except:
raise
ptr = isl.isl_ast_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_node_if("""%s""")' % s
else:
return 'isl.ast_node_if("%s")' % s
def cond(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_if_get_cond(arg0.ptr)
obj = ast_expr(ctx=ctx, ptr=res)
return obj
def get_cond(arg0):
return arg0.cond()
def else_node(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_if_get_else_node(arg0.ptr)
obj = ast_node(ctx=ctx, ptr=res)
return obj
def get_else_node(arg0):
return arg0.else_node()
def has_else_node(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_if_has_else_node(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def then_node(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_if_get_then_node(arg0.ptr)
obj = ast_node(ctx=ctx, ptr=res)
return obj
def get_then_node(arg0):
return arg0.then_node()
isl.isl_ast_node_if_get_cond.restype = c_void_p
isl.isl_ast_node_if_get_cond.argtypes = [c_void_p]
isl.isl_ast_node_if_get_else_node.restype = c_void_p
isl.isl_ast_node_if_get_else_node.argtypes = [c_void_p]
isl.isl_ast_node_if_has_else_node.argtypes = [c_void_p]
isl.isl_ast_node_if_get_then_node.restype = c_void_p
isl.isl_ast_node_if_get_then_node.argtypes = [c_void_p]
isl.isl_ast_node_copy.restype = c_void_p
isl.isl_ast_node_copy.argtypes = [c_void_p]
isl.isl_ast_node_free.restype = c_void_p
isl.isl_ast_node_free.argtypes = [c_void_p]
isl.isl_ast_node_to_str.restype = POINTER(c_char)
isl.isl_ast_node_to_str.argtypes = [c_void_p]
class ast_node_list(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_ast_node_list_alloc(self.ctx, args[0])
return
if len(args) == 1 and args[0].__class__ is ast_node:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_ast_node_list_from_ast_node(isl.isl_ast_node_copy(args[0].ptr))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_node_list_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is ast_node_list:
arg0 = ast_node_list(arg0)
except:
raise
ptr = isl.isl_ast_node_list_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_node_list("""%s""")' % s
else:
return 'isl.ast_node_list("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is ast_node_list:
arg0 = ast_node_list(arg0)
except:
raise
try:
if not arg1.__class__ is ast_node:
arg1 = ast_node(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_list_add(isl.isl_ast_node_list_copy(arg0.ptr), isl.isl_ast_node_copy(arg1.ptr))
obj = ast_node_list(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is ast_node_list:
arg0 = ast_node_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_list_get_at(arg0.ptr, arg1)
obj = ast_node(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def clear(arg0):
try:
if not arg0.__class__ is ast_node_list:
arg0 = ast_node_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_list_clear(isl.isl_ast_node_list_copy(arg0.ptr))
obj = ast_node_list(ctx=ctx, ptr=res)
return obj
def concat(arg0, arg1):
try:
if not arg0.__class__ is ast_node_list:
arg0 = ast_node_list(arg0)
except:
raise
try:
if not arg1.__class__ is ast_node_list:
arg1 = ast_node_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_list_concat(isl.isl_ast_node_list_copy(arg0.ptr), isl.isl_ast_node_list_copy(arg1.ptr))
obj = ast_node_list(ctx=ctx, ptr=res)
return obj
def drop(arg0, arg1, arg2):
try:
if not arg0.__class__ is ast_node_list:
arg0 = ast_node_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_list_drop(isl.isl_ast_node_list_copy(arg0.ptr), arg1, arg2)
obj = ast_node_list(ctx=ctx, ptr=res)
return obj
def foreach(arg0, arg1):
try:
if not arg0.__class__ is ast_node_list:
arg0 = ast_node_list(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = ast_node(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_ast_node_list_foreach(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
            raise Error
def insert(arg0, arg1, arg2):
try:
if not arg0.__class__ is ast_node_list:
arg0 = ast_node_list(arg0)
except:
raise
try:
if not arg2.__class__ is ast_node:
arg2 = ast_node(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_list_insert(isl.isl_ast_node_list_copy(arg0.ptr), arg1, isl.isl_ast_node_copy(arg2.ptr))
obj = ast_node_list(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is ast_node_list:
arg0 = ast_node_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_list_size(arg0.ptr)
if res < 0:
            raise Error
return int(res)
isl.isl_ast_node_list_alloc.restype = c_void_p
isl.isl_ast_node_list_alloc.argtypes = [Context, c_int]
isl.isl_ast_node_list_from_ast_node.restype = c_void_p
isl.isl_ast_node_list_from_ast_node.argtypes = [c_void_p]
isl.isl_ast_node_list_add.restype = c_void_p
isl.isl_ast_node_list_add.argtypes = [c_void_p, c_void_p]
isl.isl_ast_node_list_get_at.restype = c_void_p
isl.isl_ast_node_list_get_at.argtypes = [c_void_p, c_int]
isl.isl_ast_node_list_clear.restype = c_void_p
isl.isl_ast_node_list_clear.argtypes = [c_void_p]
isl.isl_ast_node_list_concat.restype = c_void_p
isl.isl_ast_node_list_concat.argtypes = [c_void_p, c_void_p]
isl.isl_ast_node_list_drop.restype = c_void_p
isl.isl_ast_node_list_drop.argtypes = [c_void_p, c_int, c_int]
isl.isl_ast_node_list_foreach.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_ast_node_list_insert.restype = c_void_p
isl.isl_ast_node_list_insert.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_ast_node_list_size.argtypes = [c_void_p]
isl.isl_ast_node_list_copy.restype = c_void_p
isl.isl_ast_node_list_copy.argtypes = [c_void_p]
isl.isl_ast_node_list_free.restype = c_void_p
isl.isl_ast_node_list_free.argtypes = [c_void_p]
isl.isl_ast_node_list_to_str.restype = POINTER(c_char)
isl.isl_ast_node_list_to_str.argtypes = [c_void_p]
class ast_node_mark(ast_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_node_mark, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_node_mark:
arg0 = ast_node_mark(arg0)
except:
raise
ptr = isl.isl_ast_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_node_mark("""%s""")' % s
else:
return 'isl.ast_node_mark("%s")' % s
def id(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_mark_get_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_id(arg0):
return arg0.id()
def node(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_mark_get_node(arg0.ptr)
obj = ast_node(ctx=ctx, ptr=res)
return obj
def get_node(arg0):
return arg0.node()
isl.isl_ast_node_mark_get_id.restype = c_void_p
isl.isl_ast_node_mark_get_id.argtypes = [c_void_p]
isl.isl_ast_node_mark_get_node.restype = c_void_p
isl.isl_ast_node_mark_get_node.argtypes = [c_void_p]
isl.isl_ast_node_copy.restype = c_void_p
isl.isl_ast_node_copy.argtypes = [c_void_p]
isl.isl_ast_node_free.restype = c_void_p
isl.isl_ast_node_free.argtypes = [c_void_p]
isl.isl_ast_node_to_str.restype = POINTER(c_char)
isl.isl_ast_node_to_str.argtypes = [c_void_p]
class ast_node_user(ast_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_ast_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(ast_node_user, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is ast_node_user:
arg0 = ast_node_user(arg0)
except:
raise
ptr = isl.isl_ast_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.ast_node_user("""%s""")' % s
else:
return 'isl.ast_node_user("%s")' % s
def expr(arg0):
try:
if not arg0.__class__ is ast_node:
arg0 = ast_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_ast_node_user_get_expr(arg0.ptr)
obj = ast_expr(ctx=ctx, ptr=res)
return obj
def get_expr(arg0):
return arg0.expr()
isl.isl_ast_node_user_get_expr.restype = c_void_p
isl.isl_ast_node_user_get_expr.argtypes = [c_void_p]
isl.isl_ast_node_copy.restype = c_void_p
isl.isl_ast_node_copy.argtypes = [c_void_p]
isl.isl_ast_node_free.restype = c_void_p
isl.isl_ast_node_free.argtypes = [c_void_p]
isl.isl_ast_node_to_str.restype = POINTER(c_char)
isl.isl_ast_node_to_str.argtypes = [c_void_p]
class union_map(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is basic_map:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_map_from_basic_map(isl.isl_basic_map_copy(args[0].ptr))
return
if len(args) == 1 and args[0].__class__ is map:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_map_from_map(isl.isl_map_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_map_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_union_map_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ptr = isl.isl_union_map_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.union_map("""%s""")' % s
else:
return 'isl.union_map("%s")' % s
def affine_hull(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_affine_hull(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def apply_domain(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_apply_domain(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def apply_range(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_apply_range(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def as_map(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_as_map(isl.isl_union_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def as_multi_union_pw_aff(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_as_multi_union_pw_aff(isl.isl_union_map_copy(arg0.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def as_union_pw_multi_aff(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_as_union_pw_multi_aff(isl.isl_union_map_copy(arg0.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def bind_range(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_bind_range(isl.isl_union_map_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
def coalesce(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_coalesce(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def compute_divs(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_compute_divs(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def curry(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_curry(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def deltas(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_deltas(isl.isl_union_map_copy(arg0.ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
def detect_equalities(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_detect_equalities(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def domain(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_domain(isl.isl_union_map_copy(arg0.ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
def domain_factor_domain(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_domain_factor_domain(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def domain_factor_range(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_domain_factor_range(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def domain_map(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_domain_map(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def domain_map_union_pw_multi_aff(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_domain_map_union_pw_multi_aff(isl.isl_union_map_copy(arg0.ptr))
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def domain_product(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_domain_product(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
@staticmethod
def empty(*args):
if len(args) == 0:
ctx = Context.getDefaultInstance()
res = isl.isl_union_map_empty_ctx(ctx)
obj = union_map(ctx=ctx, ptr=res)
return obj
raise Error
def eq_at(*args):
if len(args) == 2 and args[1].__class__ is multi_union_pw_aff:
ctx = args[0].ctx
res = isl.isl_union_map_eq_at_multi_union_pw_aff(isl.isl_union_map_copy(args[0].ptr), isl.isl_multi_union_pw_aff_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
raise Error
def every_map(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = map(ctx=arg0.ctx, ptr=isl.isl_map_copy(cb_arg0))
try:
res = arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 1 if res else 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_union_map_every_map(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
            raise Error
return bool(res)
def extract_map(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is space:
arg1 = space(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_extract_map(arg0.ptr, isl.isl_space_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def factor_domain(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_factor_domain(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def factor_range(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_factor_range(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def fixed_power(*args):
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_union_map_fixed_power_val(isl.isl_union_map_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
raise Error
def foreach_map(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = map(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_union_map_foreach_map(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
            raise Error
@staticmethod
def convert_from(*args):
if len(args) == 1 and args[0].__class__ is multi_union_pw_aff:
ctx = args[0].ctx
res = isl.isl_union_map_from_multi_union_pw_aff(isl.isl_multi_union_pw_aff_copy(args[0].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
if len(args) == 1 and args[0].__class__ is union_pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_union_map_from_union_pw_multi_aff(isl.isl_union_pw_multi_aff_copy(args[0].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
raise Error
@staticmethod
def from_domain(arg0):
try:
if not arg0.__class__ is union_set:
arg0 = union_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_from_domain(isl.isl_union_set_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
@staticmethod
def from_domain_and_range(arg0, arg1):
try:
if not arg0.__class__ is union_set:
arg0 = union_set(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_from_domain_and_range(isl.isl_union_set_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
@staticmethod
def from_range(arg0):
try:
if not arg0.__class__ is union_set:
arg0 = union_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_from_range(isl.isl_union_set_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def gist(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_gist(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def gist_domain(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_gist_domain(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def gist_params(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_gist_params(isl.isl_union_map_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def gist_range(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_gist_range(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def intersect(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_intersect(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def intersect_domain(*args):
if len(args) == 2 and args[1].__class__ is space:
ctx = args[0].ctx
res = isl.isl_union_map_intersect_domain_space(isl.isl_union_map_copy(args[0].ptr), isl.isl_space_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is union_set:
ctx = args[0].ctx
res = isl.isl_union_map_intersect_domain_union_set(isl.isl_union_map_copy(args[0].ptr), isl.isl_union_set_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
raise Error
def intersect_domain_factor_domain(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_intersect_domain_factor_domain(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def intersect_domain_factor_range(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_intersect_domain_factor_range(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def intersect_params(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_intersect_params(isl.isl_union_map_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def intersect_range(*args):
if len(args) == 2 and args[1].__class__ is space:
ctx = args[0].ctx
res = isl.isl_union_map_intersect_range_space(isl.isl_union_map_copy(args[0].ptr), isl.isl_space_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is union_set:
ctx = args[0].ctx
res = isl.isl_union_map_intersect_range_union_set(isl.isl_union_map_copy(args[0].ptr), isl.isl_union_set_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
raise Error
def intersect_range_factor_domain(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_intersect_range_factor_domain(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def intersect_range_factor_range(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_intersect_range_factor_range(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def is_bijective(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_is_bijective(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_disjoint(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_is_disjoint(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def is_empty(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_is_empty(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_equal(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_is_equal(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def is_injective(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_is_injective(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_single_valued(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_is_single_valued(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_strict_subset(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_is_strict_subset(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def is_subset(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_is_subset(arg0.ptr, arg1.ptr)
if res < 0:
raise
return bool(res)
def isa_map(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_isa_map(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def lexmax(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_lexmax(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def lexmin(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_lexmin(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def map_list(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_get_map_list(arg0.ptr)
obj = map_list(ctx=ctx, ptr=res)
return obj
def get_map_list(arg0):
return arg0.map_list()
def polyhedral_hull(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_polyhedral_hull(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def preimage_domain(*args):
if len(args) == 2 and args[1].__class__ is multi_aff:
ctx = args[0].ctx
res = isl.isl_union_map_preimage_domain_multi_aff(isl.isl_union_map_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_union_map_preimage_domain_multi_pw_aff(isl.isl_union_map_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_union_map_preimage_domain_pw_multi_aff(isl.isl_union_map_copy(args[0].ptr), isl.isl_pw_multi_aff_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is union_pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_union_map_preimage_domain_union_pw_multi_aff(isl.isl_union_map_copy(args[0].ptr), isl.isl_union_pw_multi_aff_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
raise Error
def preimage_range(*args):
if len(args) == 2 and args[1].__class__ is multi_aff:
ctx = args[0].ctx
res = isl.isl_union_map_preimage_range_multi_aff(isl.isl_union_map_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_union_map_preimage_range_pw_multi_aff(isl.isl_union_map_copy(args[0].ptr), isl.isl_pw_multi_aff_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is union_pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_union_map_preimage_range_union_pw_multi_aff(isl.isl_union_map_copy(args[0].ptr), isl.isl_union_pw_multi_aff_copy(args[1].ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
raise Error
def product(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_product(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def project_out_all_params(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_project_out_all_params(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def range(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_range(isl.isl_union_map_copy(arg0.ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
def range_factor_domain(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_range_factor_domain(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def range_factor_range(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_range_factor_range(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def range_map(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_range_map(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def range_product(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_range_product(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def range_reverse(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_range_reverse(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def reverse(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_reverse(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def space(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
def subtract(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_subtract(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def subtract_domain(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_subtract_domain(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def subtract_range(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_subtract_range(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def uncurry(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_uncurry(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def union(arg0, arg1):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_union(isl.isl_union_map_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def universe(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_universe(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def wrap(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_wrap(isl.isl_union_map_copy(arg0.ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
def zip(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_map_zip(isl.isl_union_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
isl.isl_union_map_from_basic_map.restype = c_void_p
isl.isl_union_map_from_basic_map.argtypes = [c_void_p]
isl.isl_union_map_from_map.restype = c_void_p
isl.isl_union_map_from_map.argtypes = [c_void_p]
isl.isl_union_map_read_from_str.restype = c_void_p
isl.isl_union_map_read_from_str.argtypes = [Context, c_char_p]
isl.isl_union_map_affine_hull.restype = c_void_p
isl.isl_union_map_affine_hull.argtypes = [c_void_p]
isl.isl_union_map_apply_domain.restype = c_void_p
isl.isl_union_map_apply_domain.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_apply_range.restype = c_void_p
isl.isl_union_map_apply_range.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_as_map.restype = c_void_p
isl.isl_union_map_as_map.argtypes = [c_void_p]
isl.isl_union_map_as_multi_union_pw_aff.restype = c_void_p
isl.isl_union_map_as_multi_union_pw_aff.argtypes = [c_void_p]
isl.isl_union_map_as_union_pw_multi_aff.restype = c_void_p
isl.isl_union_map_as_union_pw_multi_aff.argtypes = [c_void_p]
isl.isl_union_map_bind_range.restype = c_void_p
isl.isl_union_map_bind_range.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_coalesce.restype = c_void_p
isl.isl_union_map_coalesce.argtypes = [c_void_p]
isl.isl_union_map_compute_divs.restype = c_void_p
isl.isl_union_map_compute_divs.argtypes = [c_void_p]
isl.isl_union_map_curry.restype = c_void_p
isl.isl_union_map_curry.argtypes = [c_void_p]
isl.isl_union_map_deltas.restype = c_void_p
isl.isl_union_map_deltas.argtypes = [c_void_p]
isl.isl_union_map_detect_equalities.restype = c_void_p
isl.isl_union_map_detect_equalities.argtypes = [c_void_p]
isl.isl_union_map_domain.restype = c_void_p
isl.isl_union_map_domain.argtypes = [c_void_p]
isl.isl_union_map_domain_factor_domain.restype = c_void_p
isl.isl_union_map_domain_factor_domain.argtypes = [c_void_p]
isl.isl_union_map_domain_factor_range.restype = c_void_p
isl.isl_union_map_domain_factor_range.argtypes = [c_void_p]
isl.isl_union_map_domain_map.restype = c_void_p
isl.isl_union_map_domain_map.argtypes = [c_void_p]
isl.isl_union_map_domain_map_union_pw_multi_aff.restype = c_void_p
isl.isl_union_map_domain_map_union_pw_multi_aff.argtypes = [c_void_p]
isl.isl_union_map_domain_product.restype = c_void_p
isl.isl_union_map_domain_product.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_empty_ctx.restype = c_void_p
isl.isl_union_map_empty_ctx.argtypes = [Context]
isl.isl_union_map_eq_at_multi_union_pw_aff.restype = c_void_p
isl.isl_union_map_eq_at_multi_union_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_every_map.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_union_map_extract_map.restype = c_void_p
isl.isl_union_map_extract_map.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_factor_domain.restype = c_void_p
isl.isl_union_map_factor_domain.argtypes = [c_void_p]
isl.isl_union_map_factor_range.restype = c_void_p
isl.isl_union_map_factor_range.argtypes = [c_void_p]
isl.isl_union_map_fixed_power_val.restype = c_void_p
isl.isl_union_map_fixed_power_val.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_foreach_map.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_union_map_from_multi_union_pw_aff.restype = c_void_p
isl.isl_union_map_from_multi_union_pw_aff.argtypes = [c_void_p]
isl.isl_union_map_from_union_pw_multi_aff.restype = c_void_p
isl.isl_union_map_from_union_pw_multi_aff.argtypes = [c_void_p]
isl.isl_union_map_from_domain.restype = c_void_p
isl.isl_union_map_from_domain.argtypes = [c_void_p]
isl.isl_union_map_from_domain_and_range.restype = c_void_p
isl.isl_union_map_from_domain_and_range.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_from_range.restype = c_void_p
isl.isl_union_map_from_range.argtypes = [c_void_p]
isl.isl_union_map_gist.restype = c_void_p
isl.isl_union_map_gist.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_gist_domain.restype = c_void_p
isl.isl_union_map_gist_domain.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_gist_params.restype = c_void_p
isl.isl_union_map_gist_params.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_gist_range.restype = c_void_p
isl.isl_union_map_gist_range.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_intersect.restype = c_void_p
isl.isl_union_map_intersect.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_intersect_domain_space.restype = c_void_p
isl.isl_union_map_intersect_domain_space.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_intersect_domain_union_set.restype = c_void_p
isl.isl_union_map_intersect_domain_union_set.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_intersect_domain_factor_domain.restype = c_void_p
isl.isl_union_map_intersect_domain_factor_domain.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_intersect_domain_factor_range.restype = c_void_p
isl.isl_union_map_intersect_domain_factor_range.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_intersect_params.restype = c_void_p
isl.isl_union_map_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_intersect_range_space.restype = c_void_p
isl.isl_union_map_intersect_range_space.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_intersect_range_union_set.restype = c_void_p
isl.isl_union_map_intersect_range_union_set.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_intersect_range_factor_domain.restype = c_void_p
isl.isl_union_map_intersect_range_factor_domain.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_intersect_range_factor_range.restype = c_void_p
isl.isl_union_map_intersect_range_factor_range.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_is_bijective.argtypes = [c_void_p]
isl.isl_union_map_is_disjoint.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_is_empty.argtypes = [c_void_p]
isl.isl_union_map_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_is_injective.argtypes = [c_void_p]
isl.isl_union_map_is_single_valued.argtypes = [c_void_p]
isl.isl_union_map_is_strict_subset.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_is_subset.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_isa_map.argtypes = [c_void_p]
isl.isl_union_map_lexmax.restype = c_void_p
isl.isl_union_map_lexmax.argtypes = [c_void_p]
isl.isl_union_map_lexmin.restype = c_void_p
isl.isl_union_map_lexmin.argtypes = [c_void_p]
isl.isl_union_map_get_map_list.restype = c_void_p
isl.isl_union_map_get_map_list.argtypes = [c_void_p]
isl.isl_union_map_polyhedral_hull.restype = c_void_p
isl.isl_union_map_polyhedral_hull.argtypes = [c_void_p]
isl.isl_union_map_preimage_domain_multi_aff.restype = c_void_p
isl.isl_union_map_preimage_domain_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_preimage_domain_multi_pw_aff.restype = c_void_p
isl.isl_union_map_preimage_domain_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_preimage_domain_pw_multi_aff.restype = c_void_p
isl.isl_union_map_preimage_domain_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_preimage_domain_union_pw_multi_aff.restype = c_void_p
isl.isl_union_map_preimage_domain_union_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_preimage_range_multi_aff.restype = c_void_p
isl.isl_union_map_preimage_range_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_preimage_range_pw_multi_aff.restype = c_void_p
isl.isl_union_map_preimage_range_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_preimage_range_union_pw_multi_aff.restype = c_void_p
isl.isl_union_map_preimage_range_union_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_product.restype = c_void_p
isl.isl_union_map_product.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_project_out_all_params.restype = c_void_p
isl.isl_union_map_project_out_all_params.argtypes = [c_void_p]
isl.isl_union_map_range.restype = c_void_p
isl.isl_union_map_range.argtypes = [c_void_p]
isl.isl_union_map_range_factor_domain.restype = c_void_p
isl.isl_union_map_range_factor_domain.argtypes = [c_void_p]
isl.isl_union_map_range_factor_range.restype = c_void_p
isl.isl_union_map_range_factor_range.argtypes = [c_void_p]
isl.isl_union_map_range_map.restype = c_void_p
isl.isl_union_map_range_map.argtypes = [c_void_p]
isl.isl_union_map_range_product.restype = c_void_p
isl.isl_union_map_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_range_reverse.restype = c_void_p
isl.isl_union_map_range_reverse.argtypes = [c_void_p]
isl.isl_union_map_reverse.restype = c_void_p
isl.isl_union_map_reverse.argtypes = [c_void_p]
isl.isl_union_map_get_space.restype = c_void_p
isl.isl_union_map_get_space.argtypes = [c_void_p]
isl.isl_union_map_subtract.restype = c_void_p
isl.isl_union_map_subtract.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_subtract_domain.restype = c_void_p
isl.isl_union_map_subtract_domain.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_subtract_range.restype = c_void_p
isl.isl_union_map_subtract_range.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_uncurry.restype = c_void_p
isl.isl_union_map_uncurry.argtypes = [c_void_p]
isl.isl_union_map_union.restype = c_void_p
isl.isl_union_map_union.argtypes = [c_void_p, c_void_p]
isl.isl_union_map_universe.restype = c_void_p
isl.isl_union_map_universe.argtypes = [c_void_p]
isl.isl_union_map_wrap.restype = c_void_p
isl.isl_union_map_wrap.argtypes = [c_void_p]
isl.isl_union_map_zip.restype = c_void_p
isl.isl_union_map_zip.argtypes = [c_void_p]
isl.isl_union_map_copy.restype = c_void_p
isl.isl_union_map_copy.argtypes = [c_void_p]
isl.isl_union_map_free.restype = c_void_p
isl.isl_union_map_free.argtypes = [c_void_p]
isl.isl_union_map_to_str.restype = POINTER(c_char)
isl.isl_union_map_to_str.argtypes = [c_void_p]
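The blocks of `restype`/`argtypes` assignments above are the standard ctypes idiom: without an explicit `restype`, ctypes assumes a C `int` return value, which would truncate the 64-bit pointers isl hands back as `c_void_p`. A minimal, self-contained sketch of the same idiom against libc (`strlen` is only a stand-in for an isl entry point; everything else is standard ctypes):

```python
import ctypes
import ctypes.util

# Load the C library and declare one function, mirroring the
# isl.<name>.restype / isl.<name>.argtypes pattern used above.
libc_demo = ctypes.CDLL(ctypes.util.find_library("c"))
libc_demo.strlen.restype = ctypes.c_size_t      # return type: size_t
libc_demo.strlen.argtypes = [ctypes.c_char_p]   # one const char * argument

def c_strlen(data):
    # data must be bytes; ctypes checks this against the argtypes above
    return libc_demo.strlen(data)
```

Without the `restype` line, the return value would be interpreted as a 32-bit `int`, which happens to work for small results but silently corrupts pointer-sized values such as the isl object handles declared above.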
class map(union_map):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is basic_map:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_map_from_basic_map(isl.isl_basic_map_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_map_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_map_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ptr = isl.isl_map_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.map("""%s""")' % s
else:
return 'isl.map("%s")' % s
def affine_hull(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_affine_hull(isl.isl_map_copy(arg0.ptr))
obj = basic_map(ctx=ctx, ptr=res)
return obj
def apply_domain(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).apply_domain(arg1)
ctx = arg0.ctx
res = isl.isl_map_apply_domain(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def apply_range(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).apply_range(arg1)
ctx = arg0.ctx
res = isl.isl_map_apply_range(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def as_pw_multi_aff(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_as_pw_multi_aff(isl.isl_map_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def bind_domain(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return union_map(arg0).bind_domain(arg1)
ctx = arg0.ctx
res = isl.isl_map_bind_domain(isl.isl_map_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def bind_range(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return union_map(arg0).bind_range(arg1)
ctx = arg0.ctx
res = isl.isl_map_bind_range(isl.isl_map_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def coalesce(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_coalesce(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def complement(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_complement(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def curry(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_curry(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def deltas(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_deltas(isl.isl_map_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def detect_equalities(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_detect_equalities(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def domain(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_domain(isl.isl_map_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def domain_factor_domain(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_domain_factor_domain(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def domain_factor_range(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_domain_factor_range(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def domain_product(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).domain_product(arg1)
ctx = arg0.ctx
res = isl.isl_map_domain_product(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def domain_tuple_dim(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_domain_tuple_dim(arg0.ptr)
if res < 0:
raise Error
return int(res)
def domain_tuple_id(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_get_domain_tuple_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_domain_tuple_id(arg0):
return arg0.domain_tuple_id()
@staticmethod
def empty(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_empty(isl.isl_space_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def eq_at(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_map_eq_at_multi_pw_aff(isl.isl_map_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def factor_domain(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_factor_domain(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def factor_range(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_factor_range(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def flatten(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_flatten(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def flatten_domain(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_flatten_domain(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def flatten_range(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_flatten_range(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def foreach_basic_map(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = basic_map(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_map_foreach_basic_map(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
raise Error
def gist(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).gist(arg1)
ctx = arg0.ctx
res = isl.isl_map_gist(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def gist_domain(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_map(arg0).gist_domain(arg1)
ctx = arg0.ctx
res = isl.isl_map_gist_domain(isl.isl_map_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def has_domain_tuple_id(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_has_domain_tuple_id(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def has_range_tuple_id(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_has_range_tuple_id(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def intersect(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).intersect(arg1)
ctx = arg0.ctx
res = isl.isl_map_intersect(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def intersect_domain(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_map(arg0).intersect_domain(arg1)
ctx = arg0.ctx
res = isl.isl_map_intersect_domain(isl.isl_map_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def intersect_domain_factor_domain(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).intersect_domain_factor_domain(arg1)
ctx = arg0.ctx
res = isl.isl_map_intersect_domain_factor_domain(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def intersect_domain_factor_range(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).intersect_domain_factor_range(arg1)
ctx = arg0.ctx
res = isl.isl_map_intersect_domain_factor_range(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def intersect_params(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_map(arg0).intersect_params(arg1)
ctx = arg0.ctx
res = isl.isl_map_intersect_params(isl.isl_map_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def intersect_range(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_map(arg0).intersect_range(arg1)
ctx = arg0.ctx
res = isl.isl_map_intersect_range(isl.isl_map_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def intersect_range_factor_domain(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).intersect_range_factor_domain(arg1)
ctx = arg0.ctx
res = isl.isl_map_intersect_range_factor_domain(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def intersect_range_factor_range(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).intersect_range_factor_range(arg1)
ctx = arg0.ctx
res = isl.isl_map_intersect_range_factor_range(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def is_bijective(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_is_bijective(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def is_disjoint(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).is_disjoint(arg1)
ctx = arg0.ctx
res = isl.isl_map_is_disjoint(arg0.ptr, arg1.ptr)
if res < 0:
raise Error
return bool(res)
def is_empty(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_is_empty(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def is_equal(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).is_equal(arg1)
ctx = arg0.ctx
res = isl.isl_map_is_equal(arg0.ptr, arg1.ptr)
if res < 0:
raise Error
return bool(res)
def is_injective(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_is_injective(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def is_single_valued(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_is_single_valued(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def is_strict_subset(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).is_strict_subset(arg1)
ctx = arg0.ctx
res = isl.isl_map_is_strict_subset(arg0.ptr, arg1.ptr)
if res < 0:
raise Error
return bool(res)
def is_subset(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).is_subset(arg1)
ctx = arg0.ctx
res = isl.isl_map_is_subset(arg0.ptr, arg1.ptr)
if res < 0:
raise Error
return bool(res)
def lex_ge_at(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_map_lex_ge_at_multi_pw_aff(isl.isl_map_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def lex_gt_at(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_map_lex_gt_at_multi_pw_aff(isl.isl_map_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def lex_le_at(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_map_lex_le_at_multi_pw_aff(isl.isl_map_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def lex_lt_at(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_map_lex_lt_at_multi_pw_aff(isl.isl_map_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def lexmax(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_lexmax(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def lexmax_pw_multi_aff(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_lexmax_pw_multi_aff(isl.isl_map_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def lexmin(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_lexmin(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def lexmin_pw_multi_aff(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_lexmin_pw_multi_aff(isl.isl_map_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def lower_bound(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_map_lower_bound_multi_pw_aff(isl.isl_map_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def max_multi_pw_aff(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_max_multi_pw_aff(isl.isl_map_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def min_multi_pw_aff(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_min_multi_pw_aff(isl.isl_map_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def polyhedral_hull(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_polyhedral_hull(isl.isl_map_copy(arg0.ptr))
obj = basic_map(ctx=ctx, ptr=res)
return obj
def preimage_domain(*args):
if len(args) == 2 and args[1].__class__ is multi_aff:
ctx = args[0].ctx
res = isl.isl_map_preimage_domain_multi_aff(isl.isl_map_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_map_preimage_domain_multi_pw_aff(isl.isl_map_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_map_preimage_domain_pw_multi_aff(isl.isl_map_copy(args[0].ptr), isl.isl_pw_multi_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def preimage_range(*args):
if len(args) == 2 and args[1].__class__ is multi_aff:
ctx = args[0].ctx
res = isl.isl_map_preimage_range_multi_aff(isl.isl_map_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_map_preimage_range_pw_multi_aff(isl.isl_map_copy(args[0].ptr), isl.isl_pw_multi_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def product(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).product(arg1)
ctx = arg0.ctx
res = isl.isl_map_product(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def project_out_all_params(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_project_out_all_params(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def range(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_range(isl.isl_map_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def range_factor_domain(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_range_factor_domain(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def range_factor_range(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_range_factor_range(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def range_lattice_tile(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_get_range_lattice_tile(arg0.ptr)
obj = fixed_box(ctx=ctx, ptr=res)
return obj
def get_range_lattice_tile(arg0):
return arg0.range_lattice_tile()
def range_product(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).range_product(arg1)
ctx = arg0.ctx
res = isl.isl_map_range_product(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def range_reverse(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_range_reverse(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def range_simple_fixed_box_hull(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_get_range_simple_fixed_box_hull(arg0.ptr)
obj = fixed_box(ctx=ctx, ptr=res)
return obj
def get_range_simple_fixed_box_hull(arg0):
return arg0.range_simple_fixed_box_hull()
def range_tuple_dim(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_range_tuple_dim(arg0.ptr)
if res < 0:
raise Error
return int(res)
def range_tuple_id(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_get_range_tuple_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_range_tuple_id(arg0):
return arg0.range_tuple_id()
def reverse(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_reverse(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def sample(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_sample(isl.isl_map_copy(arg0.ptr))
obj = basic_map(ctx=ctx, ptr=res)
return obj
def set_domain_tuple(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_map_set_domain_tuple_id(isl.isl_map_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def set_range_tuple(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_map_set_range_tuple_id(isl.isl_map_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def space(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
def subtract(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).subtract(arg1)
ctx = arg0.ctx
res = isl.isl_map_subtract(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def to_list(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_to_list(isl.isl_map_copy(arg0.ptr))
obj = map_list(ctx=ctx, ptr=res)
return obj
def to_union_map(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_to_union_map(isl.isl_map_copy(arg0.ptr))
obj = union_map(ctx=ctx, ptr=res)
return obj
def uncurry(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_uncurry(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def union(arg0, arg1):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_map(arg0).union(arg1)
ctx = arg0.ctx
res = isl.isl_map_union(isl.isl_map_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
@staticmethod
def universe(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_universe(isl.isl_space_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def unshifted_simple_hull(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_unshifted_simple_hull(isl.isl_map_copy(arg0.ptr))
obj = basic_map(ctx=ctx, ptr=res)
return obj
def upper_bound(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_map_upper_bound_multi_pw_aff(isl.isl_map_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = map(ctx=ctx, ptr=res)
return obj
raise Error
def wrap(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_wrap(isl.isl_map_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def zip(arg0):
try:
if not arg0.__class__ is map:
arg0 = map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_zip(isl.isl_map_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
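The `foreach_basic_map` method above shows the generated callback idiom: a Python callable is wrapped in a `CFUNCTYPE` so the C side can invoke it, any exception it raises is stashed in `exc_info` and re-raised after the C call returns, and failure is signalled to C with a negative return value. A self-contained sketch of the same wrapping, using libc's `qsort` in place of an isl iterator (the names here are illustrative, not part of these bindings):

```python
import ctypes
import ctypes.util

libc_qsort_demo = ctypes.CDLL(ctypes.util.find_library("c"))

# Comparator type: int (*)(const void *, const void *), analogous to the
# CFUNCTYPE(c_int, c_void_p, c_void_p) used by foreach_basic_map above.
CMPFUNC = ctypes.CFUNCTYPE(ctypes.c_int, ctypes.c_void_p, ctypes.c_void_p)

libc_qsort_demo.qsort.restype = None
libc_qsort_demo.qsort.argtypes = [
    ctypes.POINTER(ctypes.c_int), ctypes.c_size_t, ctypes.c_size_t, CMPFUNC,
]

def sort_ints(values):
    arr = (ctypes.c_int * len(values))(*values)
    def py_cmp(a, b):
        # Dereference the two element pointers handed over by C.
        ia = ctypes.cast(a, ctypes.POINTER(ctypes.c_int)).contents.value
        ib = ctypes.cast(b, ctypes.POINTER(ctypes.c_int)).contents.value
        return (ia > ib) - (ia < ib)
    cb = CMPFUNC(py_cmp)   # keep a reference alive for the call's duration
    libc_qsort_demo.qsort(arr, len(values), ctypes.sizeof(ctypes.c_int), cb)
    return list(arr)
```

Exceptions must not propagate across the C frame, which is why the generated code catches `BaseException` inside the callback and returns `-1` instead of letting it escape.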
isl.isl_map_from_basic_map.restype = c_void_p
isl.isl_map_from_basic_map.argtypes = [c_void_p]
isl.isl_map_read_from_str.restype = c_void_p
isl.isl_map_read_from_str.argtypes = [Context, c_char_p]
isl.isl_map_affine_hull.restype = c_void_p
isl.isl_map_affine_hull.argtypes = [c_void_p]
isl.isl_map_apply_domain.restype = c_void_p
isl.isl_map_apply_domain.argtypes = [c_void_p, c_void_p]
isl.isl_map_apply_range.restype = c_void_p
isl.isl_map_apply_range.argtypes = [c_void_p, c_void_p]
isl.isl_map_as_pw_multi_aff.restype = c_void_p
isl.isl_map_as_pw_multi_aff.argtypes = [c_void_p]
isl.isl_map_bind_domain.restype = c_void_p
isl.isl_map_bind_domain.argtypes = [c_void_p, c_void_p]
isl.isl_map_bind_range.restype = c_void_p
isl.isl_map_bind_range.argtypes = [c_void_p, c_void_p]
isl.isl_map_coalesce.restype = c_void_p
isl.isl_map_coalesce.argtypes = [c_void_p]
isl.isl_map_complement.restype = c_void_p
isl.isl_map_complement.argtypes = [c_void_p]
isl.isl_map_curry.restype = c_void_p
isl.isl_map_curry.argtypes = [c_void_p]
isl.isl_map_deltas.restype = c_void_p
isl.isl_map_deltas.argtypes = [c_void_p]
isl.isl_map_detect_equalities.restype = c_void_p
isl.isl_map_detect_equalities.argtypes = [c_void_p]
isl.isl_map_domain.restype = c_void_p
isl.isl_map_domain.argtypes = [c_void_p]
isl.isl_map_domain_factor_domain.restype = c_void_p
isl.isl_map_domain_factor_domain.argtypes = [c_void_p]
isl.isl_map_domain_factor_range.restype = c_void_p
isl.isl_map_domain_factor_range.argtypes = [c_void_p]
isl.isl_map_domain_product.restype = c_void_p
isl.isl_map_domain_product.argtypes = [c_void_p, c_void_p]
isl.isl_map_domain_tuple_dim.argtypes = [c_void_p]
isl.isl_map_get_domain_tuple_id.restype = c_void_p
isl.isl_map_get_domain_tuple_id.argtypes = [c_void_p]
isl.isl_map_empty.restype = c_void_p
isl.isl_map_empty.argtypes = [c_void_p]
isl.isl_map_eq_at_multi_pw_aff.restype = c_void_p
isl.isl_map_eq_at_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_factor_domain.restype = c_void_p
isl.isl_map_factor_domain.argtypes = [c_void_p]
isl.isl_map_factor_range.restype = c_void_p
isl.isl_map_factor_range.argtypes = [c_void_p]
isl.isl_map_flatten.restype = c_void_p
isl.isl_map_flatten.argtypes = [c_void_p]
isl.isl_map_flatten_domain.restype = c_void_p
isl.isl_map_flatten_domain.argtypes = [c_void_p]
isl.isl_map_flatten_range.restype = c_void_p
isl.isl_map_flatten_range.argtypes = [c_void_p]
isl.isl_map_foreach_basic_map.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_map_gist.restype = c_void_p
isl.isl_map_gist.argtypes = [c_void_p, c_void_p]
isl.isl_map_gist_domain.restype = c_void_p
isl.isl_map_gist_domain.argtypes = [c_void_p, c_void_p]
isl.isl_map_has_domain_tuple_id.argtypes = [c_void_p]
isl.isl_map_has_range_tuple_id.argtypes = [c_void_p]
isl.isl_map_intersect.restype = c_void_p
isl.isl_map_intersect.argtypes = [c_void_p, c_void_p]
isl.isl_map_intersect_domain.restype = c_void_p
isl.isl_map_intersect_domain.argtypes = [c_void_p, c_void_p]
isl.isl_map_intersect_domain_factor_domain.restype = c_void_p
isl.isl_map_intersect_domain_factor_domain.argtypes = [c_void_p, c_void_p]
isl.isl_map_intersect_domain_factor_range.restype = c_void_p
isl.isl_map_intersect_domain_factor_range.argtypes = [c_void_p, c_void_p]
isl.isl_map_intersect_params.restype = c_void_p
isl.isl_map_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_map_intersect_range.restype = c_void_p
isl.isl_map_intersect_range.argtypes = [c_void_p, c_void_p]
isl.isl_map_intersect_range_factor_domain.restype = c_void_p
isl.isl_map_intersect_range_factor_domain.argtypes = [c_void_p, c_void_p]
isl.isl_map_intersect_range_factor_range.restype = c_void_p
isl.isl_map_intersect_range_factor_range.argtypes = [c_void_p, c_void_p]
isl.isl_map_is_bijective.argtypes = [c_void_p]
isl.isl_map_is_disjoint.argtypes = [c_void_p, c_void_p]
isl.isl_map_is_empty.argtypes = [c_void_p]
isl.isl_map_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_map_is_injective.argtypes = [c_void_p]
isl.isl_map_is_single_valued.argtypes = [c_void_p]
isl.isl_map_is_strict_subset.argtypes = [c_void_p, c_void_p]
isl.isl_map_is_subset.argtypes = [c_void_p, c_void_p]
isl.isl_map_lex_ge_at_multi_pw_aff.restype = c_void_p
isl.isl_map_lex_ge_at_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_lex_gt_at_multi_pw_aff.restype = c_void_p
isl.isl_map_lex_gt_at_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_lex_le_at_multi_pw_aff.restype = c_void_p
isl.isl_map_lex_le_at_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_lex_lt_at_multi_pw_aff.restype = c_void_p
isl.isl_map_lex_lt_at_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_lexmax.restype = c_void_p
isl.isl_map_lexmax.argtypes = [c_void_p]
isl.isl_map_lexmax_pw_multi_aff.restype = c_void_p
isl.isl_map_lexmax_pw_multi_aff.argtypes = [c_void_p]
isl.isl_map_lexmin.restype = c_void_p
isl.isl_map_lexmin.argtypes = [c_void_p]
isl.isl_map_lexmin_pw_multi_aff.restype = c_void_p
isl.isl_map_lexmin_pw_multi_aff.argtypes = [c_void_p]
isl.isl_map_lower_bound_multi_pw_aff.restype = c_void_p
isl.isl_map_lower_bound_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_max_multi_pw_aff.restype = c_void_p
isl.isl_map_max_multi_pw_aff.argtypes = [c_void_p]
isl.isl_map_min_multi_pw_aff.restype = c_void_p
isl.isl_map_min_multi_pw_aff.argtypes = [c_void_p]
isl.isl_map_polyhedral_hull.restype = c_void_p
isl.isl_map_polyhedral_hull.argtypes = [c_void_p]
isl.isl_map_preimage_domain_multi_aff.restype = c_void_p
isl.isl_map_preimage_domain_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_preimage_domain_multi_pw_aff.restype = c_void_p
isl.isl_map_preimage_domain_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_preimage_domain_pw_multi_aff.restype = c_void_p
isl.isl_map_preimage_domain_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_preimage_range_multi_aff.restype = c_void_p
isl.isl_map_preimage_range_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_preimage_range_pw_multi_aff.restype = c_void_p
isl.isl_map_preimage_range_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_product.restype = c_void_p
isl.isl_map_product.argtypes = [c_void_p, c_void_p]
isl.isl_map_project_out_all_params.restype = c_void_p
isl.isl_map_project_out_all_params.argtypes = [c_void_p]
isl.isl_map_range.restype = c_void_p
isl.isl_map_range.argtypes = [c_void_p]
isl.isl_map_range_factor_domain.restype = c_void_p
isl.isl_map_range_factor_domain.argtypes = [c_void_p]
isl.isl_map_range_factor_range.restype = c_void_p
isl.isl_map_range_factor_range.argtypes = [c_void_p]
isl.isl_map_get_range_lattice_tile.restype = c_void_p
isl.isl_map_get_range_lattice_tile.argtypes = [c_void_p]
isl.isl_map_range_product.restype = c_void_p
isl.isl_map_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_map_range_reverse.restype = c_void_p
isl.isl_map_range_reverse.argtypes = [c_void_p]
isl.isl_map_get_range_simple_fixed_box_hull.restype = c_void_p
isl.isl_map_get_range_simple_fixed_box_hull.argtypes = [c_void_p]
isl.isl_map_range_tuple_dim.argtypes = [c_void_p]
isl.isl_map_get_range_tuple_id.restype = c_void_p
isl.isl_map_get_range_tuple_id.argtypes = [c_void_p]
isl.isl_map_reverse.restype = c_void_p
isl.isl_map_reverse.argtypes = [c_void_p]
isl.isl_map_sample.restype = c_void_p
isl.isl_map_sample.argtypes = [c_void_p]
isl.isl_map_set_domain_tuple_id.restype = c_void_p
isl.isl_map_set_domain_tuple_id.argtypes = [c_void_p, c_void_p]
isl.isl_map_set_range_tuple_id.restype = c_void_p
isl.isl_map_set_range_tuple_id.argtypes = [c_void_p, c_void_p]
isl.isl_map_get_space.restype = c_void_p
isl.isl_map_get_space.argtypes = [c_void_p]
isl.isl_map_subtract.restype = c_void_p
isl.isl_map_subtract.argtypes = [c_void_p, c_void_p]
isl.isl_map_to_list.restype = c_void_p
isl.isl_map_to_list.argtypes = [c_void_p]
isl.isl_map_to_union_map.restype = c_void_p
isl.isl_map_to_union_map.argtypes = [c_void_p]
isl.isl_map_uncurry.restype = c_void_p
isl.isl_map_uncurry.argtypes = [c_void_p]
isl.isl_map_union.restype = c_void_p
isl.isl_map_union.argtypes = [c_void_p, c_void_p]
isl.isl_map_universe.restype = c_void_p
isl.isl_map_universe.argtypes = [c_void_p]
isl.isl_map_unshifted_simple_hull.restype = c_void_p
isl.isl_map_unshifted_simple_hull.argtypes = [c_void_p]
isl.isl_map_upper_bound_multi_pw_aff.restype = c_void_p
isl.isl_map_upper_bound_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_map_wrap.restype = c_void_p
isl.isl_map_wrap.argtypes = [c_void_p]
isl.isl_map_zip.restype = c_void_p
isl.isl_map_zip.argtypes = [c_void_p]
isl.isl_map_copy.restype = c_void_p
isl.isl_map_copy.argtypes = [c_void_p]
isl.isl_map_free.restype = c_void_p
isl.isl_map_free.argtypes = [c_void_p]
isl.isl_map_to_str.restype = POINTER(c_char)
isl.isl_map_to_str.argtypes = [c_void_p]
class basic_map(map):
    def __init__(self, *args, **keywords):
        if "ptr" in keywords:
            self.ctx = keywords["ctx"]
            self.ptr = keywords["ptr"]
            return
        if len(args) == 1 and type(args[0]) == str:
            self.ctx = Context.getDefaultInstance()
            self.ptr = isl.isl_basic_map_read_from_str(self.ctx, args[0].encode('ascii'))
            return
        raise Error
    def __del__(self):
        if hasattr(self, 'ptr'):
            isl.isl_basic_map_free(self.ptr)
    def __str__(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ptr = isl.isl_basic_map_to_str(arg0.ptr)
        res = cast(ptr, c_char_p).value.decode('ascii')
        libc.free(ptr)
        return res
    def __repr__(self):
        s = str(self)
        if '"' in s:
            return 'isl.basic_map("""%s""")' % s
        else:
            return 'isl.basic_map("%s")' % s
    def affine_hull(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_affine_hull(isl.isl_basic_map_copy(arg0.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def apply_domain(arg0, arg1):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is basic_map:
                arg1 = basic_map(arg1)
        except:
            return map(arg0).apply_domain(arg1)
        ctx = arg0.ctx
        res = isl.isl_basic_map_apply_domain(isl.isl_basic_map_copy(arg0.ptr), isl.isl_basic_map_copy(arg1.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def apply_range(arg0, arg1):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is basic_map:
                arg1 = basic_map(arg1)
        except:
            return map(arg0).apply_range(arg1)
        ctx = arg0.ctx
        res = isl.isl_basic_map_apply_range(isl.isl_basic_map_copy(arg0.ptr), isl.isl_basic_map_copy(arg1.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def deltas(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_deltas(isl.isl_basic_map_copy(arg0.ptr))
        obj = basic_set(ctx=ctx, ptr=res)
        return obj
    def detect_equalities(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_detect_equalities(isl.isl_basic_map_copy(arg0.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def flatten(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_flatten(isl.isl_basic_map_copy(arg0.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def flatten_domain(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_flatten_domain(isl.isl_basic_map_copy(arg0.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def flatten_range(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_flatten_range(isl.isl_basic_map_copy(arg0.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def gist(arg0, arg1):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is basic_map:
                arg1 = basic_map(arg1)
        except:
            return map(arg0).gist(arg1)
        ctx = arg0.ctx
        res = isl.isl_basic_map_gist(isl.isl_basic_map_copy(arg0.ptr), isl.isl_basic_map_copy(arg1.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def intersect(arg0, arg1):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is basic_map:
                arg1 = basic_map(arg1)
        except:
            return map(arg0).intersect(arg1)
        ctx = arg0.ctx
        res = isl.isl_basic_map_intersect(isl.isl_basic_map_copy(arg0.ptr), isl.isl_basic_map_copy(arg1.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def intersect_domain(arg0, arg1):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is basic_set:
                arg1 = basic_set(arg1)
        except:
            return map(arg0).intersect_domain(arg1)
        ctx = arg0.ctx
        res = isl.isl_basic_map_intersect_domain(isl.isl_basic_map_copy(arg0.ptr), isl.isl_basic_set_copy(arg1.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def intersect_range(arg0, arg1):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is basic_set:
                arg1 = basic_set(arg1)
        except:
            return map(arg0).intersect_range(arg1)
        ctx = arg0.ctx
        res = isl.isl_basic_map_intersect_range(isl.isl_basic_map_copy(arg0.ptr), isl.isl_basic_set_copy(arg1.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def is_empty(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_is_empty(arg0.ptr)
        if res < 0:
            raise Error
        return bool(res)
    def is_equal(arg0, arg1):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is basic_map:
                arg1 = basic_map(arg1)
        except:
            return map(arg0).is_equal(arg1)
        ctx = arg0.ctx
        res = isl.isl_basic_map_is_equal(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
        return bool(res)
    def is_subset(arg0, arg1):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is basic_map:
                arg1 = basic_map(arg1)
        except:
            return map(arg0).is_subset(arg1)
        ctx = arg0.ctx
        res = isl.isl_basic_map_is_subset(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
        return bool(res)
    def lexmax(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_lexmax(isl.isl_basic_map_copy(arg0.ptr))
        obj = map(ctx=ctx, ptr=res)
        return obj
    def lexmin(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_lexmin(isl.isl_basic_map_copy(arg0.ptr))
        obj = map(ctx=ctx, ptr=res)
        return obj
    def reverse(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_reverse(isl.isl_basic_map_copy(arg0.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def sample(arg0):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_basic_map_sample(isl.isl_basic_map_copy(arg0.ptr))
        obj = basic_map(ctx=ctx, ptr=res)
        return obj
    def union(arg0, arg1):
        try:
            if not arg0.__class__ is basic_map:
                arg0 = basic_map(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is basic_map:
                arg1 = basic_map(arg1)
        except:
            return map(arg0).union(arg1)
        ctx = arg0.ctx
        res = isl.isl_basic_map_union(isl.isl_basic_map_copy(arg0.ptr), isl.isl_basic_map_copy(arg1.ptr))
        obj = map(ctx=ctx, ptr=res)
        return obj
isl.isl_basic_map_read_from_str.restype = c_void_p
isl.isl_basic_map_read_from_str.argtypes = [Context, c_char_p]
isl.isl_basic_map_affine_hull.restype = c_void_p
isl.isl_basic_map_affine_hull.argtypes = [c_void_p]
isl.isl_basic_map_apply_domain.restype = c_void_p
isl.isl_basic_map_apply_domain.argtypes = [c_void_p, c_void_p]
isl.isl_basic_map_apply_range.restype = c_void_p
isl.isl_basic_map_apply_range.argtypes = [c_void_p, c_void_p]
isl.isl_basic_map_deltas.restype = c_void_p
isl.isl_basic_map_deltas.argtypes = [c_void_p]
isl.isl_basic_map_detect_equalities.restype = c_void_p
isl.isl_basic_map_detect_equalities.argtypes = [c_void_p]
isl.isl_basic_map_flatten.restype = c_void_p
isl.isl_basic_map_flatten.argtypes = [c_void_p]
isl.isl_basic_map_flatten_domain.restype = c_void_p
isl.isl_basic_map_flatten_domain.argtypes = [c_void_p]
isl.isl_basic_map_flatten_range.restype = c_void_p
isl.isl_basic_map_flatten_range.argtypes = [c_void_p]
isl.isl_basic_map_gist.restype = c_void_p
isl.isl_basic_map_gist.argtypes = [c_void_p, c_void_p]
isl.isl_basic_map_intersect.restype = c_void_p
isl.isl_basic_map_intersect.argtypes = [c_void_p, c_void_p]
isl.isl_basic_map_intersect_domain.restype = c_void_p
isl.isl_basic_map_intersect_domain.argtypes = [c_void_p, c_void_p]
isl.isl_basic_map_intersect_range.restype = c_void_p
isl.isl_basic_map_intersect_range.argtypes = [c_void_p, c_void_p]
isl.isl_basic_map_is_empty.argtypes = [c_void_p]
isl.isl_basic_map_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_basic_map_is_subset.argtypes = [c_void_p, c_void_p]
isl.isl_basic_map_lexmax.restype = c_void_p
isl.isl_basic_map_lexmax.argtypes = [c_void_p]
isl.isl_basic_map_lexmin.restype = c_void_p
isl.isl_basic_map_lexmin.argtypes = [c_void_p]
isl.isl_basic_map_reverse.restype = c_void_p
isl.isl_basic_map_reverse.argtypes = [c_void_p]
isl.isl_basic_map_sample.restype = c_void_p
isl.isl_basic_map_sample.argtypes = [c_void_p]
isl.isl_basic_map_union.restype = c_void_p
isl.isl_basic_map_union.argtypes = [c_void_p, c_void_p]
isl.isl_basic_map_copy.restype = c_void_p
isl.isl_basic_map_copy.argtypes = [c_void_p]
isl.isl_basic_map_free.restype = c_void_p
isl.isl_basic_map_free.argtypes = [c_void_p]
isl.isl_basic_map_to_str.restype = POINTER(c_char)
isl.isl_basic_map_to_str.argtypes = [c_void_p]
class union_set(object):
    def __init__(self, *args, **keywords):
        if "ptr" in keywords:
            self.ctx = keywords["ctx"]
            self.ptr = keywords["ptr"]
            return
        if len(args) == 1 and args[0].__class__ is basic_set:
            self.ctx = Context.getDefaultInstance()
            self.ptr = isl.isl_union_set_from_basic_set(isl.isl_basic_set_copy(args[0].ptr))
            return
        if len(args) == 1 and args[0].__class__ is point:
            self.ctx = Context.getDefaultInstance()
            self.ptr = isl.isl_union_set_from_point(isl.isl_point_copy(args[0].ptr))
            return
        if len(args) == 1 and args[0].__class__ is set:
            self.ctx = Context.getDefaultInstance()
            self.ptr = isl.isl_union_set_from_set(isl.isl_set_copy(args[0].ptr))
            return
        if len(args) == 1 and type(args[0]) == str:
            self.ctx = Context.getDefaultInstance()
            self.ptr = isl.isl_union_set_read_from_str(self.ctx, args[0].encode('ascii'))
            return
        raise Error
    def __del__(self):
        if hasattr(self, 'ptr'):
            isl.isl_union_set_free(self.ptr)
    def __str__(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ptr = isl.isl_union_set_to_str(arg0.ptr)
        res = cast(ptr, c_char_p).value.decode('ascii')
        libc.free(ptr)
        return res
    def __repr__(self):
        s = str(self)
        if '"' in s:
            return 'isl.union_set("""%s""")' % s
        else:
            return 'isl.union_set("%s")' % s
    def affine_hull(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_affine_hull(isl.isl_union_set_copy(arg0.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def apply(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is union_map:
                arg1 = union_map(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_apply(isl.isl_union_set_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def as_set(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_as_set(isl.isl_union_set_copy(arg0.ptr))
        obj = set(ctx=ctx, ptr=res)
        return obj
    def coalesce(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_coalesce(isl.isl_union_set_copy(arg0.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def compute_divs(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_compute_divs(isl.isl_union_set_copy(arg0.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def detect_equalities(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_detect_equalities(isl.isl_union_set_copy(arg0.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    @staticmethod
    def empty(*args):
        if len(args) == 0:
            ctx = Context.getDefaultInstance()
            res = isl.isl_union_set_empty_ctx(ctx)
            obj = union_set(ctx=ctx, ptr=res)
            return obj
        raise Error
    def every_set(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        exc_info = [None]
        fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
        def cb_func(cb_arg0, cb_arg1):
            cb_arg0 = set(ctx=arg0.ctx, ptr=isl.isl_set_copy(cb_arg0))
            try:
                res = arg1(cb_arg0)
            except BaseException as e:
                exc_info[0] = e
                return -1
            return 1 if res else 0
        cb = fn(cb_func)
        ctx = arg0.ctx
        res = isl.isl_union_set_every_set(arg0.ptr, cb, None)
        if exc_info[0] is not None:
            raise exc_info[0]
        if res < 0:
            raise Error
        return bool(res)
    def extract_set(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is space:
                arg1 = space(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_extract_set(arg0.ptr, isl.isl_space_copy(arg1.ptr))
        obj = set(ctx=ctx, ptr=res)
        return obj
    def foreach_point(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        exc_info = [None]
        fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
        def cb_func(cb_arg0, cb_arg1):
            cb_arg0 = point(ctx=arg0.ctx, ptr=(cb_arg0))
            try:
                arg1(cb_arg0)
            except BaseException as e:
                exc_info[0] = e
                return -1
            return 0
        cb = fn(cb_func)
        ctx = arg0.ctx
        res = isl.isl_union_set_foreach_point(arg0.ptr, cb, None)
        if exc_info[0] is not None:
            raise exc_info[0]
        if res < 0:
            raise Error
    def foreach_set(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        exc_info = [None]
        fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
        def cb_func(cb_arg0, cb_arg1):
            cb_arg0 = set(ctx=arg0.ctx, ptr=(cb_arg0))
            try:
                arg1(cb_arg0)
            except BaseException as e:
                exc_info[0] = e
                return -1
            return 0
        cb = fn(cb_func)
        ctx = arg0.ctx
        res = isl.isl_union_set_foreach_set(arg0.ptr, cb, None)
        if exc_info[0] is not None:
            raise exc_info[0]
        if res < 0:
            raise Error
    def gist(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is union_set:
                arg1 = union_set(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_gist(isl.isl_union_set_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def gist_params(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is set:
                arg1 = set(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_gist_params(isl.isl_union_set_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def identity(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_identity(isl.isl_union_set_copy(arg0.ptr))
        obj = union_map(ctx=ctx, ptr=res)
        return obj
    def intersect(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is union_set:
                arg1 = union_set(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_intersect(isl.isl_union_set_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def intersect_params(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is set:
                arg1 = set(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_intersect_params(isl.isl_union_set_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def is_disjoint(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is union_set:
                arg1 = union_set(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_is_disjoint(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
        return bool(res)
    def is_empty(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_is_empty(arg0.ptr)
        if res < 0:
            raise Error
        return bool(res)
    def is_equal(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is union_set:
                arg1 = union_set(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_is_equal(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
        return bool(res)
    def is_strict_subset(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is union_set:
                arg1 = union_set(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_is_strict_subset(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
        return bool(res)
    def is_subset(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is union_set:
                arg1 = union_set(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_is_subset(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
        return bool(res)
    def isa_set(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_isa_set(arg0.ptr)
        if res < 0:
            raise Error
        return bool(res)
    def lexmax(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_lexmax(isl.isl_union_set_copy(arg0.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def lexmin(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_lexmin(isl.isl_union_set_copy(arg0.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def polyhedral_hull(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_polyhedral_hull(isl.isl_union_set_copy(arg0.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def preimage(*args):
        if len(args) == 2 and args[1].__class__ is multi_aff:
            ctx = args[0].ctx
            res = isl.isl_union_set_preimage_multi_aff(isl.isl_union_set_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
            obj = union_set(ctx=ctx, ptr=res)
            return obj
        if len(args) == 2 and args[1].__class__ is pw_multi_aff:
            ctx = args[0].ctx
            res = isl.isl_union_set_preimage_pw_multi_aff(isl.isl_union_set_copy(args[0].ptr), isl.isl_pw_multi_aff_copy(args[1].ptr))
            obj = union_set(ctx=ctx, ptr=res)
            return obj
        if len(args) == 2 and args[1].__class__ is union_pw_multi_aff:
            ctx = args[0].ctx
            res = isl.isl_union_set_preimage_union_pw_multi_aff(isl.isl_union_set_copy(args[0].ptr), isl.isl_union_pw_multi_aff_copy(args[1].ptr))
            obj = union_set(ctx=ctx, ptr=res)
            return obj
        raise Error
    def sample_point(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_sample_point(isl.isl_union_set_copy(arg0.ptr))
        obj = point(ctx=ctx, ptr=res)
        return obj
    def set_list(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_get_set_list(arg0.ptr)
        obj = set_list(ctx=ctx, ptr=res)
        return obj
    def get_set_list(arg0):
        return arg0.set_list()
    def space(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_get_space(arg0.ptr)
        obj = space(ctx=ctx, ptr=res)
        return obj
    def get_space(arg0):
        return arg0.space()
    def subtract(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is union_set:
                arg1 = union_set(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_subtract(isl.isl_union_set_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def to_list(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_to_list(isl.isl_union_set_copy(arg0.ptr))
        obj = union_set_list(ctx=ctx, ptr=res)
        return obj
    def union(arg0, arg1):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        try:
            if not arg1.__class__ is union_set:
                arg1 = union_set(arg1)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_union(isl.isl_union_set_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def universe(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_universe(isl.isl_union_set_copy(arg0.ptr))
        obj = union_set(ctx=ctx, ptr=res)
        return obj
    def unwrap(arg0):
        try:
            if not arg0.__class__ is union_set:
                arg0 = union_set(arg0)
        except:
            raise
        ctx = arg0.ctx
        res = isl.isl_union_set_unwrap(isl.isl_union_set_copy(arg0.ptr))
        obj = union_map(ctx=ctx, ptr=res)
        return obj
isl.isl_union_set_from_basic_set.restype = c_void_p
isl.isl_union_set_from_basic_set.argtypes = [c_void_p]
isl.isl_union_set_from_point.restype = c_void_p
isl.isl_union_set_from_point.argtypes = [c_void_p]
isl.isl_union_set_from_set.restype = c_void_p
isl.isl_union_set_from_set.argtypes = [c_void_p]
isl.isl_union_set_read_from_str.restype = c_void_p
isl.isl_union_set_read_from_str.argtypes = [Context, c_char_p]
isl.isl_union_set_affine_hull.restype = c_void_p
isl.isl_union_set_affine_hull.argtypes = [c_void_p]
isl.isl_union_set_apply.restype = c_void_p
isl.isl_union_set_apply.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_as_set.restype = c_void_p
isl.isl_union_set_as_set.argtypes = [c_void_p]
isl.isl_union_set_coalesce.restype = c_void_p
isl.isl_union_set_coalesce.argtypes = [c_void_p]
isl.isl_union_set_compute_divs.restype = c_void_p
isl.isl_union_set_compute_divs.argtypes = [c_void_p]
isl.isl_union_set_detect_equalities.restype = c_void_p
isl.isl_union_set_detect_equalities.argtypes = [c_void_p]
isl.isl_union_set_empty_ctx.restype = c_void_p
isl.isl_union_set_empty_ctx.argtypes = [Context]
isl.isl_union_set_every_set.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_union_set_extract_set.restype = c_void_p
isl.isl_union_set_extract_set.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_foreach_point.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_union_set_foreach_set.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_union_set_gist.restype = c_void_p
isl.isl_union_set_gist.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_gist_params.restype = c_void_p
isl.isl_union_set_gist_params.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_identity.restype = c_void_p
isl.isl_union_set_identity.argtypes = [c_void_p]
isl.isl_union_set_intersect.restype = c_void_p
isl.isl_union_set_intersect.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_intersect_params.restype = c_void_p
isl.isl_union_set_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_is_disjoint.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_is_empty.argtypes = [c_void_p]
isl.isl_union_set_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_is_strict_subset.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_is_subset.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_isa_set.argtypes = [c_void_p]
isl.isl_union_set_lexmax.restype = c_void_p
isl.isl_union_set_lexmax.argtypes = [c_void_p]
isl.isl_union_set_lexmin.restype = c_void_p
isl.isl_union_set_lexmin.argtypes = [c_void_p]
isl.isl_union_set_polyhedral_hull.restype = c_void_p
isl.isl_union_set_polyhedral_hull.argtypes = [c_void_p]
isl.isl_union_set_preimage_multi_aff.restype = c_void_p
isl.isl_union_set_preimage_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_preimage_pw_multi_aff.restype = c_void_p
isl.isl_union_set_preimage_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_preimage_union_pw_multi_aff.restype = c_void_p
isl.isl_union_set_preimage_union_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_sample_point.restype = c_void_p
isl.isl_union_set_sample_point.argtypes = [c_void_p]
isl.isl_union_set_get_set_list.restype = c_void_p
isl.isl_union_set_get_set_list.argtypes = [c_void_p]
isl.isl_union_set_get_space.restype = c_void_p
isl.isl_union_set_get_space.argtypes = [c_void_p]
isl.isl_union_set_subtract.restype = c_void_p
isl.isl_union_set_subtract.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_to_list.restype = c_void_p
isl.isl_union_set_to_list.argtypes = [c_void_p]
isl.isl_union_set_union.restype = c_void_p
isl.isl_union_set_union.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_universe.restype = c_void_p
isl.isl_union_set_universe.argtypes = [c_void_p]
isl.isl_union_set_unwrap.restype = c_void_p
isl.isl_union_set_unwrap.argtypes = [c_void_p]
isl.isl_union_set_copy.restype = c_void_p
isl.isl_union_set_copy.argtypes = [c_void_p]
isl.isl_union_set_free.restype = c_void_p
isl.isl_union_set_free.argtypes = [c_void_p]
isl.isl_union_set_to_str.restype = POINTER(c_char)
isl.isl_union_set_to_str.argtypes = [c_void_p]
class set(union_set):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is basic_set:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_set_from_basic_set(isl.isl_basic_set_copy(args[0].ptr))
return
if len(args) == 1 and args[0].__class__ is point:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_set_from_point(isl.isl_point_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_set_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_set_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ptr = isl.isl_set_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.set("""%s""")' % s
else:
return 'isl.set("%s")' % s
def affine_hull(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_affine_hull(isl.isl_set_copy(arg0.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def apply(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
return union_set(arg0).apply(arg1)
ctx = arg0.ctx
res = isl.isl_set_apply(isl.isl_set_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def as_pw_multi_aff(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_as_pw_multi_aff(isl.isl_set_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def bind(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return union_set(arg0).bind(arg1)
ctx = arg0.ctx
res = isl.isl_set_bind(isl.isl_set_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def coalesce(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_coalesce(isl.isl_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def complement(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_complement(isl.isl_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def detect_equalities(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_detect_equalities(isl.isl_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def dim_max_val(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_dim_max_val(isl.isl_set_copy(arg0.ptr), arg1)
obj = val(ctx=ctx, ptr=res)
return obj
def dim_min_val(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_dim_min_val(isl.isl_set_copy(arg0.ptr), arg1)
obj = val(ctx=ctx, ptr=res)
return obj
@staticmethod
def empty(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_empty(isl.isl_space_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def flatten(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_flatten(isl.isl_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def foreach_basic_set(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = basic_set(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_set_foreach_basic_set(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
        if res < 0:
            raise Error
def foreach_point(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = point(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_set_foreach_point(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
        if res < 0:
            raise Error
def gist(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_set(arg0).gist(arg1)
ctx = arg0.ctx
res = isl.isl_set_gist(isl.isl_set_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def identity(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_identity(isl.isl_set_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def indicator_function(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_indicator_function(isl.isl_set_copy(arg0.ptr))
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def insert_domain(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is space:
arg1 = space(arg1)
except:
return union_set(arg0).insert_domain(arg1)
ctx = arg0.ctx
res = isl.isl_set_insert_domain(isl.isl_set_copy(arg0.ptr), isl.isl_space_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def intersect(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_set(arg0).intersect(arg1)
ctx = arg0.ctx
res = isl.isl_set_intersect(isl.isl_set_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def intersect_params(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_set(arg0).intersect_params(arg1)
ctx = arg0.ctx
res = isl.isl_set_intersect_params(isl.isl_set_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def involves_locals(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_involves_locals(arg0.ptr)
        if res < 0:
            raise Error
return bool(res)
def is_disjoint(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_set(arg0).is_disjoint(arg1)
ctx = arg0.ctx
res = isl.isl_set_is_disjoint(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
return bool(res)
def is_empty(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_is_empty(arg0.ptr)
        if res < 0:
            raise Error
return bool(res)
def is_equal(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_set(arg0).is_equal(arg1)
ctx = arg0.ctx
res = isl.isl_set_is_equal(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
return bool(res)
def is_singleton(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_is_singleton(arg0.ptr)
        if res < 0:
            raise Error
return bool(res)
def is_strict_subset(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_set(arg0).is_strict_subset(arg1)
ctx = arg0.ctx
res = isl.isl_set_is_strict_subset(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
return bool(res)
def is_subset(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_set(arg0).is_subset(arg1)
ctx = arg0.ctx
res = isl.isl_set_is_subset(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
return bool(res)
def is_wrapping(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_is_wrapping(arg0.ptr)
        if res < 0:
            raise Error
return bool(res)
def lexmax(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_lexmax(isl.isl_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def lexmax_pw_multi_aff(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_lexmax_pw_multi_aff(isl.isl_set_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def lexmin(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_lexmin(isl.isl_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def lexmin_pw_multi_aff(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_lexmin_pw_multi_aff(isl.isl_set_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def lower_bound(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_set_lower_bound_multi_pw_aff(isl.isl_set_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = set(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_set_lower_bound_multi_val(isl.isl_set_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = set(ctx=ctx, ptr=res)
return obj
raise Error
def max_multi_pw_aff(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_max_multi_pw_aff(isl.isl_set_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def max_val(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return union_set(arg0).max_val(arg1)
ctx = arg0.ctx
res = isl.isl_set_max_val(arg0.ptr, arg1.ptr)
obj = val(ctx=ctx, ptr=res)
return obj
def min_multi_pw_aff(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_min_multi_pw_aff(isl.isl_set_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def min_val(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is aff:
arg1 = aff(arg1)
except:
return union_set(arg0).min_val(arg1)
ctx = arg0.ctx
res = isl.isl_set_min_val(arg0.ptr, arg1.ptr)
obj = val(ctx=ctx, ptr=res)
return obj
def params(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_params(isl.isl_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def plain_multi_val_if_fixed(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_get_plain_multi_val_if_fixed(arg0.ptr)
obj = multi_val(ctx=ctx, ptr=res)
return obj
def get_plain_multi_val_if_fixed(arg0):
return arg0.plain_multi_val_if_fixed()
def polyhedral_hull(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_polyhedral_hull(isl.isl_set_copy(arg0.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def preimage(*args):
if len(args) == 2 and args[1].__class__ is multi_aff:
ctx = args[0].ctx
res = isl.isl_set_preimage_multi_aff(isl.isl_set_copy(args[0].ptr), isl.isl_multi_aff_copy(args[1].ptr))
obj = set(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_set_preimage_multi_pw_aff(isl.isl_set_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = set(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_set_preimage_pw_multi_aff(isl.isl_set_copy(args[0].ptr), isl.isl_pw_multi_aff_copy(args[1].ptr))
obj = set(ctx=ctx, ptr=res)
return obj
raise Error
def product(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_set(arg0).product(arg1)
ctx = arg0.ctx
res = isl.isl_set_product(isl.isl_set_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def project_out_all_params(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_project_out_all_params(isl.isl_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def project_out_param(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_set_project_out_param_id(isl.isl_set_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = set(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is id_list:
ctx = args[0].ctx
res = isl.isl_set_project_out_param_id_list(isl.isl_set_copy(args[0].ptr), isl.isl_id_list_copy(args[1].ptr))
obj = set(ctx=ctx, ptr=res)
return obj
raise Error
def pw_multi_aff_on_domain(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_set_pw_multi_aff_on_domain_multi_val(isl.isl_set_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def sample(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_sample(isl.isl_set_copy(arg0.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def sample_point(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_sample_point(isl.isl_set_copy(arg0.ptr))
obj = point(ctx=ctx, ptr=res)
return obj
def simple_fixed_box_hull(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_get_simple_fixed_box_hull(arg0.ptr)
obj = fixed_box(ctx=ctx, ptr=res)
return obj
def get_simple_fixed_box_hull(arg0):
return arg0.simple_fixed_box_hull()
def space(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
def stride(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_get_stride(arg0.ptr, arg1)
obj = val(ctx=ctx, ptr=res)
return obj
def get_stride(arg0, arg1):
return arg0.stride(arg1)
def subtract(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_set(arg0).subtract(arg1)
ctx = arg0.ctx
res = isl.isl_set_subtract(isl.isl_set_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def to_list(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_to_list(isl.isl_set_copy(arg0.ptr))
obj = set_list(ctx=ctx, ptr=res)
return obj
def to_union_set(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_to_union_set(isl.isl_set_copy(arg0.ptr))
obj = union_set(ctx=ctx, ptr=res)
return obj
def translation(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_translation(isl.isl_set_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def tuple_dim(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_tuple_dim(arg0.ptr)
        if res < 0:
            raise Error
return int(res)
def unbind_params(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return union_set(arg0).unbind_params(arg1)
ctx = arg0.ctx
res = isl.isl_set_unbind_params(isl.isl_set_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def unbind_params_insert_domain(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
return union_set(arg0).unbind_params_insert_domain(arg1)
ctx = arg0.ctx
res = isl.isl_set_unbind_params_insert_domain(isl.isl_set_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def union(arg0, arg1):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
return union_set(arg0).union(arg1)
ctx = arg0.ctx
res = isl.isl_set_union(isl.isl_set_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
@staticmethod
def universe(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_universe(isl.isl_space_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def unshifted_simple_hull(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_unshifted_simple_hull(isl.isl_set_copy(arg0.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def unwrap(arg0):
try:
if not arg0.__class__ is set:
arg0 = set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_unwrap(isl.isl_set_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def upper_bound(*args):
if len(args) == 2 and args[1].__class__ is multi_pw_aff:
ctx = args[0].ctx
res = isl.isl_set_upper_bound_multi_pw_aff(isl.isl_set_copy(args[0].ptr), isl.isl_multi_pw_aff_copy(args[1].ptr))
obj = set(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_set_upper_bound_multi_val(isl.isl_set_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = set(ctx=ctx, ptr=res)
return obj
raise Error
isl.isl_set_from_basic_set.restype = c_void_p
isl.isl_set_from_basic_set.argtypes = [c_void_p]
isl.isl_set_from_point.restype = c_void_p
isl.isl_set_from_point.argtypes = [c_void_p]
isl.isl_set_read_from_str.restype = c_void_p
isl.isl_set_read_from_str.argtypes = [Context, c_char_p]
isl.isl_set_affine_hull.restype = c_void_p
isl.isl_set_affine_hull.argtypes = [c_void_p]
isl.isl_set_apply.restype = c_void_p
isl.isl_set_apply.argtypes = [c_void_p, c_void_p]
isl.isl_set_as_pw_multi_aff.restype = c_void_p
isl.isl_set_as_pw_multi_aff.argtypes = [c_void_p]
isl.isl_set_bind.restype = c_void_p
isl.isl_set_bind.argtypes = [c_void_p, c_void_p]
isl.isl_set_coalesce.restype = c_void_p
isl.isl_set_coalesce.argtypes = [c_void_p]
isl.isl_set_complement.restype = c_void_p
isl.isl_set_complement.argtypes = [c_void_p]
isl.isl_set_detect_equalities.restype = c_void_p
isl.isl_set_detect_equalities.argtypes = [c_void_p]
isl.isl_set_dim_max_val.restype = c_void_p
isl.isl_set_dim_max_val.argtypes = [c_void_p, c_int]
isl.isl_set_dim_min_val.restype = c_void_p
isl.isl_set_dim_min_val.argtypes = [c_void_p, c_int]
isl.isl_set_empty.restype = c_void_p
isl.isl_set_empty.argtypes = [c_void_p]
isl.isl_set_flatten.restype = c_void_p
isl.isl_set_flatten.argtypes = [c_void_p]
isl.isl_set_foreach_basic_set.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_set_foreach_point.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_set_gist.restype = c_void_p
isl.isl_set_gist.argtypes = [c_void_p, c_void_p]
isl.isl_set_identity.restype = c_void_p
isl.isl_set_identity.argtypes = [c_void_p]
isl.isl_set_indicator_function.restype = c_void_p
isl.isl_set_indicator_function.argtypes = [c_void_p]
isl.isl_set_insert_domain.restype = c_void_p
isl.isl_set_insert_domain.argtypes = [c_void_p, c_void_p]
isl.isl_set_intersect.restype = c_void_p
isl.isl_set_intersect.argtypes = [c_void_p, c_void_p]
isl.isl_set_intersect_params.restype = c_void_p
isl.isl_set_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_set_involves_locals.argtypes = [c_void_p]
isl.isl_set_is_disjoint.argtypes = [c_void_p, c_void_p]
isl.isl_set_is_empty.argtypes = [c_void_p]
isl.isl_set_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_set_is_singleton.argtypes = [c_void_p]
isl.isl_set_is_strict_subset.argtypes = [c_void_p, c_void_p]
isl.isl_set_is_subset.argtypes = [c_void_p, c_void_p]
isl.isl_set_is_wrapping.argtypes = [c_void_p]
isl.isl_set_lexmax.restype = c_void_p
isl.isl_set_lexmax.argtypes = [c_void_p]
isl.isl_set_lexmax_pw_multi_aff.restype = c_void_p
isl.isl_set_lexmax_pw_multi_aff.argtypes = [c_void_p]
isl.isl_set_lexmin.restype = c_void_p
isl.isl_set_lexmin.argtypes = [c_void_p]
isl.isl_set_lexmin_pw_multi_aff.restype = c_void_p
isl.isl_set_lexmin_pw_multi_aff.argtypes = [c_void_p]
isl.isl_set_lower_bound_multi_pw_aff.restype = c_void_p
isl.isl_set_lower_bound_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_set_lower_bound_multi_val.restype = c_void_p
isl.isl_set_lower_bound_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_set_max_multi_pw_aff.restype = c_void_p
isl.isl_set_max_multi_pw_aff.argtypes = [c_void_p]
isl.isl_set_max_val.restype = c_void_p
isl.isl_set_max_val.argtypes = [c_void_p, c_void_p]
isl.isl_set_min_multi_pw_aff.restype = c_void_p
isl.isl_set_min_multi_pw_aff.argtypes = [c_void_p]
isl.isl_set_min_val.restype = c_void_p
isl.isl_set_min_val.argtypes = [c_void_p, c_void_p]
isl.isl_set_params.restype = c_void_p
isl.isl_set_params.argtypes = [c_void_p]
isl.isl_set_get_plain_multi_val_if_fixed.restype = c_void_p
isl.isl_set_get_plain_multi_val_if_fixed.argtypes = [c_void_p]
isl.isl_set_polyhedral_hull.restype = c_void_p
isl.isl_set_polyhedral_hull.argtypes = [c_void_p]
isl.isl_set_preimage_multi_aff.restype = c_void_p
isl.isl_set_preimage_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_set_preimage_multi_pw_aff.restype = c_void_p
isl.isl_set_preimage_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_set_preimage_pw_multi_aff.restype = c_void_p
isl.isl_set_preimage_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_set_product.restype = c_void_p
isl.isl_set_product.argtypes = [c_void_p, c_void_p]
isl.isl_set_project_out_all_params.restype = c_void_p
isl.isl_set_project_out_all_params.argtypes = [c_void_p]
isl.isl_set_project_out_param_id.restype = c_void_p
isl.isl_set_project_out_param_id.argtypes = [c_void_p, c_void_p]
isl.isl_set_project_out_param_id_list.restype = c_void_p
isl.isl_set_project_out_param_id_list.argtypes = [c_void_p, c_void_p]
isl.isl_set_pw_multi_aff_on_domain_multi_val.restype = c_void_p
isl.isl_set_pw_multi_aff_on_domain_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_set_sample.restype = c_void_p
isl.isl_set_sample.argtypes = [c_void_p]
isl.isl_set_sample_point.restype = c_void_p
isl.isl_set_sample_point.argtypes = [c_void_p]
isl.isl_set_get_simple_fixed_box_hull.restype = c_void_p
isl.isl_set_get_simple_fixed_box_hull.argtypes = [c_void_p]
isl.isl_set_get_space.restype = c_void_p
isl.isl_set_get_space.argtypes = [c_void_p]
isl.isl_set_get_stride.restype = c_void_p
isl.isl_set_get_stride.argtypes = [c_void_p, c_int]
isl.isl_set_subtract.restype = c_void_p
isl.isl_set_subtract.argtypes = [c_void_p, c_void_p]
isl.isl_set_to_list.restype = c_void_p
isl.isl_set_to_list.argtypes = [c_void_p]
isl.isl_set_to_union_set.restype = c_void_p
isl.isl_set_to_union_set.argtypes = [c_void_p]
isl.isl_set_translation.restype = c_void_p
isl.isl_set_translation.argtypes = [c_void_p]
isl.isl_set_tuple_dim.argtypes = [c_void_p]
isl.isl_set_unbind_params.restype = c_void_p
isl.isl_set_unbind_params.argtypes = [c_void_p, c_void_p]
isl.isl_set_unbind_params_insert_domain.restype = c_void_p
isl.isl_set_unbind_params_insert_domain.argtypes = [c_void_p, c_void_p]
isl.isl_set_union.restype = c_void_p
isl.isl_set_union.argtypes = [c_void_p, c_void_p]
isl.isl_set_universe.restype = c_void_p
isl.isl_set_universe.argtypes = [c_void_p]
isl.isl_set_unshifted_simple_hull.restype = c_void_p
isl.isl_set_unshifted_simple_hull.argtypes = [c_void_p]
isl.isl_set_unwrap.restype = c_void_p
isl.isl_set_unwrap.argtypes = [c_void_p]
isl.isl_set_upper_bound_multi_pw_aff.restype = c_void_p
isl.isl_set_upper_bound_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_set_upper_bound_multi_val.restype = c_void_p
isl.isl_set_upper_bound_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_set_copy.restype = c_void_p
isl.isl_set_copy.argtypes = [c_void_p]
isl.isl_set_free.restype = c_void_p
isl.isl_set_free.argtypes = [c_void_p]
isl.isl_set_to_str.restype = POINTER(c_char)
isl.isl_set_to_str.argtypes = [c_void_p]
class basic_set(set):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is point:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_basic_set_from_point(isl.isl_point_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_basic_set_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_basic_set_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ptr = isl.isl_basic_set_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.basic_set("""%s""")' % s
else:
return 'isl.basic_set("%s")' % s
def affine_hull(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_affine_hull(isl.isl_basic_set_copy(arg0.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def apply(arg0, arg1):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
try:
if not arg1.__class__ is basic_map:
arg1 = basic_map(arg1)
except:
return set(arg0).apply(arg1)
ctx = arg0.ctx
res = isl.isl_basic_set_apply(isl.isl_basic_set_copy(arg0.ptr), isl.isl_basic_map_copy(arg1.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def detect_equalities(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_detect_equalities(isl.isl_basic_set_copy(arg0.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def dim_max_val(arg0, arg1):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_dim_max_val(isl.isl_basic_set_copy(arg0.ptr), arg1)
obj = val(ctx=ctx, ptr=res)
return obj
def flatten(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_flatten(isl.isl_basic_set_copy(arg0.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def gist(arg0, arg1):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
try:
if not arg1.__class__ is basic_set:
arg1 = basic_set(arg1)
except:
return set(arg0).gist(arg1)
ctx = arg0.ctx
res = isl.isl_basic_set_gist(isl.isl_basic_set_copy(arg0.ptr), isl.isl_basic_set_copy(arg1.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def intersect(arg0, arg1):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
try:
if not arg1.__class__ is basic_set:
arg1 = basic_set(arg1)
except:
return set(arg0).intersect(arg1)
ctx = arg0.ctx
res = isl.isl_basic_set_intersect(isl.isl_basic_set_copy(arg0.ptr), isl.isl_basic_set_copy(arg1.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def intersect_params(arg0, arg1):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
try:
if not arg1.__class__ is basic_set:
arg1 = basic_set(arg1)
except:
return set(arg0).intersect_params(arg1)
ctx = arg0.ctx
res = isl.isl_basic_set_intersect_params(isl.isl_basic_set_copy(arg0.ptr), isl.isl_basic_set_copy(arg1.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def is_empty(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_is_empty(arg0.ptr)
        if res < 0:
            raise Error
return bool(res)
def is_equal(arg0, arg1):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
try:
if not arg1.__class__ is basic_set:
arg1 = basic_set(arg1)
except:
return set(arg0).is_equal(arg1)
ctx = arg0.ctx
res = isl.isl_basic_set_is_equal(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
return bool(res)
def is_subset(arg0, arg1):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
try:
if not arg1.__class__ is basic_set:
arg1 = basic_set(arg1)
except:
return set(arg0).is_subset(arg1)
ctx = arg0.ctx
res = isl.isl_basic_set_is_subset(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
return bool(res)
def is_wrapping(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_is_wrapping(arg0.ptr)
        if res < 0:
            raise Error
return bool(res)
def lexmax(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_lexmax(isl.isl_basic_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def lexmin(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_lexmin(isl.isl_basic_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def params(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_params(isl.isl_basic_set_copy(arg0.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def sample(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_sample(isl.isl_basic_set_copy(arg0.ptr))
obj = basic_set(ctx=ctx, ptr=res)
return obj
def sample_point(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_sample_point(isl.isl_basic_set_copy(arg0.ptr))
obj = point(ctx=ctx, ptr=res)
return obj
def to_set(arg0):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_basic_set_to_set(isl.isl_basic_set_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def union(arg0, arg1):
try:
if not arg0.__class__ is basic_set:
arg0 = basic_set(arg0)
except:
raise
try:
if not arg1.__class__ is basic_set:
arg1 = basic_set(arg1)
except:
return set(arg0).union(arg1)
ctx = arg0.ctx
res = isl.isl_basic_set_union(isl.isl_basic_set_copy(arg0.ptr), isl.isl_basic_set_copy(arg1.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
isl.isl_basic_set_from_point.restype = c_void_p
isl.isl_basic_set_from_point.argtypes = [c_void_p]
isl.isl_basic_set_read_from_str.restype = c_void_p
isl.isl_basic_set_read_from_str.argtypes = [Context, c_char_p]
isl.isl_basic_set_affine_hull.restype = c_void_p
isl.isl_basic_set_affine_hull.argtypes = [c_void_p]
isl.isl_basic_set_apply.restype = c_void_p
isl.isl_basic_set_apply.argtypes = [c_void_p, c_void_p]
isl.isl_basic_set_detect_equalities.restype = c_void_p
isl.isl_basic_set_detect_equalities.argtypes = [c_void_p]
isl.isl_basic_set_dim_max_val.restype = c_void_p
isl.isl_basic_set_dim_max_val.argtypes = [c_void_p, c_int]
isl.isl_basic_set_flatten.restype = c_void_p
isl.isl_basic_set_flatten.argtypes = [c_void_p]
isl.isl_basic_set_gist.restype = c_void_p
isl.isl_basic_set_gist.argtypes = [c_void_p, c_void_p]
isl.isl_basic_set_intersect.restype = c_void_p
isl.isl_basic_set_intersect.argtypes = [c_void_p, c_void_p]
isl.isl_basic_set_intersect_params.restype = c_void_p
isl.isl_basic_set_intersect_params.argtypes = [c_void_p, c_void_p]
isl.isl_basic_set_is_empty.argtypes = [c_void_p]
isl.isl_basic_set_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_basic_set_is_subset.argtypes = [c_void_p, c_void_p]
isl.isl_basic_set_is_wrapping.argtypes = [c_void_p]
isl.isl_basic_set_lexmax.restype = c_void_p
isl.isl_basic_set_lexmax.argtypes = [c_void_p]
isl.isl_basic_set_lexmin.restype = c_void_p
isl.isl_basic_set_lexmin.argtypes = [c_void_p]
isl.isl_basic_set_params.restype = c_void_p
isl.isl_basic_set_params.argtypes = [c_void_p]
isl.isl_basic_set_sample.restype = c_void_p
isl.isl_basic_set_sample.argtypes = [c_void_p]
isl.isl_basic_set_sample_point.restype = c_void_p
isl.isl_basic_set_sample_point.argtypes = [c_void_p]
isl.isl_basic_set_to_set.restype = c_void_p
isl.isl_basic_set_to_set.argtypes = [c_void_p]
isl.isl_basic_set_union.restype = c_void_p
isl.isl_basic_set_union.argtypes = [c_void_p, c_void_p]
isl.isl_basic_set_copy.restype = c_void_p
isl.isl_basic_set_copy.argtypes = [c_void_p]
isl.isl_basic_set_free.restype = c_void_p
isl.isl_basic_set_free.argtypes = [c_void_p]
isl.isl_basic_set_to_str.restype = POINTER(c_char)
isl.isl_basic_set_to_str.argtypes = [c_void_p]
class fixed_box(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_fixed_box_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is fixed_box:
arg0 = fixed_box(arg0)
except:
raise
ptr = isl.isl_fixed_box_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.fixed_box("""%s""")' % s
else:
return 'isl.fixed_box("%s")' % s
def is_valid(arg0):
try:
if not arg0.__class__ is fixed_box:
arg0 = fixed_box(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_fixed_box_is_valid(arg0.ptr)
        if res < 0:
            raise Error
return bool(res)
def offset(arg0):
try:
if not arg0.__class__ is fixed_box:
arg0 = fixed_box(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_fixed_box_get_offset(arg0.ptr)
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def get_offset(arg0):
return arg0.offset()
def size(arg0):
try:
if not arg0.__class__ is fixed_box:
arg0 = fixed_box(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_fixed_box_get_size(arg0.ptr)
obj = multi_val(ctx=ctx, ptr=res)
return obj
def get_size(arg0):
return arg0.size()
def space(arg0):
try:
if not arg0.__class__ is fixed_box:
arg0 = fixed_box(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_fixed_box_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
isl.isl_fixed_box_is_valid.argtypes = [c_void_p]
isl.isl_fixed_box_get_offset.restype = c_void_p
isl.isl_fixed_box_get_offset.argtypes = [c_void_p]
isl.isl_fixed_box_get_size.restype = c_void_p
isl.isl_fixed_box_get_size.argtypes = [c_void_p]
isl.isl_fixed_box_get_space.restype = c_void_p
isl.isl_fixed_box_get_space.argtypes = [c_void_p]
isl.isl_fixed_box_copy.restype = c_void_p
isl.isl_fixed_box_copy.argtypes = [c_void_p]
isl.isl_fixed_box_free.restype = c_void_p
isl.isl_fixed_box_free.argtypes = [c_void_p]
isl.isl_fixed_box_to_str.restype = POINTER(c_char)
isl.isl_fixed_box_to_str.argtypes = [c_void_p]
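Throughout these bindings, every `isl_*_to_str` entry point is declared with `restype = POINTER(c_char)` rather than `c_char_p`, so ctypes hands back the raw pointer instead of eagerly converting it to `bytes` and losing it; the wrapper then `cast`s the pointer to read the string and frees the original. A minimal, self-contained sketch of the same ownership pattern, using only libc's `strdup`/`free` (standard C functions, not part of isl):

```python
import ctypes
from ctypes import POINTER, c_char, c_char_p, c_void_p, cast

libc = ctypes.CDLL(None)  # the C runtime; this form works on POSIX systems

# Declare strdup so ctypes keeps the raw pointer instead of auto-converting it.
libc.strdup.restype = POINTER(c_char)
libc.strdup.argtypes = [c_char_p]
libc.free.restype = None
libc.free.argtypes = [c_void_p]

ptr = libc.strdup(b"caller-owned string")
try:
    # cast() reads the bytes without transferring ownership ...
    text = cast(ptr, c_char_p).value.decode('ascii')
finally:
    # ... so the pointer can (and must) still be freed by the caller.
    libc.free(ptr)
```

Had `strdup` been declared with `restype = c_char_p`, ctypes would have returned a Python `bytes` object and the original pointer would have leaked.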
class id(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_id_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_id_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is id:
arg0 = id(arg0)
except:
raise
ptr = isl.isl_id_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.id("""%s""")' % s
else:
return 'isl.id("%s")' % s
def name(arg0):
try:
if not arg0.__class__ is id:
arg0 = id(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_id_get_name(arg0.ptr)
        if res == 0:
            raise Error
string = cast(res, c_char_p).value.decode('ascii')
return string
def get_name(arg0):
return arg0.name()
def to_list(arg0):
try:
if not arg0.__class__ is id:
arg0 = id(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_id_to_list(isl.isl_id_copy(arg0.ptr))
obj = id_list(ctx=ctx, ptr=res)
return obj
isl.isl_id_read_from_str.restype = c_void_p
isl.isl_id_read_from_str.argtypes = [Context, c_char_p]
isl.isl_id_get_name.restype = POINTER(c_char)
isl.isl_id_get_name.argtypes = [c_void_p]
isl.isl_id_to_list.restype = c_void_p
isl.isl_id_to_list.argtypes = [c_void_p]
isl.isl_id_copy.restype = c_void_p
isl.isl_id_copy.argtypes = [c_void_p]
isl.isl_id_free.restype = c_void_p
isl.isl_id_free.argtypes = [c_void_p]
isl.isl_id_to_str.restype = POINTER(c_char)
isl.isl_id_to_str.argtypes = [c_void_p]
class id_list(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_id_list_alloc(self.ctx, args[0])
return
if len(args) == 1 and (args[0].__class__ is id or type(args[0]) == str):
args = list(args)
try:
if not args[0].__class__ is id:
args[0] = id(args[0])
except:
raise
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_id_list_from_id(isl.isl_id_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_id_list_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_id_list_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is id_list:
arg0 = id_list(arg0)
except:
raise
ptr = isl.isl_id_list_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.id_list("""%s""")' % s
else:
return 'isl.id_list("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is id_list:
arg0 = id_list(arg0)
except:
raise
try:
if not arg1.__class__ is id:
arg1 = id(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_id_list_add(isl.isl_id_list_copy(arg0.ptr), isl.isl_id_copy(arg1.ptr))
obj = id_list(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is id_list:
arg0 = id_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_id_list_get_at(arg0.ptr, arg1)
obj = id(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def clear(arg0):
try:
if not arg0.__class__ is id_list:
arg0 = id_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_id_list_clear(isl.isl_id_list_copy(arg0.ptr))
obj = id_list(ctx=ctx, ptr=res)
return obj
def concat(arg0, arg1):
try:
if not arg0.__class__ is id_list:
arg0 = id_list(arg0)
except:
raise
try:
if not arg1.__class__ is id_list:
arg1 = id_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_id_list_concat(isl.isl_id_list_copy(arg0.ptr), isl.isl_id_list_copy(arg1.ptr))
obj = id_list(ctx=ctx, ptr=res)
return obj
def drop(arg0, arg1, arg2):
try:
if not arg0.__class__ is id_list:
arg0 = id_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_id_list_drop(isl.isl_id_list_copy(arg0.ptr), arg1, arg2)
obj = id_list(ctx=ctx, ptr=res)
return obj
def foreach(arg0, arg1):
try:
if not arg0.__class__ is id_list:
arg0 = id_list(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = id(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_id_list_foreach(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
        if res < 0:
            raise Error
def insert(arg0, arg1, arg2):
try:
if not arg0.__class__ is id_list:
arg0 = id_list(arg0)
except:
raise
try:
if not arg2.__class__ is id:
arg2 = id(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_id_list_insert(isl.isl_id_list_copy(arg0.ptr), arg1, isl.isl_id_copy(arg2.ptr))
obj = id_list(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is id_list:
arg0 = id_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_id_list_size(arg0.ptr)
        if res < 0:
            raise Error
return int(res)
isl.isl_id_list_alloc.restype = c_void_p
isl.isl_id_list_alloc.argtypes = [Context, c_int]
isl.isl_id_list_from_id.restype = c_void_p
isl.isl_id_list_from_id.argtypes = [c_void_p]
isl.isl_id_list_read_from_str.restype = c_void_p
isl.isl_id_list_read_from_str.argtypes = [Context, c_char_p]
isl.isl_id_list_add.restype = c_void_p
isl.isl_id_list_add.argtypes = [c_void_p, c_void_p]
isl.isl_id_list_get_at.restype = c_void_p
isl.isl_id_list_get_at.argtypes = [c_void_p, c_int]
isl.isl_id_list_clear.restype = c_void_p
isl.isl_id_list_clear.argtypes = [c_void_p]
isl.isl_id_list_concat.restype = c_void_p
isl.isl_id_list_concat.argtypes = [c_void_p, c_void_p]
isl.isl_id_list_drop.restype = c_void_p
isl.isl_id_list_drop.argtypes = [c_void_p, c_int, c_int]
isl.isl_id_list_foreach.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_id_list_insert.restype = c_void_p
isl.isl_id_list_insert.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_id_list_size.argtypes = [c_void_p]
isl.isl_id_list_copy.restype = c_void_p
isl.isl_id_list_copy.argtypes = [c_void_p]
isl.isl_id_list_free.restype = c_void_p
isl.isl_id_list_free.argtypes = [c_void_p]
isl.isl_id_list_to_str.restype = POINTER(c_char)
isl.isl_id_list_to_str.argtypes = [c_void_p]
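The `foreach` methods above all share one trampoline idiom: a `CFUNCTYPE` wrapper invokes the user's Python callable, stashes any exception in the one-element `exc_info` list, and returns `-1` so the C-side iteration aborts early; the exception is then re-raised after the C call returns, because a Python exception cannot propagate through C stack frames. The same idiom, sketched self-contained with libc's `qsort` standing in for the isl iterator (the helper name `sort_ints` is illustrative, not part of these bindings):

```python
import ctypes
from ctypes import CFUNCTYPE, POINTER, c_int, sizeof

libc = ctypes.CDLL(None)
libc.qsort.restype = None

def sort_ints(values, key):
    arr = (c_int * len(values))(*values)
    exc_info = [None]
    @CFUNCTYPE(c_int, POINTER(c_int), POINTER(c_int))
    def cmp(a, b):
        try:
            ka, kb = key(a[0]), key(b[0])
            return (ka > kb) - (ka < kb)
        except BaseException as e:
            exc_info[0] = e     # cannot raise across the C boundary
            return 0
    # `cmp` is a local variable, so the callback object stays alive
    # for the duration of the C call, as required by ctypes.
    libc.qsort(arr, len(values), sizeof(c_int), cmp)
    if exc_info[0] is not None:
        raise exc_info[0]       # re-raise once back in Python
    return list(arr)
```

The `return 0` after recording the exception mirrors the wrappers above returning `-1`: either way, some in-band error value must be returned to C, and the real exception is surfaced only after control returns to Python.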
class map_list(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_map_list_alloc(self.ctx, args[0])
return
if len(args) == 1 and args[0].__class__ is map:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_map_list_from_map(isl.isl_map_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_map_list_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_map_list_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is map_list:
arg0 = map_list(arg0)
except:
raise
ptr = isl.isl_map_list_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.map_list("""%s""")' % s
else:
return 'isl.map_list("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is map_list:
arg0 = map_list(arg0)
except:
raise
try:
if not arg1.__class__ is map:
arg1 = map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_list_add(isl.isl_map_list_copy(arg0.ptr), isl.isl_map_copy(arg1.ptr))
obj = map_list(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is map_list:
arg0 = map_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_list_get_at(arg0.ptr, arg1)
obj = map(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def clear(arg0):
try:
if not arg0.__class__ is map_list:
arg0 = map_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_list_clear(isl.isl_map_list_copy(arg0.ptr))
obj = map_list(ctx=ctx, ptr=res)
return obj
def concat(arg0, arg1):
try:
if not arg0.__class__ is map_list:
arg0 = map_list(arg0)
except:
raise
try:
if not arg1.__class__ is map_list:
arg1 = map_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_list_concat(isl.isl_map_list_copy(arg0.ptr), isl.isl_map_list_copy(arg1.ptr))
obj = map_list(ctx=ctx, ptr=res)
return obj
def drop(arg0, arg1, arg2):
try:
if not arg0.__class__ is map_list:
arg0 = map_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_list_drop(isl.isl_map_list_copy(arg0.ptr), arg1, arg2)
obj = map_list(ctx=ctx, ptr=res)
return obj
def foreach(arg0, arg1):
try:
if not arg0.__class__ is map_list:
arg0 = map_list(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = map(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_map_list_foreach(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
        if res < 0:
            raise Error
def insert(arg0, arg1, arg2):
try:
if not arg0.__class__ is map_list:
arg0 = map_list(arg0)
except:
raise
try:
if not arg2.__class__ is map:
arg2 = map(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_list_insert(isl.isl_map_list_copy(arg0.ptr), arg1, isl.isl_map_copy(arg2.ptr))
obj = map_list(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is map_list:
arg0 = map_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_map_list_size(arg0.ptr)
        if res < 0:
            raise Error
return int(res)
isl.isl_map_list_alloc.restype = c_void_p
isl.isl_map_list_alloc.argtypes = [Context, c_int]
isl.isl_map_list_from_map.restype = c_void_p
isl.isl_map_list_from_map.argtypes = [c_void_p]
isl.isl_map_list_read_from_str.restype = c_void_p
isl.isl_map_list_read_from_str.argtypes = [Context, c_char_p]
isl.isl_map_list_add.restype = c_void_p
isl.isl_map_list_add.argtypes = [c_void_p, c_void_p]
isl.isl_map_list_get_at.restype = c_void_p
isl.isl_map_list_get_at.argtypes = [c_void_p, c_int]
isl.isl_map_list_clear.restype = c_void_p
isl.isl_map_list_clear.argtypes = [c_void_p]
isl.isl_map_list_concat.restype = c_void_p
isl.isl_map_list_concat.argtypes = [c_void_p, c_void_p]
isl.isl_map_list_drop.restype = c_void_p
isl.isl_map_list_drop.argtypes = [c_void_p, c_int, c_int]
isl.isl_map_list_foreach.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_map_list_insert.restype = c_void_p
isl.isl_map_list_insert.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_map_list_size.argtypes = [c_void_p]
isl.isl_map_list_copy.restype = c_void_p
isl.isl_map_list_copy.argtypes = [c_void_p]
isl.isl_map_list_free.restype = c_void_p
isl.isl_map_list_free.argtypes = [c_void_p]
isl.isl_map_list_to_str.restype = POINTER(c_char)
isl.isl_map_list_to_str.argtypes = [c_void_p]
class multi_id(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 2 and args[0].__class__ is space and args[1].__class__ is id_list:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_id_from_id_list(isl.isl_space_copy(args[0].ptr), isl.isl_id_list_copy(args[1].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_id_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_multi_id_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is multi_id:
arg0 = multi_id(arg0)
except:
raise
ptr = isl.isl_multi_id_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.multi_id("""%s""")' % s
else:
return 'isl.multi_id("%s")' % s
def at(arg0, arg1):
try:
if not arg0.__class__ is multi_id:
arg0 = multi_id(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_id_get_at(arg0.ptr, arg1)
obj = id(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def flat_range_product(arg0, arg1):
try:
if not arg0.__class__ is multi_id:
arg0 = multi_id(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_id_flat_range_product(isl.isl_multi_id_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = multi_id(ctx=ctx, ptr=res)
return obj
def list(arg0):
try:
if not arg0.__class__ is multi_id:
arg0 = multi_id(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_id_get_list(arg0.ptr)
obj = id_list(ctx=ctx, ptr=res)
return obj
def get_list(arg0):
return arg0.list()
def plain_is_equal(arg0, arg1):
try:
if not arg0.__class__ is multi_id:
arg0 = multi_id(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_id_plain_is_equal(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
return bool(res)
def range_product(arg0, arg1):
try:
if not arg0.__class__ is multi_id:
arg0 = multi_id(arg0)
except:
raise
try:
if not arg1.__class__ is multi_id:
arg1 = multi_id(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_id_range_product(isl.isl_multi_id_copy(arg0.ptr), isl.isl_multi_id_copy(arg1.ptr))
obj = multi_id(ctx=ctx, ptr=res)
return obj
def set_at(arg0, arg1, arg2):
try:
if not arg0.__class__ is multi_id:
arg0 = multi_id(arg0)
except:
raise
try:
if not arg2.__class__ is id:
arg2 = id(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_id_set_at(isl.isl_multi_id_copy(arg0.ptr), arg1, isl.isl_id_copy(arg2.ptr))
obj = multi_id(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is multi_id:
arg0 = multi_id(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_id_size(arg0.ptr)
        if res < 0:
            raise Error
return int(res)
def space(arg0):
try:
if not arg0.__class__ is multi_id:
arg0 = multi_id(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_id_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
isl.isl_multi_id_from_id_list.restype = c_void_p
isl.isl_multi_id_from_id_list.argtypes = [c_void_p, c_void_p]
isl.isl_multi_id_read_from_str.restype = c_void_p
isl.isl_multi_id_read_from_str.argtypes = [Context, c_char_p]
isl.isl_multi_id_get_at.restype = c_void_p
isl.isl_multi_id_get_at.argtypes = [c_void_p, c_int]
isl.isl_multi_id_flat_range_product.restype = c_void_p
isl.isl_multi_id_flat_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_id_get_list.restype = c_void_p
isl.isl_multi_id_get_list.argtypes = [c_void_p]
isl.isl_multi_id_plain_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_multi_id_range_product.restype = c_void_p
isl.isl_multi_id_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_id_set_at.restype = c_void_p
isl.isl_multi_id_set_at.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_multi_id_size.argtypes = [c_void_p]
isl.isl_multi_id_get_space.restype = c_void_p
isl.isl_multi_id_get_space.argtypes = [c_void_p]
isl.isl_multi_id_copy.restype = c_void_p
isl.isl_multi_id_copy.argtypes = [c_void_p]
isl.isl_multi_id_free.restype = c_void_p
isl.isl_multi_id_free.argtypes = [c_void_p]
isl.isl_multi_id_to_str.restype = POINTER(c_char)
isl.isl_multi_id_to_str.argtypes = [c_void_p]
class multi_val(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 2 and args[0].__class__ is space and args[1].__class__ is val_list:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_val_from_val_list(isl.isl_space_copy(args[0].ptr), isl.isl_val_list_copy(args[1].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_multi_val_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_multi_val_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
ptr = isl.isl_multi_val_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.multi_val("""%s""")' % s
else:
return 'isl.multi_val("%s")' % s
def add(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_val_add(isl.isl_multi_val_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_val_add_val(isl.isl_multi_val_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
raise Error
def at(arg0, arg1):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_get_at(arg0.ptr, arg1)
obj = val(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def flat_range_product(arg0, arg1):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_flat_range_product(isl.isl_multi_val_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def has_range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_has_range_tuple_id(arg0.ptr)
        if res < 0:
            raise Error
return bool(res)
def involves_nan(arg0):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_involves_nan(arg0.ptr)
        if res < 0:
            raise Error
return bool(res)
def list(arg0):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_get_list(arg0.ptr)
obj = val_list(ctx=ctx, ptr=res)
return obj
def get_list(arg0):
return arg0.list()
def max(arg0, arg1):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_max(isl.isl_multi_val_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def min(arg0, arg1):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_min(isl.isl_multi_val_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def neg(arg0):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_neg(isl.isl_multi_val_copy(arg0.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def plain_is_equal(arg0, arg1):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_plain_is_equal(arg0.ptr, arg1.ptr)
        if res < 0:
            raise Error
return bool(res)
def product(arg0, arg1):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_product(isl.isl_multi_val_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def range_product(arg0, arg1):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_range_product(isl.isl_multi_val_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_get_range_tuple_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_range_tuple_id(arg0):
return arg0.range_tuple_id()
def reset_range_tuple_id(arg0):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_reset_range_tuple_id(isl.isl_multi_val_copy(arg0.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def scale(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_val_scale_multi_val(isl.isl_multi_val_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_val_scale_val(isl.isl_multi_val_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
raise Error
def scale_down(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_multi_val_scale_down_multi_val(isl.isl_multi_val_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
if len(args) == 2 and (args[1].__class__ is val or type(args[1]) == int):
args = list(args)
try:
if not args[1].__class__ is val:
args[1] = val(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_val_scale_down_val(isl.isl_multi_val_copy(args[0].ptr), isl.isl_val_copy(args[1].ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
raise Error
def set_at(arg0, arg1, arg2):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
try:
if not arg2.__class__ is val:
arg2 = val(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_set_at(isl.isl_multi_val_copy(arg0.ptr), arg1, isl.isl_val_copy(arg2.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def set_range_tuple(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_multi_val_set_range_tuple_id(isl.isl_multi_val_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
raise Error
def size(arg0):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_size(arg0.ptr)
        if res < 0:
            raise Error
return int(res)
def space(arg0):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_get_space(arg0.ptr)
obj = space(ctx=ctx, ptr=res)
return obj
def get_space(arg0):
return arg0.space()
def sub(arg0, arg1):
try:
if not arg0.__class__ is multi_val:
arg0 = multi_val(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_sub(isl.isl_multi_val_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
@staticmethod
def zero(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_multi_val_zero(isl.isl_space_copy(arg0.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
isl.isl_multi_val_from_val_list.restype = c_void_p
isl.isl_multi_val_from_val_list.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_read_from_str.restype = c_void_p
isl.isl_multi_val_read_from_str.argtypes = [Context, c_char_p]
isl.isl_multi_val_add.restype = c_void_p
isl.isl_multi_val_add.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_add_val.restype = c_void_p
isl.isl_multi_val_add_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_get_at.restype = c_void_p
isl.isl_multi_val_get_at.argtypes = [c_void_p, c_int]
isl.isl_multi_val_flat_range_product.restype = c_void_p
isl.isl_multi_val_flat_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_has_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_val_involves_nan.argtypes = [c_void_p]
isl.isl_multi_val_get_list.restype = c_void_p
isl.isl_multi_val_get_list.argtypes = [c_void_p]
isl.isl_multi_val_max.restype = c_void_p
isl.isl_multi_val_max.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_min.restype = c_void_p
isl.isl_multi_val_min.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_neg.restype = c_void_p
isl.isl_multi_val_neg.argtypes = [c_void_p]
isl.isl_multi_val_plain_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_product.restype = c_void_p
isl.isl_multi_val_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_range_product.restype = c_void_p
isl.isl_multi_val_range_product.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_get_range_tuple_id.restype = c_void_p
isl.isl_multi_val_get_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_val_reset_range_tuple_id.restype = c_void_p
isl.isl_multi_val_reset_range_tuple_id.argtypes = [c_void_p]
isl.isl_multi_val_scale_multi_val.restype = c_void_p
isl.isl_multi_val_scale_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_scale_val.restype = c_void_p
isl.isl_multi_val_scale_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_scale_down_multi_val.restype = c_void_p
isl.isl_multi_val_scale_down_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_scale_down_val.restype = c_void_p
isl.isl_multi_val_scale_down_val.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_set_at.restype = c_void_p
isl.isl_multi_val_set_at.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_multi_val_set_range_tuple_id.restype = c_void_p
isl.isl_multi_val_set_range_tuple_id.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_size.argtypes = [c_void_p]
isl.isl_multi_val_get_space.restype = c_void_p
isl.isl_multi_val_get_space.argtypes = [c_void_p]
isl.isl_multi_val_sub.restype = c_void_p
isl.isl_multi_val_sub.argtypes = [c_void_p, c_void_p]
isl.isl_multi_val_zero.restype = c_void_p
isl.isl_multi_val_zero.argtypes = [c_void_p]
isl.isl_multi_val_copy.restype = c_void_p
isl.isl_multi_val_copy.argtypes = [c_void_p]
isl.isl_multi_val_free.restype = c_void_p
isl.isl_multi_val_free.argtypes = [c_void_p]
isl.isl_multi_val_to_str.restype = POINTER(c_char)
isl.isl_multi_val_to_str.argtypes = [c_void_p]
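Note the discipline followed by every consuming call above: an isl function that takes ownership of an argument never receives the wrapper's own pointer, but a fresh duplicate made with the matching `isl_*_copy` (e.g. `isl_multi_val_copy(arg0.ptr)`), so the Python object still holds a valid pointer for `isl_multi_val_free` in `__del__`. A self-contained sketch of the same copy-before-consume pattern, with libc's `strdup`/`free` standing in for `isl_*_copy`/`isl_*_free`:

```python
import ctypes
from ctypes import POINTER, c_char, c_char_p, c_void_p, cast

libc = ctypes.CDLL(None)
libc.strdup.restype = POINTER(c_char)
libc.strdup.argtypes = [c_char_p]
libc.free.restype = None
libc.free.argtypes = [c_void_p]

owned = libc.strdup(b"wrapper-owned")  # pointer held by the "wrapper"

def consumer(ptr):
    # Takes ownership of ptr and frees it, like an __isl_take argument.
    libc.free(ptr)

# Pass a duplicate to the consuming call, never the wrapper's own pointer ...
consumer(libc.strdup(cast(owned, c_char_p).value))

# ... so the wrapper's pointer remains valid afterwards.
still_valid = cast(owned, c_char_p).value.decode('ascii')
libc.free(owned)  # the "__del__" step: the wrapper frees what it owns
```

Passing `owned` directly to `consumer` instead would leave the wrapper holding a dangling pointer, and the final `free` would be a double free; that is exactly what the `isl_*_copy` calls in the methods above prevent.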
class point(basic_set):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_point_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is point:
arg0 = point(arg0)
except:
raise
ptr = isl.isl_point_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.point("""%s""")' % s
else:
return 'isl.point("%s")' % s
def multi_val(arg0):
try:
if not arg0.__class__ is point:
arg0 = point(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_point_get_multi_val(arg0.ptr)
obj = multi_val(ctx=ctx, ptr=res)
return obj
def get_multi_val(arg0):
return arg0.multi_val()
def to_set(arg0):
try:
if not arg0.__class__ is point:
arg0 = point(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_point_to_set(isl.isl_point_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
isl.isl_point_get_multi_val.restype = c_void_p
isl.isl_point_get_multi_val.argtypes = [c_void_p]
isl.isl_point_to_set.restype = c_void_p
isl.isl_point_to_set.argtypes = [c_void_p]
isl.isl_point_copy.restype = c_void_p
isl.isl_point_copy.argtypes = [c_void_p]
isl.isl_point_free.restype = c_void_p
isl.isl_point_free.argtypes = [c_void_p]
isl.isl_point_to_str.restype = POINTER(c_char)
isl.isl_point_to_str.argtypes = [c_void_p]
class pw_aff_list(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_aff_list_alloc(self.ctx, args[0])
return
if len(args) == 1 and args[0].__class__ is pw_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_aff_list_from_pw_aff(isl.isl_pw_aff_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_aff_list_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_pw_aff_list_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is pw_aff_list:
arg0 = pw_aff_list(arg0)
except:
raise
ptr = isl.isl_pw_aff_list_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.pw_aff_list("""%s""")' % s
else:
return 'isl.pw_aff_list("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is pw_aff_list:
arg0 = pw_aff_list(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff:
arg1 = pw_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_list_add(isl.isl_pw_aff_list_copy(arg0.ptr), isl.isl_pw_aff_copy(arg1.ptr))
obj = pw_aff_list(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is pw_aff_list:
arg0 = pw_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_list_get_at(arg0.ptr, arg1)
obj = pw_aff(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def clear(arg0):
try:
if not arg0.__class__ is pw_aff_list:
arg0 = pw_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_list_clear(isl.isl_pw_aff_list_copy(arg0.ptr))
obj = pw_aff_list(ctx=ctx, ptr=res)
return obj
def concat(arg0, arg1):
try:
if not arg0.__class__ is pw_aff_list:
arg0 = pw_aff_list(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff_list:
arg1 = pw_aff_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_list_concat(isl.isl_pw_aff_list_copy(arg0.ptr), isl.isl_pw_aff_list_copy(arg1.ptr))
obj = pw_aff_list(ctx=ctx, ptr=res)
return obj
def drop(arg0, arg1, arg2):
try:
if not arg0.__class__ is pw_aff_list:
arg0 = pw_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_list_drop(isl.isl_pw_aff_list_copy(arg0.ptr), arg1, arg2)
obj = pw_aff_list(ctx=ctx, ptr=res)
return obj
def foreach(arg0, arg1):
try:
if not arg0.__class__ is pw_aff_list:
arg0 = pw_aff_list(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = pw_aff(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_pw_aff_list_foreach(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
        if res < 0:
            raise Error
def insert(arg0, arg1, arg2):
try:
if not arg0.__class__ is pw_aff_list:
arg0 = pw_aff_list(arg0)
except:
raise
try:
if not arg2.__class__ is pw_aff:
arg2 = pw_aff(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_list_insert(isl.isl_pw_aff_list_copy(arg0.ptr), arg1, isl.isl_pw_aff_copy(arg2.ptr))
obj = pw_aff_list(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is pw_aff_list:
arg0 = pw_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_aff_list_size(arg0.ptr)
        if res < 0:
            raise Error
return int(res)
isl.isl_pw_aff_list_alloc.restype = c_void_p
isl.isl_pw_aff_list_alloc.argtypes = [Context, c_int]
isl.isl_pw_aff_list_from_pw_aff.restype = c_void_p
isl.isl_pw_aff_list_from_pw_aff.argtypes = [c_void_p]
isl.isl_pw_aff_list_read_from_str.restype = c_void_p
isl.isl_pw_aff_list_read_from_str.argtypes = [Context, c_char_p]
isl.isl_pw_aff_list_add.restype = c_void_p
isl.isl_pw_aff_list_add.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_list_get_at.restype = c_void_p
isl.isl_pw_aff_list_get_at.argtypes = [c_void_p, c_int]
isl.isl_pw_aff_list_clear.restype = c_void_p
isl.isl_pw_aff_list_clear.argtypes = [c_void_p]
isl.isl_pw_aff_list_concat.restype = c_void_p
isl.isl_pw_aff_list_concat.argtypes = [c_void_p, c_void_p]
isl.isl_pw_aff_list_drop.restype = c_void_p
isl.isl_pw_aff_list_drop.argtypes = [c_void_p, c_int, c_int]
isl.isl_pw_aff_list_foreach.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_pw_aff_list_insert.restype = c_void_p
isl.isl_pw_aff_list_insert.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_pw_aff_list_size.argtypes = [c_void_p]
isl.isl_pw_aff_list_copy.restype = c_void_p
isl.isl_pw_aff_list_copy.argtypes = [c_void_p]
isl.isl_pw_aff_list_free.restype = c_void_p
isl.isl_pw_aff_list_free.argtypes = [c_void_p]
isl.isl_pw_aff_list_to_str.restype = POINTER(c_char)
isl.isl_pw_aff_list_to_str.argtypes = [c_void_p]
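# Illustrative usage of pw_aff_list (a sketch, not part of the generated
# bindings; the exact piecewise-affine string syntax follows the isl manual,
# and pw_aff is assumed to be constructible from such a string):
#   lst = pw_aff_list(pw_aff("{ [i] -> [i + 1] }"))
#   lst = lst.add(pw_aff("{ [i] -> [2 * i] }"))
#   n = lst.size()                      # number of elements
#   first = lst.at(0)                   # fetch an element by position
#   lst.foreach(lambda pa: print(pa))   # visit each element in order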
class pw_multi_aff_list(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_multi_aff_list_alloc(self.ctx, args[0])
return
if len(args) == 1 and args[0].__class__ is pw_multi_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_multi_aff_list_from_pw_multi_aff(isl.isl_pw_multi_aff_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_pw_multi_aff_list_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_pw_multi_aff_list_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is pw_multi_aff_list:
arg0 = pw_multi_aff_list(arg0)
except:
raise
ptr = isl.isl_pw_multi_aff_list_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.pw_multi_aff_list("""%s""")' % s
else:
return 'isl.pw_multi_aff_list("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff_list:
arg0 = pw_multi_aff_list(arg0)
except:
raise
try:
if not arg1.__class__ is pw_multi_aff:
arg1 = pw_multi_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_list_add(isl.isl_pw_multi_aff_list_copy(arg0.ptr), isl.isl_pw_multi_aff_copy(arg1.ptr))
obj = pw_multi_aff_list(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff_list:
arg0 = pw_multi_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_list_get_at(arg0.ptr, arg1)
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def clear(arg0):
try:
if not arg0.__class__ is pw_multi_aff_list:
arg0 = pw_multi_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_list_clear(isl.isl_pw_multi_aff_list_copy(arg0.ptr))
obj = pw_multi_aff_list(ctx=ctx, ptr=res)
return obj
def concat(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff_list:
arg0 = pw_multi_aff_list(arg0)
except:
raise
try:
if not arg1.__class__ is pw_multi_aff_list:
arg1 = pw_multi_aff_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_list_concat(isl.isl_pw_multi_aff_list_copy(arg0.ptr), isl.isl_pw_multi_aff_list_copy(arg1.ptr))
obj = pw_multi_aff_list(ctx=ctx, ptr=res)
return obj
def drop(arg0, arg1, arg2):
try:
if not arg0.__class__ is pw_multi_aff_list:
arg0 = pw_multi_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_list_drop(isl.isl_pw_multi_aff_list_copy(arg0.ptr), arg1, arg2)
obj = pw_multi_aff_list(ctx=ctx, ptr=res)
return obj
def foreach(arg0, arg1):
try:
if not arg0.__class__ is pw_multi_aff_list:
arg0 = pw_multi_aff_list(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = pw_multi_aff(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_list_foreach(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
raise Error
def insert(arg0, arg1, arg2):
try:
if not arg0.__class__ is pw_multi_aff_list:
arg0 = pw_multi_aff_list(arg0)
except:
raise
try:
if not arg2.__class__ is pw_multi_aff:
arg2 = pw_multi_aff(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_list_insert(isl.isl_pw_multi_aff_list_copy(arg0.ptr), arg1, isl.isl_pw_multi_aff_copy(arg2.ptr))
obj = pw_multi_aff_list(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is pw_multi_aff_list:
arg0 = pw_multi_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_pw_multi_aff_list_size(arg0.ptr)
if res < 0:
raise Error
return int(res)
isl.isl_pw_multi_aff_list_alloc.restype = c_void_p
isl.isl_pw_multi_aff_list_alloc.argtypes = [Context, c_int]
isl.isl_pw_multi_aff_list_from_pw_multi_aff.restype = c_void_p
isl.isl_pw_multi_aff_list_from_pw_multi_aff.argtypes = [c_void_p]
isl.isl_pw_multi_aff_list_read_from_str.restype = c_void_p
isl.isl_pw_multi_aff_list_read_from_str.argtypes = [Context, c_char_p]
isl.isl_pw_multi_aff_list_add.restype = c_void_p
isl.isl_pw_multi_aff_list_add.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_list_get_at.restype = c_void_p
isl.isl_pw_multi_aff_list_get_at.argtypes = [c_void_p, c_int]
isl.isl_pw_multi_aff_list_clear.restype = c_void_p
isl.isl_pw_multi_aff_list_clear.argtypes = [c_void_p]
isl.isl_pw_multi_aff_list_concat.restype = c_void_p
isl.isl_pw_multi_aff_list_concat.argtypes = [c_void_p, c_void_p]
isl.isl_pw_multi_aff_list_drop.restype = c_void_p
isl.isl_pw_multi_aff_list_drop.argtypes = [c_void_p, c_int, c_int]
isl.isl_pw_multi_aff_list_foreach.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_pw_multi_aff_list_insert.restype = c_void_p
isl.isl_pw_multi_aff_list_insert.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_pw_multi_aff_list_size.argtypes = [c_void_p]
isl.isl_pw_multi_aff_list_copy.restype = c_void_p
isl.isl_pw_multi_aff_list_copy.argtypes = [c_void_p]
isl.isl_pw_multi_aff_list_free.restype = c_void_p
isl.isl_pw_multi_aff_list_free.argtypes = [c_void_p]
isl.isl_pw_multi_aff_list_to_str.restype = POINTER(c_char)
isl.isl_pw_multi_aff_list_to_str.argtypes = [c_void_p]
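# Illustrative usage of pw_multi_aff_list (sketch; pw_multi_aff string
# syntax per the isl manual):
#   a = pw_multi_aff_list(pw_multi_aff("{ [i] -> [i, i + 1] }"))
#   b = a.concat(a)    # b holds the elements of a twice
#   b = b.drop(0, 1)   # drop 1 element starting at position 0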
class schedule(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_schedule_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is schedule:
arg0 = schedule(arg0)
except:
raise
ptr = isl.isl_schedule_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule("""%s""")' % s
else:
return 'isl.schedule("%s")' % s
def domain(arg0):
try:
if not arg0.__class__ is schedule:
arg0 = schedule(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_get_domain(arg0.ptr)
obj = union_set(ctx=ctx, ptr=res)
return obj
def get_domain(arg0):
return arg0.domain()
@staticmethod
def from_domain(arg0):
try:
if not arg0.__class__ is union_set:
arg0 = union_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_from_domain(isl.isl_union_set_copy(arg0.ptr))
obj = schedule(ctx=ctx, ptr=res)
return obj
def map(arg0):
try:
if not arg0.__class__ is schedule:
arg0 = schedule(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_get_map(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_map(arg0):
return arg0.map()
def pullback(*args):
if len(args) == 2 and args[1].__class__ is union_pw_multi_aff:
ctx = args[0].ctx
res = isl.isl_schedule_pullback_union_pw_multi_aff(isl.isl_schedule_copy(args[0].ptr), isl.isl_union_pw_multi_aff_copy(args[1].ptr))
obj = schedule(ctx=ctx, ptr=res)
return obj
raise Error
def root(arg0):
try:
if not arg0.__class__ is schedule:
arg0 = schedule(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_get_root(arg0.ptr)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def get_root(arg0):
return arg0.root()
isl.isl_schedule_read_from_str.restype = c_void_p
isl.isl_schedule_read_from_str.argtypes = [Context, c_char_p]
isl.isl_schedule_get_domain.restype = c_void_p
isl.isl_schedule_get_domain.argtypes = [c_void_p]
isl.isl_schedule_from_domain.restype = c_void_p
isl.isl_schedule_from_domain.argtypes = [c_void_p]
isl.isl_schedule_get_map.restype = c_void_p
isl.isl_schedule_get_map.argtypes = [c_void_p]
isl.isl_schedule_pullback_union_pw_multi_aff.restype = c_void_p
isl.isl_schedule_pullback_union_pw_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_get_root.restype = c_void_p
isl.isl_schedule_get_root.argtypes = [c_void_p]
isl.isl_schedule_copy.restype = c_void_p
isl.isl_schedule_copy.argtypes = [c_void_p]
isl.isl_schedule_free.restype = c_void_p
isl.isl_schedule_free.argtypes = [c_void_p]
isl.isl_schedule_to_str.restype = POINTER(c_char)
isl.isl_schedule_to_str.argtypes = [c_void_p]
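# Illustrative usage of schedule (sketch; schedules are read from the
# YAML-like schedule-tree format described in the isl manual):
#   s = schedule('domain: "{ S[i] : 0 <= i < 10 }"')
#   d = s.domain()   # union_set of statement instances
#   r = s.root()     # schedule_node at the root of the schedule tree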
class schedule_constraints(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_schedule_constraints_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_constraints_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
ptr = isl.isl_schedule_constraints_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_constraints("""%s""")' % s
else:
return 'isl.schedule_constraints("%s")' % s
def coincidence(arg0):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_get_coincidence(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_coincidence(arg0):
return arg0.coincidence()
def compute_schedule(arg0):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_compute_schedule(isl.isl_schedule_constraints_copy(arg0.ptr))
obj = schedule(ctx=ctx, ptr=res)
return obj
def conditional_validity(arg0):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_get_conditional_validity(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_conditional_validity(arg0):
return arg0.conditional_validity()
def conditional_validity_condition(arg0):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_get_conditional_validity_condition(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_conditional_validity_condition(arg0):
return arg0.conditional_validity_condition()
def context(arg0):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_get_context(arg0.ptr)
obj = set(ctx=ctx, ptr=res)
return obj
def get_context(arg0):
return arg0.context()
def domain(arg0):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_get_domain(arg0.ptr)
obj = union_set(ctx=ctx, ptr=res)
return obj
def get_domain(arg0):
return arg0.domain()
@staticmethod
def on_domain(arg0):
try:
if not arg0.__class__ is union_set:
arg0 = union_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_on_domain(isl.isl_union_set_copy(arg0.ptr))
obj = schedule_constraints(ctx=ctx, ptr=res)
return obj
def proximity(arg0):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_get_proximity(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_proximity(arg0):
return arg0.proximity()
def set_coincidence(arg0, arg1):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_set_coincidence(isl.isl_schedule_constraints_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = schedule_constraints(ctx=ctx, ptr=res)
return obj
def set_conditional_validity(arg0, arg1, arg2):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
try:
if not arg2.__class__ is union_map:
arg2 = union_map(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_set_conditional_validity(isl.isl_schedule_constraints_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr), isl.isl_union_map_copy(arg2.ptr))
obj = schedule_constraints(ctx=ctx, ptr=res)
return obj
def set_context(arg0, arg1):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_set_context(isl.isl_schedule_constraints_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = schedule_constraints(ctx=ctx, ptr=res)
return obj
def set_proximity(arg0, arg1):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_set_proximity(isl.isl_schedule_constraints_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = schedule_constraints(ctx=ctx, ptr=res)
return obj
def set_validity(arg0, arg1):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_set_validity(isl.isl_schedule_constraints_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = schedule_constraints(ctx=ctx, ptr=res)
return obj
def validity(arg0):
try:
if not arg0.__class__ is schedule_constraints:
arg0 = schedule_constraints(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_constraints_get_validity(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_validity(arg0):
return arg0.validity()
isl.isl_schedule_constraints_read_from_str.restype = c_void_p
isl.isl_schedule_constraints_read_from_str.argtypes = [Context, c_char_p]
isl.isl_schedule_constraints_get_coincidence.restype = c_void_p
isl.isl_schedule_constraints_get_coincidence.argtypes = [c_void_p]
isl.isl_schedule_constraints_compute_schedule.restype = c_void_p
isl.isl_schedule_constraints_compute_schedule.argtypes = [c_void_p]
isl.isl_schedule_constraints_get_conditional_validity.restype = c_void_p
isl.isl_schedule_constraints_get_conditional_validity.argtypes = [c_void_p]
isl.isl_schedule_constraints_get_conditional_validity_condition.restype = c_void_p
isl.isl_schedule_constraints_get_conditional_validity_condition.argtypes = [c_void_p]
isl.isl_schedule_constraints_get_context.restype = c_void_p
isl.isl_schedule_constraints_get_context.argtypes = [c_void_p]
isl.isl_schedule_constraints_get_domain.restype = c_void_p
isl.isl_schedule_constraints_get_domain.argtypes = [c_void_p]
isl.isl_schedule_constraints_on_domain.restype = c_void_p
isl.isl_schedule_constraints_on_domain.argtypes = [c_void_p]
isl.isl_schedule_constraints_get_proximity.restype = c_void_p
isl.isl_schedule_constraints_get_proximity.argtypes = [c_void_p]
isl.isl_schedule_constraints_set_coincidence.restype = c_void_p
isl.isl_schedule_constraints_set_coincidence.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_constraints_set_conditional_validity.restype = c_void_p
isl.isl_schedule_constraints_set_conditional_validity.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_schedule_constraints_set_context.restype = c_void_p
isl.isl_schedule_constraints_set_context.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_constraints_set_proximity.restype = c_void_p
isl.isl_schedule_constraints_set_proximity.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_constraints_set_validity.restype = c_void_p
isl.isl_schedule_constraints_set_validity.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_constraints_get_validity.restype = c_void_p
isl.isl_schedule_constraints_get_validity.argtypes = [c_void_p]
isl.isl_schedule_constraints_copy.restype = c_void_p
isl.isl_schedule_constraints_copy.argtypes = [c_void_p]
isl.isl_schedule_constraints_free.restype = c_void_p
isl.isl_schedule_constraints_free.argtypes = [c_void_p]
isl.isl_schedule_constraints_to_str.restype = POINTER(c_char)
isl.isl_schedule_constraints_to_str.argtypes = [c_void_p]
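# Illustrative usage of schedule_constraints (sketch; union_set and
# union_map are defined elsewhere in this module):
#   dom = union_set("{ S[i] : 0 <= i < 100 }")
#   dep = union_map("{ S[i] -> S[i + 1] : 0 <= i < 99 }")
#   sc = schedule_constraints.on_domain(dom).set_validity(dep)
#   s = sc.compute_schedule()   # computed schedule respects the dependences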
class schedule_node(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and isinstance(args[0], schedule_node_band):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], schedule_node_context):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], schedule_node_domain):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], schedule_node_expansion):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], schedule_node_extension):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], schedule_node_filter):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], schedule_node_leaf):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], schedule_node_guard):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], schedule_node_mark):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], schedule_node_sequence):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
if len(args) == 1 and isinstance(args[0], schedule_node_set):
self.ctx = args[0].ctx
self.ptr = isl.isl_schedule_node_copy(args[0].ptr)
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
if "ptr" in keywords:
type = isl.isl_schedule_node_get_type(keywords["ptr"])
if type == 0:
return schedule_node_band(**keywords)
if type == 1:
return schedule_node_context(**keywords)
if type == 2:
return schedule_node_domain(**keywords)
if type == 3:
return schedule_node_expansion(**keywords)
if type == 4:
return schedule_node_extension(**keywords)
if type == 5:
return schedule_node_filter(**keywords)
if type == 6:
return schedule_node_leaf(**keywords)
if type == 7:
return schedule_node_guard(**keywords)
if type == 8:
return schedule_node_mark(**keywords)
if type == 9:
return schedule_node_sequence(**keywords)
if type == 10:
return schedule_node_set(**keywords)
raise Error
return super(schedule_node, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node("""%s""")' % s
else:
return 'isl.schedule_node("%s")' % s
def ancestor(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_ancestor(isl.isl_schedule_node_copy(arg0.ptr), arg1)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def ancestor_child_position(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is schedule_node:
arg1 = schedule_node(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_get_ancestor_child_position(arg0.ptr, arg1.ptr)
if res < 0:
raise Error
return int(res)
def get_ancestor_child_position(arg0, arg1):
return arg0.ancestor_child_position(arg1)
def child(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_child(isl.isl_schedule_node_copy(arg0.ptr), arg1)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def child_position(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_get_child_position(arg0.ptr)
if res < 0:
raise Error
return int(res)
def get_child_position(arg0):
return arg0.child_position()
def every_descendant(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = schedule_node(ctx=arg0.ctx, ptr=isl.isl_schedule_node_copy(cb_arg0))
try:
res = arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 1 if res else 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_schedule_node_every_descendant(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
raise Error
return bool(res)
def first_child(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_first_child(isl.isl_schedule_node_copy(arg0.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def foreach_ancestor_top_down(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = schedule_node(ctx=arg0.ctx, ptr=isl.isl_schedule_node_copy(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_schedule_node_foreach_ancestor_top_down(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
raise Error
def foreach_descendant_top_down(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = schedule_node(ctx=arg0.ctx, ptr=isl.isl_schedule_node_copy(cb_arg0))
try:
res = arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 1 if res else 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_schedule_node_foreach_descendant_top_down(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
raise Error
@staticmethod
def from_domain(arg0):
try:
if not arg0.__class__ is union_set:
arg0 = union_set(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_from_domain(isl.isl_union_set_copy(arg0.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
@staticmethod
def from_extension(arg0):
try:
if not arg0.__class__ is union_map:
arg0 = union_map(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_from_extension(isl.isl_union_map_copy(arg0.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def graft_after(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is schedule_node:
arg1 = schedule_node(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_graft_after(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_schedule_node_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def graft_before(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is schedule_node:
arg1 = schedule_node(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_graft_before(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_schedule_node_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def has_children(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_has_children(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def has_next_sibling(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_has_next_sibling(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def has_parent(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_has_parent(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def has_previous_sibling(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_has_previous_sibling(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def insert_context(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_insert_context(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def insert_filter(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_insert_filter(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def insert_guard(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_insert_guard(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def insert_mark(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is id:
arg1 = id(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_insert_mark(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_id_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def insert_partial_schedule(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is multi_union_pw_aff:
arg1 = multi_union_pw_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_insert_partial_schedule(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_multi_union_pw_aff_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def insert_sequence(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is union_set_list:
arg1 = union_set_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_insert_sequence(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_union_set_list_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def insert_set(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is union_set_list:
arg1 = union_set_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_insert_set(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_union_set_list_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def is_equal(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is schedule_node:
arg1 = schedule_node(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_is_equal(arg0.ptr, arg1.ptr)
if res < 0:
raise Error
return bool(res)
def is_subtree_anchored(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_is_subtree_anchored(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def map_descendant_bottom_up(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_void_p, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = schedule_node(ctx=arg0.ctx, ptr=(cb_arg0))
try:
res = arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return None
return isl.isl_schedule_node_copy(res.ptr)
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_schedule_node_map_descendant_bottom_up(isl.isl_schedule_node_copy(arg0.ptr), cb, None)
if exc_info[0] is not None:
raise exc_info[0]
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def n_children(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_n_children(arg0.ptr)
if res < 0:
raise Error
return int(res)
def next_sibling(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_next_sibling(isl.isl_schedule_node_copy(arg0.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def order_after(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_order_after(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def order_before(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_order_before(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def parent(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_parent(isl.isl_schedule_node_copy(arg0.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def prefix_schedule_multi_union_pw_aff(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_get_prefix_schedule_multi_union_pw_aff(arg0.ptr)
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def get_prefix_schedule_multi_union_pw_aff(arg0):
return arg0.prefix_schedule_multi_union_pw_aff()
def prefix_schedule_union_map(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_get_prefix_schedule_union_map(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_prefix_schedule_union_map(arg0):
return arg0.prefix_schedule_union_map()
def prefix_schedule_union_pw_multi_aff(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_get_prefix_schedule_union_pw_multi_aff(arg0.ptr)
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def get_prefix_schedule_union_pw_multi_aff(arg0):
return arg0.prefix_schedule_union_pw_multi_aff()
def previous_sibling(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_previous_sibling(isl.isl_schedule_node_copy(arg0.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def root(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_root(isl.isl_schedule_node_copy(arg0.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def schedule(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_get_schedule(arg0.ptr)
obj = schedule(ctx=ctx, ptr=res)
return obj
def get_schedule(arg0):
return arg0.schedule()
def shared_ancestor(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is schedule_node:
arg1 = schedule_node(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_get_shared_ancestor(arg0.ptr, arg1.ptr)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def get_shared_ancestor(arg0, arg1):
return arg0.shared_ancestor(arg1)
def tree_depth(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_get_tree_depth(arg0.ptr)
        if res < 0:
            raise Error
return int(res)
def get_tree_depth(arg0):
return arg0.tree_depth()
isl.isl_schedule_node_ancestor.restype = c_void_p
isl.isl_schedule_node_ancestor.argtypes = [c_void_p, c_int]
isl.isl_schedule_node_get_ancestor_child_position.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_child.restype = c_void_p
isl.isl_schedule_node_child.argtypes = [c_void_p, c_int]
isl.isl_schedule_node_get_child_position.argtypes = [c_void_p]
isl.isl_schedule_node_every_descendant.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_schedule_node_first_child.restype = c_void_p
isl.isl_schedule_node_first_child.argtypes = [c_void_p]
isl.isl_schedule_node_foreach_ancestor_top_down.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_schedule_node_foreach_descendant_top_down.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_schedule_node_from_domain.restype = c_void_p
isl.isl_schedule_node_from_domain.argtypes = [c_void_p]
isl.isl_schedule_node_from_extension.restype = c_void_p
isl.isl_schedule_node_from_extension.argtypes = [c_void_p]
isl.isl_schedule_node_graft_after.restype = c_void_p
isl.isl_schedule_node_graft_after.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_graft_before.restype = c_void_p
isl.isl_schedule_node_graft_before.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_has_children.argtypes = [c_void_p]
isl.isl_schedule_node_has_next_sibling.argtypes = [c_void_p]
isl.isl_schedule_node_has_parent.argtypes = [c_void_p]
isl.isl_schedule_node_has_previous_sibling.argtypes = [c_void_p]
isl.isl_schedule_node_insert_context.restype = c_void_p
isl.isl_schedule_node_insert_context.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_insert_filter.restype = c_void_p
isl.isl_schedule_node_insert_filter.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_insert_guard.restype = c_void_p
isl.isl_schedule_node_insert_guard.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_insert_mark.restype = c_void_p
isl.isl_schedule_node_insert_mark.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_insert_partial_schedule.restype = c_void_p
isl.isl_schedule_node_insert_partial_schedule.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_insert_sequence.restype = c_void_p
isl.isl_schedule_node_insert_sequence.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_insert_set.restype = c_void_p
isl.isl_schedule_node_insert_set.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_is_subtree_anchored.argtypes = [c_void_p]
isl.isl_schedule_node_map_descendant_bottom_up.restype = c_void_p
isl.isl_schedule_node_map_descendant_bottom_up.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_schedule_node_n_children.argtypes = [c_void_p]
isl.isl_schedule_node_next_sibling.restype = c_void_p
isl.isl_schedule_node_next_sibling.argtypes = [c_void_p]
isl.isl_schedule_node_order_after.restype = c_void_p
isl.isl_schedule_node_order_after.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_order_before.restype = c_void_p
isl.isl_schedule_node_order_before.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_parent.restype = c_void_p
isl.isl_schedule_node_parent.argtypes = [c_void_p]
isl.isl_schedule_node_get_prefix_schedule_multi_union_pw_aff.restype = c_void_p
isl.isl_schedule_node_get_prefix_schedule_multi_union_pw_aff.argtypes = [c_void_p]
isl.isl_schedule_node_get_prefix_schedule_union_map.restype = c_void_p
isl.isl_schedule_node_get_prefix_schedule_union_map.argtypes = [c_void_p]
isl.isl_schedule_node_get_prefix_schedule_union_pw_multi_aff.restype = c_void_p
isl.isl_schedule_node_get_prefix_schedule_union_pw_multi_aff.argtypes = [c_void_p]
isl.isl_schedule_node_previous_sibling.restype = c_void_p
isl.isl_schedule_node_previous_sibling.argtypes = [c_void_p]
isl.isl_schedule_node_root.restype = c_void_p
isl.isl_schedule_node_root.argtypes = [c_void_p]
isl.isl_schedule_node_get_schedule.restype = c_void_p
isl.isl_schedule_node_get_schedule.argtypes = [c_void_p]
isl.isl_schedule_node_get_shared_ancestor.restype = c_void_p
isl.isl_schedule_node_get_shared_ancestor.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_get_tree_depth.argtypes = [c_void_p]
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
isl.isl_schedule_node_get_type.argtypes = [c_void_p]
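# Usage sketch (hand-written, illustrative; not part of the generated bindings):
# navigating a schedule tree with the schedule_node methods bound above.
# Assumes `node` is a schedule_node obtained from an isl.schedule.
#
#   node = node.first_child()      # descend to the first child
#   node = node.parent()           # move back up one level
#   depth = node.tree_depth()      # 0 at the root
#   node = node.root()             # jump back to the root of the tree
#
# Navigation methods copy the underlying isl_schedule_node pointer before
# calling into isl, so the original Python object stays valid after the call.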
class schedule_node_band(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_band, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_band:
arg0 = schedule_node_band(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_band("""%s""")' % s
else:
return 'isl.schedule_node_band("%s")' % s
def ast_build_options(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_get_ast_build_options(arg0.ptr)
obj = union_set(ctx=ctx, ptr=res)
return obj
def get_ast_build_options(arg0):
return arg0.ast_build_options()
def ast_isolate_option(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_get_ast_isolate_option(arg0.ptr)
obj = set(ctx=ctx, ptr=res)
return obj
def get_ast_isolate_option(arg0):
return arg0.ast_isolate_option()
def member_get_coincident(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_member_get_coincident(arg0.ptr, arg1)
        if res < 0:
            raise Error
return bool(res)
def member_set_coincident(arg0, arg1, arg2):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_member_set_coincident(isl.isl_schedule_node_copy(arg0.ptr), arg1, arg2)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def mod(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_mod(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def n_member(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_n_member(arg0.ptr)
        if res < 0:
            raise Error
return int(res)
def partial_schedule(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_get_partial_schedule(arg0.ptr)
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def get_partial_schedule(arg0):
return arg0.partial_schedule()
def permutable(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_get_permutable(arg0.ptr)
        if res < 0:
            raise Error
return bool(res)
def get_permutable(arg0):
return arg0.permutable()
def scale(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_scale(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def scale_down(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_scale_down(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def set_ast_build_options(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_set_ast_build_options(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def set_permutable(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_set_permutable(isl.isl_schedule_node_copy(arg0.ptr), arg1)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def shift(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is multi_union_pw_aff:
arg1 = multi_union_pw_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_shift(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_multi_union_pw_aff_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def split(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_split(isl.isl_schedule_node_copy(arg0.ptr), arg1)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def tile(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
try:
if not arg1.__class__ is multi_val:
arg1 = multi_val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_tile(isl.isl_schedule_node_copy(arg0.ptr), isl.isl_multi_val_copy(arg1.ptr))
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def member_set_ast_loop_default(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_member_set_ast_loop_type(isl.isl_schedule_node_copy(arg0.ptr), arg1, 0)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def member_set_ast_loop_atomic(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_member_set_ast_loop_type(isl.isl_schedule_node_copy(arg0.ptr), arg1, 1)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def member_set_ast_loop_unroll(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_member_set_ast_loop_type(isl.isl_schedule_node_copy(arg0.ptr), arg1, 2)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
def member_set_ast_loop_separate(arg0, arg1):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_band_member_set_ast_loop_type(isl.isl_schedule_node_copy(arg0.ptr), arg1, 3)
obj = schedule_node(ctx=ctx, ptr=res)
return obj
isl.isl_schedule_node_band_get_ast_build_options.restype = c_void_p
isl.isl_schedule_node_band_get_ast_build_options.argtypes = [c_void_p]
isl.isl_schedule_node_band_get_ast_isolate_option.restype = c_void_p
isl.isl_schedule_node_band_get_ast_isolate_option.argtypes = [c_void_p]
isl.isl_schedule_node_band_member_get_coincident.argtypes = [c_void_p, c_int]
isl.isl_schedule_node_band_member_set_coincident.restype = c_void_p
isl.isl_schedule_node_band_member_set_coincident.argtypes = [c_void_p, c_int, c_int]
isl.isl_schedule_node_band_mod.restype = c_void_p
isl.isl_schedule_node_band_mod.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_band_n_member.argtypes = [c_void_p]
isl.isl_schedule_node_band_get_partial_schedule.restype = c_void_p
isl.isl_schedule_node_band_get_partial_schedule.argtypes = [c_void_p]
isl.isl_schedule_node_band_get_permutable.argtypes = [c_void_p]
isl.isl_schedule_node_band_scale.restype = c_void_p
isl.isl_schedule_node_band_scale.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_band_scale_down.restype = c_void_p
isl.isl_schedule_node_band_scale_down.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_band_set_ast_build_options.restype = c_void_p
isl.isl_schedule_node_band_set_ast_build_options.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_band_set_permutable.restype = c_void_p
isl.isl_schedule_node_band_set_permutable.argtypes = [c_void_p, c_int]
isl.isl_schedule_node_band_shift.restype = c_void_p
isl.isl_schedule_node_band_shift.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_band_split.restype = c_void_p
isl.isl_schedule_node_band_split.argtypes = [c_void_p, c_int]
isl.isl_schedule_node_band_tile.restype = c_void_p
isl.isl_schedule_node_band_tile.argtypes = [c_void_p, c_void_p]
isl.isl_schedule_node_band_member_set_ast_loop_type.restype = c_void_p
isl.isl_schedule_node_band_member_set_ast_loop_type.argtypes = [c_void_p, c_int, c_int]
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
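# Usage sketch (hand-written, illustrative; not part of the generated bindings):
# tiling a band node. Assumes `band` is a schedule_node_band; the multi_val
# string notation below is an assumption, not taken from this file.
#
#   sizes = multi_val("{ [32, 32] }")   # one tile size per band member
#   band = band.tile(sizes)             # isl_schedule_node_band_tile
#   band = band.set_permutable(1)       # mark the resulting band permutable
#
# The result of tile() is constructed through schedule_node, which dispatches
# on isl_schedule_node_get_type, so band methods remain available on it.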
class schedule_node_context(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_context, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_context:
arg0 = schedule_node_context(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_context("""%s""")' % s
else:
return 'isl.schedule_node_context("%s")' % s
def context(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_context_get_context(arg0.ptr)
obj = set(ctx=ctx, ptr=res)
return obj
def get_context(arg0):
return arg0.context()
isl.isl_schedule_node_context_get_context.restype = c_void_p
isl.isl_schedule_node_context_get_context.argtypes = [c_void_p]
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
class schedule_node_domain(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_domain, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_domain:
arg0 = schedule_node_domain(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_domain("""%s""")' % s
else:
return 'isl.schedule_node_domain("%s")' % s
def domain(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_domain_get_domain(arg0.ptr)
obj = union_set(ctx=ctx, ptr=res)
return obj
def get_domain(arg0):
return arg0.domain()
isl.isl_schedule_node_domain_get_domain.restype = c_void_p
isl.isl_schedule_node_domain_get_domain.argtypes = [c_void_p]
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
class schedule_node_expansion(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_expansion, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_expansion:
arg0 = schedule_node_expansion(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_expansion("""%s""")' % s
else:
return 'isl.schedule_node_expansion("%s")' % s
def contraction(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_expansion_get_contraction(arg0.ptr)
obj = union_pw_multi_aff(ctx=ctx, ptr=res)
return obj
def get_contraction(arg0):
return arg0.contraction()
def expansion(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_expansion_get_expansion(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_expansion(arg0):
return arg0.expansion()
isl.isl_schedule_node_expansion_get_contraction.restype = c_void_p
isl.isl_schedule_node_expansion_get_contraction.argtypes = [c_void_p]
isl.isl_schedule_node_expansion_get_expansion.restype = c_void_p
isl.isl_schedule_node_expansion_get_expansion.argtypes = [c_void_p]
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
class schedule_node_extension(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_extension, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_extension:
arg0 = schedule_node_extension(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_extension("""%s""")' % s
else:
return 'isl.schedule_node_extension("%s")' % s
def extension(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_extension_get_extension(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_extension(arg0):
return arg0.extension()
isl.isl_schedule_node_extension_get_extension.restype = c_void_p
isl.isl_schedule_node_extension_get_extension.argtypes = [c_void_p]
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
class schedule_node_filter(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_filter, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_filter:
arg0 = schedule_node_filter(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_filter("""%s""")' % s
else:
return 'isl.schedule_node_filter("%s")' % s
def filter(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_filter_get_filter(arg0.ptr)
obj = union_set(ctx=ctx, ptr=res)
return obj
def get_filter(arg0):
return arg0.filter()
isl.isl_schedule_node_filter_get_filter.restype = c_void_p
isl.isl_schedule_node_filter_get_filter.argtypes = [c_void_p]
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
class schedule_node_guard(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_guard, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_guard:
arg0 = schedule_node_guard(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_guard("""%s""")' % s
else:
return 'isl.schedule_node_guard("%s")' % s
def guard(arg0):
try:
if not arg0.__class__ is schedule_node:
arg0 = schedule_node(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_schedule_node_guard_get_guard(arg0.ptr)
obj = set(ctx=ctx, ptr=res)
return obj
def get_guard(arg0):
return arg0.guard()
isl.isl_schedule_node_guard_get_guard.restype = c_void_p
isl.isl_schedule_node_guard_get_guard.argtypes = [c_void_p]
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
class schedule_node_leaf(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_leaf, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_leaf:
arg0 = schedule_node_leaf(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_leaf("""%s""")' % s
else:
return 'isl.schedule_node_leaf("%s")' % s
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
class schedule_node_mark(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_mark, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_mark:
arg0 = schedule_node_mark(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_mark("""%s""")' % s
else:
return 'isl.schedule_node_mark("%s")' % s
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
class schedule_node_sequence(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_sequence, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_sequence:
arg0 = schedule_node_sequence(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_sequence("""%s""")' % s
else:
return 'isl.schedule_node_sequence("%s")' % s
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
class schedule_node_set(schedule_node):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_schedule_node_free(self.ptr)
def __new__(cls, *args, **keywords):
return super(schedule_node_set, cls).__new__(cls)
def __str__(arg0):
try:
if not arg0.__class__ is schedule_node_set:
arg0 = schedule_node_set(arg0)
except:
raise
ptr = isl.isl_schedule_node_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.schedule_node_set("""%s""")' % s
else:
return 'isl.schedule_node_set("%s")' % s
isl.isl_schedule_node_copy.restype = c_void_p
isl.isl_schedule_node_copy.argtypes = [c_void_p]
isl.isl_schedule_node_free.restype = c_void_p
isl.isl_schedule_node_free.argtypes = [c_void_p]
isl.isl_schedule_node_to_str.restype = POINTER(c_char)
isl.isl_schedule_node_to_str.argtypes = [c_void_p]
class set_list(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_set_list_alloc(self.ctx, args[0])
return
if len(args) == 1 and args[0].__class__ is set:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_set_list_from_set(isl.isl_set_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_set_list_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_set_list_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is set_list:
arg0 = set_list(arg0)
except:
raise
ptr = isl.isl_set_list_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.set_list("""%s""")' % s
else:
return 'isl.set_list("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is set_list:
arg0 = set_list(arg0)
except:
raise
try:
if not arg1.__class__ is set:
arg1 = set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_list_add(isl.isl_set_list_copy(arg0.ptr), isl.isl_set_copy(arg1.ptr))
obj = set_list(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is set_list:
arg0 = set_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_list_get_at(arg0.ptr, arg1)
obj = set(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def clear(arg0):
try:
if not arg0.__class__ is set_list:
arg0 = set_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_list_clear(isl.isl_set_list_copy(arg0.ptr))
obj = set_list(ctx=ctx, ptr=res)
return obj
def concat(arg0, arg1):
try:
if not arg0.__class__ is set_list:
arg0 = set_list(arg0)
except:
raise
try:
if not arg1.__class__ is set_list:
arg1 = set_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_list_concat(isl.isl_set_list_copy(arg0.ptr), isl.isl_set_list_copy(arg1.ptr))
obj = set_list(ctx=ctx, ptr=res)
return obj
def drop(arg0, arg1, arg2):
try:
if not arg0.__class__ is set_list:
arg0 = set_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_list_drop(isl.isl_set_list_copy(arg0.ptr), arg1, arg2)
obj = set_list(ctx=ctx, ptr=res)
return obj
def foreach(arg0, arg1):
try:
if not arg0.__class__ is set_list:
arg0 = set_list(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = set(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_set_list_foreach(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
        if res < 0:
            raise Error
def insert(arg0, arg1, arg2):
try:
if not arg0.__class__ is set_list:
arg0 = set_list(arg0)
except:
raise
try:
if not arg2.__class__ is set:
arg2 = set(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_list_insert(isl.isl_set_list_copy(arg0.ptr), arg1, isl.isl_set_copy(arg2.ptr))
obj = set_list(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is set_list:
arg0 = set_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_set_list_size(arg0.ptr)
        if res < 0:
            raise Error
return int(res)
isl.isl_set_list_alloc.restype = c_void_p
isl.isl_set_list_alloc.argtypes = [Context, c_int]
isl.isl_set_list_from_set.restype = c_void_p
isl.isl_set_list_from_set.argtypes = [c_void_p]
isl.isl_set_list_read_from_str.restype = c_void_p
isl.isl_set_list_read_from_str.argtypes = [Context, c_char_p]
isl.isl_set_list_add.restype = c_void_p
isl.isl_set_list_add.argtypes = [c_void_p, c_void_p]
isl.isl_set_list_get_at.restype = c_void_p
isl.isl_set_list_get_at.argtypes = [c_void_p, c_int]
isl.isl_set_list_clear.restype = c_void_p
isl.isl_set_list_clear.argtypes = [c_void_p]
isl.isl_set_list_concat.restype = c_void_p
isl.isl_set_list_concat.argtypes = [c_void_p, c_void_p]
isl.isl_set_list_drop.restype = c_void_p
isl.isl_set_list_drop.argtypes = [c_void_p, c_int, c_int]
isl.isl_set_list_foreach.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_set_list_insert.restype = c_void_p
isl.isl_set_list_insert.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_set_list_size.argtypes = [c_void_p]
isl.isl_set_list_copy.restype = c_void_p
isl.isl_set_list_copy.argtypes = [c_void_p]
isl.isl_set_list_free.restype = c_void_p
isl.isl_set_list_free.argtypes = [c_void_p]
isl.isl_set_list_to_str.restype = POINTER(c_char)
isl.isl_set_list_to_str.argtypes = [c_void_p]
class space(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_space_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ptr = isl.isl_space_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.space("""%s""")' % s
else:
return 'isl.space("%s")' % s
def add_named_tuple(*args):
if len(args) == 3 and (args[1].__class__ is id or type(args[1]) == str) and type(args[2]) == int:
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_space_add_named_tuple_id_ui(isl.isl_space_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr), args[2])
obj = space(ctx=ctx, ptr=res)
return obj
raise Error
def add_param(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_space_add_param_id(isl.isl_space_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = space(ctx=ctx, ptr=res)
return obj
raise Error
def add_unnamed_tuple(*args):
if len(args) == 2 and type(args[1]) == int:
ctx = args[0].ctx
res = isl.isl_space_add_unnamed_tuple_ui(isl.isl_space_copy(args[0].ptr), args[1])
obj = space(ctx=ctx, ptr=res)
return obj
raise Error
def curry(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_curry(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def domain(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_domain(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def domain_map_multi_aff(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_domain_map_multi_aff(isl.isl_space_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def domain_map_pw_multi_aff(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_domain_map_pw_multi_aff(isl.isl_space_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def domain_tuple_id(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_get_domain_tuple_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_domain_tuple_id(arg0):
return arg0.domain_tuple_id()
def flatten_domain(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_flatten_domain(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def flatten_range(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_flatten_range(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def has_domain_tuple_id(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_has_domain_tuple_id(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def has_range_tuple_id(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_has_range_tuple_id(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def identity_multi_aff_on_domain(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_identity_multi_aff_on_domain(isl.isl_space_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def identity_multi_pw_aff_on_domain(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_identity_multi_pw_aff_on_domain(isl.isl_space_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def identity_pw_multi_aff_on_domain(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_identity_pw_multi_aff_on_domain(isl.isl_space_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def is_equal(arg0, arg1):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
try:
if not arg1.__class__ is space:
arg1 = space(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_is_equal(arg0.ptr, arg1.ptr)
if res < 0:
raise Error
return bool(res)
def is_wrapping(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_is_wrapping(arg0.ptr)
if res < 0:
raise Error
return bool(res)
def map_from_set(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_map_from_set(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def multi_aff(arg0, arg1):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
try:
if not arg1.__class__ is aff_list:
arg1 = aff_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_multi_aff(isl.isl_space_copy(arg0.ptr), isl.isl_aff_list_copy(arg1.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def multi_aff_on_domain(*args):
if len(args) == 2 and args[1].__class__ is multi_val:
ctx = args[0].ctx
res = isl.isl_space_multi_aff_on_domain_multi_val(isl.isl_space_copy(args[0].ptr), isl.isl_multi_val_copy(args[1].ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
raise Error
def multi_id(arg0, arg1):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
try:
if not arg1.__class__ is id_list:
arg1 = id_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_multi_id(isl.isl_space_copy(arg0.ptr), isl.isl_id_list_copy(arg1.ptr))
obj = multi_id(ctx=ctx, ptr=res)
return obj
def multi_pw_aff(arg0, arg1):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
try:
if not arg1.__class__ is pw_aff_list:
arg1 = pw_aff_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_multi_pw_aff(isl.isl_space_copy(arg0.ptr), isl.isl_pw_aff_list_copy(arg1.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def multi_union_pw_aff(arg0, arg1):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_aff_list:
arg1 = union_pw_aff_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_multi_union_pw_aff(isl.isl_space_copy(arg0.ptr), isl.isl_union_pw_aff_list_copy(arg1.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def multi_val(arg0, arg1):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
try:
if not arg1.__class__ is val_list:
arg1 = val_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_multi_val(isl.isl_space_copy(arg0.ptr), isl.isl_val_list_copy(arg1.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
def param_aff_on_domain(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_space_param_aff_on_domain_id(isl.isl_space_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
raise Error
def params(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_params(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def product(arg0, arg1):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
try:
if not arg1.__class__ is space:
arg1 = space(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_product(isl.isl_space_copy(arg0.ptr), isl.isl_space_copy(arg1.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def range(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_range(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def range_map_multi_aff(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_range_map_multi_aff(isl.isl_space_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def range_map_pw_multi_aff(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_range_map_pw_multi_aff(isl.isl_space_copy(arg0.ptr))
obj = pw_multi_aff(ctx=ctx, ptr=res)
return obj
def range_reverse(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_range_reverse(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def range_tuple_id(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_get_range_tuple_id(arg0.ptr)
obj = id(ctx=ctx, ptr=res)
return obj
def get_range_tuple_id(arg0):
return arg0.range_tuple_id()
def reverse(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_reverse(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def set_domain_tuple(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_space_set_domain_tuple_id(isl.isl_space_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = space(ctx=ctx, ptr=res)
return obj
raise Error
def set_range_tuple(*args):
if len(args) == 2 and (args[1].__class__ is id or type(args[1]) == str):
args = list(args)
try:
if not args[1].__class__ is id:
args[1] = id(args[1])
except:
raise
ctx = args[0].ctx
res = isl.isl_space_set_range_tuple_id(isl.isl_space_copy(args[0].ptr), isl.isl_id_copy(args[1].ptr))
obj = space(ctx=ctx, ptr=res)
return obj
raise Error
def uncurry(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_uncurry(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
@staticmethod
def unit():
ctx = Context.getDefaultInstance()
res = isl.isl_space_unit(ctx)
obj = space(ctx=ctx, ptr=res)
return obj
def universe_map(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_universe_map(isl.isl_space_copy(arg0.ptr))
obj = map(ctx=ctx, ptr=res)
return obj
def universe_set(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_universe_set(isl.isl_space_copy(arg0.ptr))
obj = set(ctx=ctx, ptr=res)
return obj
def unwrap(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_unwrap(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def wrap(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_wrap(isl.isl_space_copy(arg0.ptr))
obj = space(ctx=ctx, ptr=res)
return obj
def zero_aff_on_domain(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_zero_aff_on_domain(isl.isl_space_copy(arg0.ptr))
obj = aff(ctx=ctx, ptr=res)
return obj
def zero_multi_aff(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_zero_multi_aff(isl.isl_space_copy(arg0.ptr))
obj = multi_aff(ctx=ctx, ptr=res)
return obj
def zero_multi_pw_aff(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_zero_multi_pw_aff(isl.isl_space_copy(arg0.ptr))
obj = multi_pw_aff(ctx=ctx, ptr=res)
return obj
def zero_multi_union_pw_aff(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_zero_multi_union_pw_aff(isl.isl_space_copy(arg0.ptr))
obj = multi_union_pw_aff(ctx=ctx, ptr=res)
return obj
def zero_multi_val(arg0):
try:
if not arg0.__class__ is space:
arg0 = space(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_space_zero_multi_val(isl.isl_space_copy(arg0.ptr))
obj = multi_val(ctx=ctx, ptr=res)
return obj
isl.isl_space_add_named_tuple_id_ui.restype = c_void_p
isl.isl_space_add_named_tuple_id_ui.argtypes = [c_void_p, c_void_p, c_int]
isl.isl_space_add_param_id.restype = c_void_p
isl.isl_space_add_param_id.argtypes = [c_void_p, c_void_p]
isl.isl_space_add_unnamed_tuple_ui.restype = c_void_p
isl.isl_space_add_unnamed_tuple_ui.argtypes = [c_void_p, c_int]
isl.isl_space_curry.restype = c_void_p
isl.isl_space_curry.argtypes = [c_void_p]
isl.isl_space_domain.restype = c_void_p
isl.isl_space_domain.argtypes = [c_void_p]
isl.isl_space_domain_map_multi_aff.restype = c_void_p
isl.isl_space_domain_map_multi_aff.argtypes = [c_void_p]
isl.isl_space_domain_map_pw_multi_aff.restype = c_void_p
isl.isl_space_domain_map_pw_multi_aff.argtypes = [c_void_p]
isl.isl_space_get_domain_tuple_id.restype = c_void_p
isl.isl_space_get_domain_tuple_id.argtypes = [c_void_p]
isl.isl_space_flatten_domain.restype = c_void_p
isl.isl_space_flatten_domain.argtypes = [c_void_p]
isl.isl_space_flatten_range.restype = c_void_p
isl.isl_space_flatten_range.argtypes = [c_void_p]
isl.isl_space_has_domain_tuple_id.argtypes = [c_void_p]
isl.isl_space_has_range_tuple_id.argtypes = [c_void_p]
isl.isl_space_identity_multi_aff_on_domain.restype = c_void_p
isl.isl_space_identity_multi_aff_on_domain.argtypes = [c_void_p]
isl.isl_space_identity_multi_pw_aff_on_domain.restype = c_void_p
isl.isl_space_identity_multi_pw_aff_on_domain.argtypes = [c_void_p]
isl.isl_space_identity_pw_multi_aff_on_domain.restype = c_void_p
isl.isl_space_identity_pw_multi_aff_on_domain.argtypes = [c_void_p]
isl.isl_space_is_equal.argtypes = [c_void_p, c_void_p]
isl.isl_space_is_wrapping.argtypes = [c_void_p]
isl.isl_space_map_from_set.restype = c_void_p
isl.isl_space_map_from_set.argtypes = [c_void_p]
isl.isl_space_multi_aff.restype = c_void_p
isl.isl_space_multi_aff.argtypes = [c_void_p, c_void_p]
isl.isl_space_multi_aff_on_domain_multi_val.restype = c_void_p
isl.isl_space_multi_aff_on_domain_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_space_multi_id.restype = c_void_p
isl.isl_space_multi_id.argtypes = [c_void_p, c_void_p]
isl.isl_space_multi_pw_aff.restype = c_void_p
isl.isl_space_multi_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_space_multi_union_pw_aff.restype = c_void_p
isl.isl_space_multi_union_pw_aff.argtypes = [c_void_p, c_void_p]
isl.isl_space_multi_val.restype = c_void_p
isl.isl_space_multi_val.argtypes = [c_void_p, c_void_p]
isl.isl_space_param_aff_on_domain_id.restype = c_void_p
isl.isl_space_param_aff_on_domain_id.argtypes = [c_void_p, c_void_p]
isl.isl_space_params.restype = c_void_p
isl.isl_space_params.argtypes = [c_void_p]
isl.isl_space_product.restype = c_void_p
isl.isl_space_product.argtypes = [c_void_p, c_void_p]
isl.isl_space_range.restype = c_void_p
isl.isl_space_range.argtypes = [c_void_p]
isl.isl_space_range_map_multi_aff.restype = c_void_p
isl.isl_space_range_map_multi_aff.argtypes = [c_void_p]
isl.isl_space_range_map_pw_multi_aff.restype = c_void_p
isl.isl_space_range_map_pw_multi_aff.argtypes = [c_void_p]
isl.isl_space_range_reverse.restype = c_void_p
isl.isl_space_range_reverse.argtypes = [c_void_p]
isl.isl_space_get_range_tuple_id.restype = c_void_p
isl.isl_space_get_range_tuple_id.argtypes = [c_void_p]
isl.isl_space_reverse.restype = c_void_p
isl.isl_space_reverse.argtypes = [c_void_p]
isl.isl_space_set_domain_tuple_id.restype = c_void_p
isl.isl_space_set_domain_tuple_id.argtypes = [c_void_p, c_void_p]
isl.isl_space_set_range_tuple_id.restype = c_void_p
isl.isl_space_set_range_tuple_id.argtypes = [c_void_p, c_void_p]
isl.isl_space_uncurry.restype = c_void_p
isl.isl_space_uncurry.argtypes = [c_void_p]
isl.isl_space_unit.restype = c_void_p
isl.isl_space_unit.argtypes = [Context]
isl.isl_space_universe_map.restype = c_void_p
isl.isl_space_universe_map.argtypes = [c_void_p]
isl.isl_space_universe_set.restype = c_void_p
isl.isl_space_universe_set.argtypes = [c_void_p]
isl.isl_space_unwrap.restype = c_void_p
isl.isl_space_unwrap.argtypes = [c_void_p]
isl.isl_space_wrap.restype = c_void_p
isl.isl_space_wrap.argtypes = [c_void_p]
isl.isl_space_zero_aff_on_domain.restype = c_void_p
isl.isl_space_zero_aff_on_domain.argtypes = [c_void_p]
isl.isl_space_zero_multi_aff.restype = c_void_p
isl.isl_space_zero_multi_aff.argtypes = [c_void_p]
isl.isl_space_zero_multi_pw_aff.restype = c_void_p
isl.isl_space_zero_multi_pw_aff.argtypes = [c_void_p]
isl.isl_space_zero_multi_union_pw_aff.restype = c_void_p
isl.isl_space_zero_multi_union_pw_aff.argtypes = [c_void_p]
isl.isl_space_zero_multi_val.restype = c_void_p
isl.isl_space_zero_multi_val.argtypes = [c_void_p]
isl.isl_space_copy.restype = c_void_p
isl.isl_space_copy.argtypes = [c_void_p]
isl.isl_space_free.restype = c_void_p
isl.isl_space_free.argtypes = [c_void_p]
isl.isl_space_to_str.restype = POINTER(c_char)
isl.isl_space_to_str.argtypes = [c_void_p]
class union_access_info(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and args[0].__class__ is union_map:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_access_info_from_sink(isl.isl_union_map_copy(args[0].ptr))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_union_access_info_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is union_access_info:
arg0 = union_access_info(arg0)
except:
raise
ptr = isl.isl_union_access_info_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.union_access_info("""%s""")' % s
else:
return 'isl.union_access_info("%s")' % s
def compute_flow(arg0):
try:
if not arg0.__class__ is union_access_info:
arg0 = union_access_info(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_access_info_compute_flow(isl.isl_union_access_info_copy(arg0.ptr))
obj = union_flow(ctx=ctx, ptr=res)
return obj
def set_kill(arg0, arg1):
try:
if not arg0.__class__ is union_access_info:
arg0 = union_access_info(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_access_info_set_kill(isl.isl_union_access_info_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_access_info(ctx=ctx, ptr=res)
return obj
def set_may_source(arg0, arg1):
try:
if not arg0.__class__ is union_access_info:
arg0 = union_access_info(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_access_info_set_may_source(isl.isl_union_access_info_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_access_info(ctx=ctx, ptr=res)
return obj
def set_must_source(arg0, arg1):
try:
if not arg0.__class__ is union_access_info:
arg0 = union_access_info(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_access_info_set_must_source(isl.isl_union_access_info_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_access_info(ctx=ctx, ptr=res)
return obj
def set_schedule(arg0, arg1):
try:
if not arg0.__class__ is union_access_info:
arg0 = union_access_info(arg0)
except:
raise
try:
if not arg1.__class__ is schedule:
arg1 = schedule(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_access_info_set_schedule(isl.isl_union_access_info_copy(arg0.ptr), isl.isl_schedule_copy(arg1.ptr))
obj = union_access_info(ctx=ctx, ptr=res)
return obj
def set_schedule_map(arg0, arg1):
try:
if not arg0.__class__ is union_access_info:
arg0 = union_access_info(arg0)
except:
raise
try:
if not arg1.__class__ is union_map:
arg1 = union_map(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_access_info_set_schedule_map(isl.isl_union_access_info_copy(arg0.ptr), isl.isl_union_map_copy(arg1.ptr))
obj = union_access_info(ctx=ctx, ptr=res)
return obj
isl.isl_union_access_info_from_sink.restype = c_void_p
isl.isl_union_access_info_from_sink.argtypes = [c_void_p]
isl.isl_union_access_info_compute_flow.restype = c_void_p
isl.isl_union_access_info_compute_flow.argtypes = [c_void_p]
isl.isl_union_access_info_set_kill.restype = c_void_p
isl.isl_union_access_info_set_kill.argtypes = [c_void_p, c_void_p]
isl.isl_union_access_info_set_may_source.restype = c_void_p
isl.isl_union_access_info_set_may_source.argtypes = [c_void_p, c_void_p]
isl.isl_union_access_info_set_must_source.restype = c_void_p
isl.isl_union_access_info_set_must_source.argtypes = [c_void_p, c_void_p]
isl.isl_union_access_info_set_schedule.restype = c_void_p
isl.isl_union_access_info_set_schedule.argtypes = [c_void_p, c_void_p]
isl.isl_union_access_info_set_schedule_map.restype = c_void_p
isl.isl_union_access_info_set_schedule_map.argtypes = [c_void_p, c_void_p]
isl.isl_union_access_info_copy.restype = c_void_p
isl.isl_union_access_info_copy.argtypes = [c_void_p]
isl.isl_union_access_info_free.restype = c_void_p
isl.isl_union_access_info_free.argtypes = [c_void_p]
isl.isl_union_access_info_to_str.restype = POINTER(c_char)
isl.isl_union_access_info_to_str.argtypes = [c_void_p]
class union_flow(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_union_flow_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is union_flow:
arg0 = union_flow(arg0)
except:
raise
ptr = isl.isl_union_flow_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.union_flow("""%s""")' % s
else:
return 'isl.union_flow("%s")' % s
def full_may_dependence(arg0):
try:
if not arg0.__class__ is union_flow:
arg0 = union_flow(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_flow_get_full_may_dependence(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_full_may_dependence(arg0):
return arg0.full_may_dependence()
def full_must_dependence(arg0):
try:
if not arg0.__class__ is union_flow:
arg0 = union_flow(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_flow_get_full_must_dependence(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_full_must_dependence(arg0):
return arg0.full_must_dependence()
def may_dependence(arg0):
try:
if not arg0.__class__ is union_flow:
arg0 = union_flow(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_flow_get_may_dependence(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_may_dependence(arg0):
return arg0.may_dependence()
def may_no_source(arg0):
try:
if not arg0.__class__ is union_flow:
arg0 = union_flow(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_flow_get_may_no_source(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_may_no_source(arg0):
return arg0.may_no_source()
def must_dependence(arg0):
try:
if not arg0.__class__ is union_flow:
arg0 = union_flow(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_flow_get_must_dependence(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_must_dependence(arg0):
return arg0.must_dependence()
def must_no_source(arg0):
try:
if not arg0.__class__ is union_flow:
arg0 = union_flow(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_flow_get_must_no_source(arg0.ptr)
obj = union_map(ctx=ctx, ptr=res)
return obj
def get_must_no_source(arg0):
return arg0.must_no_source()
isl.isl_union_flow_get_full_may_dependence.restype = c_void_p
isl.isl_union_flow_get_full_may_dependence.argtypes = [c_void_p]
isl.isl_union_flow_get_full_must_dependence.restype = c_void_p
isl.isl_union_flow_get_full_must_dependence.argtypes = [c_void_p]
isl.isl_union_flow_get_may_dependence.restype = c_void_p
isl.isl_union_flow_get_may_dependence.argtypes = [c_void_p]
isl.isl_union_flow_get_may_no_source.restype = c_void_p
isl.isl_union_flow_get_may_no_source.argtypes = [c_void_p]
isl.isl_union_flow_get_must_dependence.restype = c_void_p
isl.isl_union_flow_get_must_dependence.argtypes = [c_void_p]
isl.isl_union_flow_get_must_no_source.restype = c_void_p
isl.isl_union_flow_get_must_no_source.argtypes = [c_void_p]
isl.isl_union_flow_copy.restype = c_void_p
isl.isl_union_flow_copy.argtypes = [c_void_p]
isl.isl_union_flow_free.restype = c_void_p
isl.isl_union_flow_free.argtypes = [c_void_p]
isl.isl_union_flow_to_str.restype = POINTER(c_char)
isl.isl_union_flow_to_str.argtypes = [c_void_p]
class union_pw_aff_list(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_pw_aff_list_alloc(self.ctx, args[0])
return
if len(args) == 1 and args[0].__class__ is union_pw_aff:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_pw_aff_list_from_union_pw_aff(isl.isl_union_pw_aff_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_pw_aff_list_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_union_pw_aff_list_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is union_pw_aff_list:
arg0 = union_pw_aff_list(arg0)
except:
raise
ptr = isl.isl_union_pw_aff_list_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.union_pw_aff_list("""%s""")' % s
else:
return 'isl.union_pw_aff_list("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff_list:
arg0 = union_pw_aff_list(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_aff:
arg1 = union_pw_aff(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_list_add(isl.isl_union_pw_aff_list_copy(arg0.ptr), isl.isl_union_pw_aff_copy(arg1.ptr))
obj = union_pw_aff_list(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff_list:
arg0 = union_pw_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_list_get_at(arg0.ptr, arg1)
obj = union_pw_aff(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def clear(arg0):
try:
if not arg0.__class__ is union_pw_aff_list:
arg0 = union_pw_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_list_clear(isl.isl_union_pw_aff_list_copy(arg0.ptr))
obj = union_pw_aff_list(ctx=ctx, ptr=res)
return obj
def concat(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff_list:
arg0 = union_pw_aff_list(arg0)
except:
raise
try:
if not arg1.__class__ is union_pw_aff_list:
arg1 = union_pw_aff_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_list_concat(isl.isl_union_pw_aff_list_copy(arg0.ptr), isl.isl_union_pw_aff_list_copy(arg1.ptr))
obj = union_pw_aff_list(ctx=ctx, ptr=res)
return obj
def drop(arg0, arg1, arg2):
try:
if not arg0.__class__ is union_pw_aff_list:
arg0 = union_pw_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_list_drop(isl.isl_union_pw_aff_list_copy(arg0.ptr), arg1, arg2)
obj = union_pw_aff_list(ctx=ctx, ptr=res)
return obj
def foreach(arg0, arg1):
try:
if not arg0.__class__ is union_pw_aff_list:
arg0 = union_pw_aff_list(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = union_pw_aff(ctx=arg0.ctx, ptr=cb_arg0)
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_union_pw_aff_list_foreach(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
raise Error
def insert(arg0, arg1, arg2):
try:
if not arg0.__class__ is union_pw_aff_list:
arg0 = union_pw_aff_list(arg0)
except:
raise
try:
if not arg2.__class__ is union_pw_aff:
arg2 = union_pw_aff(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_list_insert(isl.isl_union_pw_aff_list_copy(arg0.ptr), arg1, isl.isl_union_pw_aff_copy(arg2.ptr))
obj = union_pw_aff_list(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is union_pw_aff_list:
arg0 = union_pw_aff_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_pw_aff_list_size(arg0.ptr)
if res < 0:
raise Error
return int(res)
isl.isl_union_pw_aff_list_alloc.restype = c_void_p
isl.isl_union_pw_aff_list_alloc.argtypes = [Context, c_int]
isl.isl_union_pw_aff_list_from_union_pw_aff.restype = c_void_p
isl.isl_union_pw_aff_list_from_union_pw_aff.argtypes = [c_void_p]
isl.isl_union_pw_aff_list_read_from_str.restype = c_void_p
isl.isl_union_pw_aff_list_read_from_str.argtypes = [Context, c_char_p]
isl.isl_union_pw_aff_list_add.restype = c_void_p
isl.isl_union_pw_aff_list_add.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_list_get_at.restype = c_void_p
isl.isl_union_pw_aff_list_get_at.argtypes = [c_void_p, c_int]
isl.isl_union_pw_aff_list_clear.restype = c_void_p
isl.isl_union_pw_aff_list_clear.argtypes = [c_void_p]
isl.isl_union_pw_aff_list_concat.restype = c_void_p
isl.isl_union_pw_aff_list_concat.argtypes = [c_void_p, c_void_p]
isl.isl_union_pw_aff_list_drop.restype = c_void_p
isl.isl_union_pw_aff_list_drop.argtypes = [c_void_p, c_int, c_int]
isl.isl_union_pw_aff_list_foreach.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_union_pw_aff_list_insert.restype = c_void_p
isl.isl_union_pw_aff_list_insert.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_union_pw_aff_list_size.argtypes = [c_void_p]
isl.isl_union_pw_aff_list_copy.restype = c_void_p
isl.isl_union_pw_aff_list_copy.argtypes = [c_void_p]
isl.isl_union_pw_aff_list_free.restype = c_void_p
isl.isl_union_pw_aff_list_free.argtypes = [c_void_p]
isl.isl_union_pw_aff_list_to_str.restype = POINTER(c_char)
isl.isl_union_pw_aff_list_to_str.argtypes = [c_void_p]
class union_set_list(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_set_list_alloc(self.ctx, args[0])
return
if len(args) == 1 and args[0].__class__ is union_set:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_set_list_from_union_set(isl.isl_union_set_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_union_set_list_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_union_set_list_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is union_set_list:
arg0 = union_set_list(arg0)
except:
raise
ptr = isl.isl_union_set_list_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.union_set_list("""%s""")' % s
else:
return 'isl.union_set_list("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is union_set_list:
arg0 = union_set_list(arg0)
except:
raise
try:
if not arg1.__class__ is union_set:
arg1 = union_set(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_set_list_add(isl.isl_union_set_list_copy(arg0.ptr), isl.isl_union_set_copy(arg1.ptr))
obj = union_set_list(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is union_set_list:
arg0 = union_set_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_set_list_get_at(arg0.ptr, arg1)
obj = union_set(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def clear(arg0):
try:
if not arg0.__class__ is union_set_list:
arg0 = union_set_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_set_list_clear(isl.isl_union_set_list_copy(arg0.ptr))
obj = union_set_list(ctx=ctx, ptr=res)
return obj
def concat(arg0, arg1):
try:
if not arg0.__class__ is union_set_list:
arg0 = union_set_list(arg0)
except:
raise
try:
if not arg1.__class__ is union_set_list:
arg1 = union_set_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_set_list_concat(isl.isl_union_set_list_copy(arg0.ptr), isl.isl_union_set_list_copy(arg1.ptr))
obj = union_set_list(ctx=ctx, ptr=res)
return obj
def drop(arg0, arg1, arg2):
try:
if not arg0.__class__ is union_set_list:
arg0 = union_set_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_set_list_drop(isl.isl_union_set_list_copy(arg0.ptr), arg1, arg2)
obj = union_set_list(ctx=ctx, ptr=res)
return obj
def foreach(arg0, arg1):
try:
if not arg0.__class__ is union_set_list:
arg0 = union_set_list(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = union_set(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_union_set_list_foreach(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
            raise Error
def insert(arg0, arg1, arg2):
try:
if not arg0.__class__ is union_set_list:
arg0 = union_set_list(arg0)
except:
raise
try:
if not arg2.__class__ is union_set:
arg2 = union_set(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_set_list_insert(isl.isl_union_set_list_copy(arg0.ptr), arg1, isl.isl_union_set_copy(arg2.ptr))
obj = union_set_list(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is union_set_list:
arg0 = union_set_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_union_set_list_size(arg0.ptr)
if res < 0:
            raise Error
return int(res)
isl.isl_union_set_list_alloc.restype = c_void_p
isl.isl_union_set_list_alloc.argtypes = [Context, c_int]
isl.isl_union_set_list_from_union_set.restype = c_void_p
isl.isl_union_set_list_from_union_set.argtypes = [c_void_p]
isl.isl_union_set_list_read_from_str.restype = c_void_p
isl.isl_union_set_list_read_from_str.argtypes = [Context, c_char_p]
isl.isl_union_set_list_add.restype = c_void_p
isl.isl_union_set_list_add.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_list_get_at.restype = c_void_p
isl.isl_union_set_list_get_at.argtypes = [c_void_p, c_int]
isl.isl_union_set_list_clear.restype = c_void_p
isl.isl_union_set_list_clear.argtypes = [c_void_p]
isl.isl_union_set_list_concat.restype = c_void_p
isl.isl_union_set_list_concat.argtypes = [c_void_p, c_void_p]
isl.isl_union_set_list_drop.restype = c_void_p
isl.isl_union_set_list_drop.argtypes = [c_void_p, c_int, c_int]
isl.isl_union_set_list_foreach.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_union_set_list_insert.restype = c_void_p
isl.isl_union_set_list_insert.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_union_set_list_size.argtypes = [c_void_p]
isl.isl_union_set_list_copy.restype = c_void_p
isl.isl_union_set_list_copy.argtypes = [c_void_p]
isl.isl_union_set_list_free.restype = c_void_p
isl.isl_union_set_list_free.argtypes = [c_void_p]
isl.isl_union_set_list_to_str.restype = POINTER(c_char)
isl.isl_union_set_list_to_str.argtypes = [c_void_p]
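# Illustrative usage of the union_set_list wrapper above (comments only --
# exercising it requires libisl to be loadable; the literals are hypothetical):
#
#     import isl
#     lst = isl.union_set_list(isl.union_set("{ A[0] }"))
#     lst = lst.add(isl.union_set("{ B[1] }"))
#     lst.foreach(lambda uset: print(uset))   # visits each element in order
#     assert lst.size() == 2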
class val(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_val_int_from_si(self.ctx, args[0])
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_val_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_val_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ptr = isl.isl_val_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.val("""%s""")' % s
else:
return 'isl.val("%s")' % s
def abs(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_abs(isl.isl_val_copy(arg0.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def abs_eq(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_abs_eq(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def add(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_add(isl.isl_val_copy(arg0.ptr), isl.isl_val_copy(arg1.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def ceil(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_ceil(isl.isl_val_copy(arg0.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def cmp_si(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_cmp_si(arg0.ptr, arg1)
return res
def den_si(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_get_den_si(arg0.ptr)
return res
def get_den_si(arg0):
return arg0.den_si()
def div(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_div(isl.isl_val_copy(arg0.ptr), isl.isl_val_copy(arg1.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def eq(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_eq(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def floor(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_floor(isl.isl_val_copy(arg0.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def gcd(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_gcd(isl.isl_val_copy(arg0.ptr), isl.isl_val_copy(arg1.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def ge(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_ge(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def gt(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_gt(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
@staticmethod
def infty():
ctx = Context.getDefaultInstance()
res = isl.isl_val_infty(ctx)
obj = val(ctx=ctx, ptr=res)
return obj
def inv(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_inv(isl.isl_val_copy(arg0.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def is_divisible_by(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_divisible_by(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def is_infty(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_infty(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_int(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_int(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_nan(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_nan(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_neg(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_neg(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_neginfty(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_neginfty(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_negone(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_negone(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_nonneg(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_nonneg(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_nonpos(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_nonpos(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_one(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_one(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_pos(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_pos(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_rat(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_rat(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def is_zero(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_is_zero(arg0.ptr)
if res < 0:
            raise Error
return bool(res)
def le(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_le(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def lt(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_lt(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def max(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_max(isl.isl_val_copy(arg0.ptr), isl.isl_val_copy(arg1.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def min(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_min(isl.isl_val_copy(arg0.ptr), isl.isl_val_copy(arg1.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def mod(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_mod(isl.isl_val_copy(arg0.ptr), isl.isl_val_copy(arg1.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def mul(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_mul(isl.isl_val_copy(arg0.ptr), isl.isl_val_copy(arg1.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
@staticmethod
def nan():
ctx = Context.getDefaultInstance()
res = isl.isl_val_nan(ctx)
obj = val(ctx=ctx, ptr=res)
return obj
def ne(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_ne(arg0.ptr, arg1.ptr)
if res < 0:
            raise Error
return bool(res)
def neg(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_neg(isl.isl_val_copy(arg0.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
@staticmethod
def neginfty():
ctx = Context.getDefaultInstance()
res = isl.isl_val_neginfty(ctx)
obj = val(ctx=ctx, ptr=res)
return obj
@staticmethod
def negone():
ctx = Context.getDefaultInstance()
res = isl.isl_val_negone(ctx)
obj = val(ctx=ctx, ptr=res)
return obj
def num_si(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_get_num_si(arg0.ptr)
return res
def get_num_si(arg0):
return arg0.num_si()
@staticmethod
def one():
ctx = Context.getDefaultInstance()
res = isl.isl_val_one(ctx)
obj = val(ctx=ctx, ptr=res)
return obj
def pow2(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_pow2(isl.isl_val_copy(arg0.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def sgn(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_sgn(arg0.ptr)
return res
def sub(arg0, arg1):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_sub(isl.isl_val_copy(arg0.ptr), isl.isl_val_copy(arg1.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
def to_list(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_to_list(isl.isl_val_copy(arg0.ptr))
obj = val_list(ctx=ctx, ptr=res)
return obj
def trunc(arg0):
try:
if not arg0.__class__ is val:
arg0 = val(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_trunc(isl.isl_val_copy(arg0.ptr))
obj = val(ctx=ctx, ptr=res)
return obj
@staticmethod
def zero():
ctx = Context.getDefaultInstance()
res = isl.isl_val_zero(ctx)
obj = val(ctx=ctx, ptr=res)
return obj
isl.isl_val_int_from_si.restype = c_void_p
isl.isl_val_int_from_si.argtypes = [Context, c_long]
isl.isl_val_read_from_str.restype = c_void_p
isl.isl_val_read_from_str.argtypes = [Context, c_char_p]
isl.isl_val_abs.restype = c_void_p
isl.isl_val_abs.argtypes = [c_void_p]
isl.isl_val_abs_eq.argtypes = [c_void_p, c_void_p]
isl.isl_val_add.restype = c_void_p
isl.isl_val_add.argtypes = [c_void_p, c_void_p]
isl.isl_val_ceil.restype = c_void_p
isl.isl_val_ceil.argtypes = [c_void_p]
isl.isl_val_cmp_si.argtypes = [c_void_p, c_long]
isl.isl_val_get_den_si.argtypes = [c_void_p]
isl.isl_val_div.restype = c_void_p
isl.isl_val_div.argtypes = [c_void_p, c_void_p]
isl.isl_val_eq.argtypes = [c_void_p, c_void_p]
isl.isl_val_floor.restype = c_void_p
isl.isl_val_floor.argtypes = [c_void_p]
isl.isl_val_gcd.restype = c_void_p
isl.isl_val_gcd.argtypes = [c_void_p, c_void_p]
isl.isl_val_ge.argtypes = [c_void_p, c_void_p]
isl.isl_val_gt.argtypes = [c_void_p, c_void_p]
isl.isl_val_infty.restype = c_void_p
isl.isl_val_infty.argtypes = [Context]
isl.isl_val_inv.restype = c_void_p
isl.isl_val_inv.argtypes = [c_void_p]
isl.isl_val_is_divisible_by.argtypes = [c_void_p, c_void_p]
isl.isl_val_is_infty.argtypes = [c_void_p]
isl.isl_val_is_int.argtypes = [c_void_p]
isl.isl_val_is_nan.argtypes = [c_void_p]
isl.isl_val_is_neg.argtypes = [c_void_p]
isl.isl_val_is_neginfty.argtypes = [c_void_p]
isl.isl_val_is_negone.argtypes = [c_void_p]
isl.isl_val_is_nonneg.argtypes = [c_void_p]
isl.isl_val_is_nonpos.argtypes = [c_void_p]
isl.isl_val_is_one.argtypes = [c_void_p]
isl.isl_val_is_pos.argtypes = [c_void_p]
isl.isl_val_is_rat.argtypes = [c_void_p]
isl.isl_val_is_zero.argtypes = [c_void_p]
isl.isl_val_le.argtypes = [c_void_p, c_void_p]
isl.isl_val_lt.argtypes = [c_void_p, c_void_p]
isl.isl_val_max.restype = c_void_p
isl.isl_val_max.argtypes = [c_void_p, c_void_p]
isl.isl_val_min.restype = c_void_p
isl.isl_val_min.argtypes = [c_void_p, c_void_p]
isl.isl_val_mod.restype = c_void_p
isl.isl_val_mod.argtypes = [c_void_p, c_void_p]
isl.isl_val_mul.restype = c_void_p
isl.isl_val_mul.argtypes = [c_void_p, c_void_p]
isl.isl_val_nan.restype = c_void_p
isl.isl_val_nan.argtypes = [Context]
isl.isl_val_ne.argtypes = [c_void_p, c_void_p]
isl.isl_val_neg.restype = c_void_p
isl.isl_val_neg.argtypes = [c_void_p]
isl.isl_val_neginfty.restype = c_void_p
isl.isl_val_neginfty.argtypes = [Context]
isl.isl_val_negone.restype = c_void_p
isl.isl_val_negone.argtypes = [Context]
isl.isl_val_get_num_si.argtypes = [c_void_p]
isl.isl_val_one.restype = c_void_p
isl.isl_val_one.argtypes = [Context]
isl.isl_val_pow2.restype = c_void_p
isl.isl_val_pow2.argtypes = [c_void_p]
isl.isl_val_sgn.argtypes = [c_void_p]
isl.isl_val_sub.restype = c_void_p
isl.isl_val_sub.argtypes = [c_void_p, c_void_p]
isl.isl_val_to_list.restype = c_void_p
isl.isl_val_to_list.argtypes = [c_void_p]
isl.isl_val_trunc.restype = c_void_p
isl.isl_val_trunc.argtypes = [c_void_p]
isl.isl_val_zero.restype = c_void_p
isl.isl_val_zero.argtypes = [Context]
isl.isl_val_copy.restype = c_void_p
isl.isl_val_copy.argtypes = [c_void_p]
isl.isl_val_free.restype = c_void_p
isl.isl_val_free.argtypes = [c_void_p]
isl.isl_val_to_str.restype = POINTER(c_char)
isl.isl_val_to_str.argtypes = [c_void_p]
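# Illustrative usage of the val wrapper above (comments only -- running it
# requires libisl; val provides exact rational arithmetic):
#
#     import isl
#     v = isl.val("3/2")        # parsed via isl_val_read_from_str
#     w = v.add(isl.val(1))     # 3/2 + 1 = 5/2, via isl_val_add
#     print(w, w.is_int())      # 5/2 is not an integer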
class val_list(object):
def __init__(self, *args, **keywords):
if "ptr" in keywords:
self.ctx = keywords["ctx"]
self.ptr = keywords["ptr"]
return
if len(args) == 1 and type(args[0]) == int:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_val_list_alloc(self.ctx, args[0])
return
if len(args) == 1 and (args[0].__class__ is val or type(args[0]) == int):
args = list(args)
try:
if not args[0].__class__ is val:
args[0] = val(args[0])
except:
raise
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_val_list_from_val(isl.isl_val_copy(args[0].ptr))
return
if len(args) == 1 and type(args[0]) == str:
self.ctx = Context.getDefaultInstance()
self.ptr = isl.isl_val_list_read_from_str(self.ctx, args[0].encode('ascii'))
return
raise Error
def __del__(self):
if hasattr(self, 'ptr'):
isl.isl_val_list_free(self.ptr)
def __str__(arg0):
try:
if not arg0.__class__ is val_list:
arg0 = val_list(arg0)
except:
raise
ptr = isl.isl_val_list_to_str(arg0.ptr)
res = cast(ptr, c_char_p).value.decode('ascii')
libc.free(ptr)
return res
def __repr__(self):
s = str(self)
if '"' in s:
return 'isl.val_list("""%s""")' % s
else:
return 'isl.val_list("%s")' % s
def add(arg0, arg1):
try:
if not arg0.__class__ is val_list:
arg0 = val_list(arg0)
except:
raise
try:
if not arg1.__class__ is val:
arg1 = val(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_list_add(isl.isl_val_list_copy(arg0.ptr), isl.isl_val_copy(arg1.ptr))
obj = val_list(ctx=ctx, ptr=res)
return obj
def at(arg0, arg1):
try:
if not arg0.__class__ is val_list:
arg0 = val_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_list_get_at(arg0.ptr, arg1)
obj = val(ctx=ctx, ptr=res)
return obj
def get_at(arg0, arg1):
return arg0.at(arg1)
def clear(arg0):
try:
if not arg0.__class__ is val_list:
arg0 = val_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_list_clear(isl.isl_val_list_copy(arg0.ptr))
obj = val_list(ctx=ctx, ptr=res)
return obj
def concat(arg0, arg1):
try:
if not arg0.__class__ is val_list:
arg0 = val_list(arg0)
except:
raise
try:
if not arg1.__class__ is val_list:
arg1 = val_list(arg1)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_list_concat(isl.isl_val_list_copy(arg0.ptr), isl.isl_val_list_copy(arg1.ptr))
obj = val_list(ctx=ctx, ptr=res)
return obj
def drop(arg0, arg1, arg2):
try:
if not arg0.__class__ is val_list:
arg0 = val_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_list_drop(isl.isl_val_list_copy(arg0.ptr), arg1, arg2)
obj = val_list(ctx=ctx, ptr=res)
return obj
def foreach(arg0, arg1):
try:
if not arg0.__class__ is val_list:
arg0 = val_list(arg0)
except:
raise
exc_info = [None]
fn = CFUNCTYPE(c_int, c_void_p, c_void_p)
def cb_func(cb_arg0, cb_arg1):
cb_arg0 = val(ctx=arg0.ctx, ptr=(cb_arg0))
try:
arg1(cb_arg0)
except BaseException as e:
exc_info[0] = e
return -1
return 0
cb = fn(cb_func)
ctx = arg0.ctx
res = isl.isl_val_list_foreach(arg0.ptr, cb, None)
if exc_info[0] is not None:
raise exc_info[0]
if res < 0:
            raise Error
def insert(arg0, arg1, arg2):
try:
if not arg0.__class__ is val_list:
arg0 = val_list(arg0)
except:
raise
try:
if not arg2.__class__ is val:
arg2 = val(arg2)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_list_insert(isl.isl_val_list_copy(arg0.ptr), arg1, isl.isl_val_copy(arg2.ptr))
obj = val_list(ctx=ctx, ptr=res)
return obj
def size(arg0):
try:
if not arg0.__class__ is val_list:
arg0 = val_list(arg0)
except:
raise
ctx = arg0.ctx
res = isl.isl_val_list_size(arg0.ptr)
if res < 0:
            raise Error
return int(res)
isl.isl_val_list_alloc.restype = c_void_p
isl.isl_val_list_alloc.argtypes = [Context, c_int]
isl.isl_val_list_from_val.restype = c_void_p
isl.isl_val_list_from_val.argtypes = [c_void_p]
isl.isl_val_list_read_from_str.restype = c_void_p
isl.isl_val_list_read_from_str.argtypes = [Context, c_char_p]
isl.isl_val_list_add.restype = c_void_p
isl.isl_val_list_add.argtypes = [c_void_p, c_void_p]
isl.isl_val_list_get_at.restype = c_void_p
isl.isl_val_list_get_at.argtypes = [c_void_p, c_int]
isl.isl_val_list_clear.restype = c_void_p
isl.isl_val_list_clear.argtypes = [c_void_p]
isl.isl_val_list_concat.restype = c_void_p
isl.isl_val_list_concat.argtypes = [c_void_p, c_void_p]
isl.isl_val_list_drop.restype = c_void_p
isl.isl_val_list_drop.argtypes = [c_void_p, c_int, c_int]
isl.isl_val_list_foreach.argtypes = [c_void_p, c_void_p, c_void_p]
isl.isl_val_list_insert.restype = c_void_p
isl.isl_val_list_insert.argtypes = [c_void_p, c_int, c_void_p]
isl.isl_val_list_size.argtypes = [c_void_p]
isl.isl_val_list_copy.restype = c_void_p
isl.isl_val_list_copy.argtypes = [c_void_p]
isl.isl_val_list_free.restype = c_void_p
isl.isl_val_list_free.argtypes = [c_void_p]
isl.isl_val_list_to_str.restype = POINTER(c_char)
isl.isl_val_list_to_str.argtypes = [c_void_p]
# JPS_DISPERSION/python/caresjpsadmsinputs/adms_apl_builder_test.py

import unittest
from adms_apl_builder import *
from adms_apl_test import AdmsAplTestHelper as helper
from config import Constants
class AplDirectorTest(unittest.TestCase):
def test_init(self):
ad = AplDirector()
self.assertIsNone(ad._AplDirector__builder)
def test_set_builder(self):
ad = AplDirector()
ab = AplBuilder({})
ad.set_builder(ab)
self.assertIsInstance(ad._AplDirector__builder, AplBuilder)
def test_get_apl(self):
ad = AplDirector()
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
ad.set_builder(ab)
apl = ad.get_apl()
# Values for pollutants tested in builder tests. Repeating tests is not necessary here.
apl.set_pollutants([])
self.assertEqual(apl.specification(), helper.get_default_apl_builder_specification(helper))
asb = AdmsAplShipBuilder(helper.get_default_apl_builder_data(helper))
ad.set_builder(asb)
apl = ad.get_apl()
# Values for pollutants tested in builder tests. Repeating tests is not necessary here.
apl.set_pollutants([])
self.assertEqual(apl.specification(), helper.get_default_apl_ship_builder_specification(helper))
apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
ad.set_builder(apb)
apl = ad.get_apl()
# Values for pollutants tested in builder tests. Repeating tests is not necessary here.
apl.set_pollutants([])
self.maxDiff = None
self.assertEqual(apl.specification(), helper.get_default_apl_plant_builder_specification(helper))
class AplBuilderTest(unittest.TestCase):
def test_init(self):
ab = AplBuilder({})
self.assertEqual(ab.data, {})
self.assertEqual(ab.pollutant_names, helper.get_default_apl_pollutant_names())
def test_get_header(self):
self.assertEqual(AplBuilder.get_header().to_string(), AdmsHeader().to_string())
def test_get_sup(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_sup().to_string(), AdmsSup.to_string(AdmsSup()))
def test_get_met(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_met().to_string(), AdmsMet().to_string())
def test_get_bld(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_bld()._name, "&ADMS_PARAMETERS_BLD")
self.assertEqual(ab.get_bld().BldName, helper.get_default_apl_bld_data().BldName)
self.assertEqual(ab.get_bld().BldNumBuildings, helper.get_default_apl_bld_data().BldNumBuildings)
self.assertEqual(ab.get_bld().BldType, helper.get_default_apl_bld_data().BldType)
self.assertEqual(ab.get_bld().BldX, helper.get_default_apl_bld_data().BldX)
self.assertEqual(ab.get_bld().BldY, helper.get_default_apl_bld_data().BldY)
self.assertEqual(ab.get_bld().BldHeight, helper.get_default_apl_bld_data().BldHeight)
self.assertEqual(ab.get_bld().BldLength, helper.get_default_apl_bld_data().BldLength)
self.assertEqual(ab.get_bld().BldWidth, helper.get_default_apl_bld_data().BldWidth)
self.assertEqual(ab.get_bld().BldAngle, helper.get_default_apl_bld_data().BldAngle)
def test_get_hil(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_hil().to_string(), AdmsHil().to_string())
def test_get_cst(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_cst().to_string(), AdmsCst().to_string())
def test_get_flc(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_flc().to_string(), AdmsFlc().to_string())
def test_get_grd(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_grd()._name, "&ADMS_PARAMETERS_GRD")
self.assertEqual(ab.get_grd().GrdRegularMin[0], 0)
self.assertEqual(ab.get_grd().GrdRegularMin[1], 1)
self.assertEqual(ab.get_grd().GrdRegularMax[0], 2)
self.assertEqual(ab.get_grd().GrdRegularMax[1], 3)
self.assertEqual(ab.get_grd().GrdRegularNumPoints[0], 4)
self.assertEqual(ab.get_grd().GrdRegularNumPoints[1], 5)
def test_get_puf(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_puf().to_string(), AdmsPuf().to_string())
def test_get_gam(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_gam().to_string(), AdmsGam().to_string())
def test_get_opt(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_opt().to_string(), AdmsOpt().to_string())
def test_get_bkg(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_bkg().to_string(), AdmsBkg().to_string())
def test_get_chm(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_chm().to_string(), AdmsChm().to_string())
def test_get_etc(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_etc().to_string(), AdmsEtc().to_string())
def test_get_coordsys(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_coordsys().to_string(), AdmsCoordSys().to_string())
def test_get_mapper(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertEqual(ab.get_mapper().to_string(), AdmsMapper().to_string())
def test_get_pollutants(self):
ab = AplBuilder(helper.get_default_apl_builder_data(helper))
self.assertIsNotNone(ab.get_pollutants())
self.assertEqual(len(ab.get_pollutants()), 18)
self.assertEqual(ab.get_pollutants()[0].PolName, Constants.POL_CO2)
self.assertEqual(ab.get_pollutants()[1].PolName, Constants.POL_NOX)
self.assertEqual(ab.get_pollutants()[2].PolName, Constants.POL_NO2)
self.assertEqual(ab.get_pollutants()[3].PolName, Constants.POL_NO)
self.assertEqual(ab.get_pollutants()[4].PolName, Constants.POL_PART_O3)
self.assertEqual(ab.get_pollutants()[5].PolName, Constants.POL_VOC)
self.assertEqual(ab.get_pollutants()[6].PolName, Constants.POL_PART_SO2)
self.assertEqual(ab.get_pollutants()[7].PolName, Constants.POL_PM10)
self.assertEqual(ab.get_pollutants()[8].PolName, Constants.POL_PM25)
self.assertEqual(ab.get_pollutants()[9].PolName, Constants.POL_CO)
self.assertEqual(ab.get_pollutants()[10].PolName, Constants.POL_BENZENE)
self.assertEqual(ab.get_pollutants()[11].PolName, Constants.POL_BUTADIENE)
self.assertEqual(ab.get_pollutants()[12].PolName, Constants.POL_HCl)
self.assertEqual(ab.get_pollutants()[13].PolName, Constants.POL_Cl2)
self.assertEqual(ab.get_pollutants()[14].PolName, Constants.POL_CH3Cl)
self.assertEqual(ab.get_pollutants()[15].PolName, Constants.POL_ISOBUTYLENE)
self.assertEqual(ab.get_pollutants()[16].PolName, Constants.POL_NH3)
self.assertEqual(ab.get_pollutants()[17].PolName, Constants.POL_HC)
self.assertEqual(ab.get_pollutants()[0].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[1].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[2].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[3].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[4].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[5].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[6].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[7].PolPollutantType, 1)
self.assertEqual(ab.get_pollutants()[8].PolPollutantType, 1)
self.assertEqual(ab.get_pollutants()[9].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[10].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[11].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[12].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[13].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[14].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[15].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[16].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[17].PolPollutantType, 0)
self.assertEqual(ab.get_pollutants()[0].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[1].PolGasDepVelocityKnown, 0)
self.assertEqual(ab.get_pollutants()[2].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[3].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[4].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[5].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[6].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[7].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[8].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[9].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[10].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[11].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[12].PolGasDepVelocityKnown, 1)
self.assertEqual(ab.get_pollutants()[13].PolGasDepVelocityKnown, 0)
self.assertEqual(ab.get_pollutants()[14].PolGasDepVelocityKnown, 0)
self.assertEqual(ab.get_pollutants()[15].PolGasDepVelocityKnown, 0)
self.assertEqual(ab.get_pollutants()[16].PolGasDepVelocityKnown, 0)
self.assertEqual(ab.get_pollutants()[17].PolGasDepVelocityKnown, 0)
        for i, expected in enumerate([0.0e+0, 0.0e+0, 1.5e-3, 1.5e-3, 0.0e+0, 0.0e+0, 1.2e-2,
                                      0.0e+0, 0.0e+0, 0.0e+0, 0.0e+0, 0.0e+0, 0.0e+0, 5.0e+0,
                                      0.0e+0, 0.0e+0, 0.0e+0, 0.0e+0]):
            self.assertEqual(ab.get_pollutants()[i].PolGasDepositionVelocity, expected)
        for i, expected in enumerate([1] * 12 + [0] * 6):
            self.assertEqual(ab.get_pollutants()[i].PolGasType, expected)
        for i, expected in enumerate([1] * 7 + [0] * 2 + [1] * 9):
            self.assertEqual(ab.get_pollutants()[i].PolParDepVelocityKnown, expected)
        for i, expected in enumerate([1] * 7 + [0] * 2 + [1] * 9):
            self.assertEqual(ab.get_pollutants()[i].PolParTermVelocityKnown, expected)
        for i, expected in enumerate([1.0e-6] * 7 + [1.0e-5, 2.5e-6] + [1.0e-6] * 9):
            self.assertEqual(ab.get_pollutants()[i].PolParDiameter, expected)
        for i, expected in enumerate([0, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 1]):
            self.assertEqual(ab.get_pollutants()[i].PolWetWashoutKnown, expected)
        for i, expected in enumerate([0.0e+0] * 13 + [1.0e-4] * 5):
            self.assertEqual(ab.get_pollutants()[i].PolWetWashout, expected)
        expected_washout_a = [1.0e-4] * 18
        expected_washout_a[8] = 3.552e-1
        expected_washout_a[12] = 3.0e-4
        expected_washout_a[16] = 5.0e-3
        for i, expected in enumerate(expected_washout_a):
            self.assertEqual(ab.get_pollutants()[i].PolWetWashoutA, expected)
        expected_washout_b = [6.4e-1] * 18
        expected_washout_b[8] = 5.394e-1
        expected_washout_b[12] = 6.6e-1
        for i, expected in enumerate(expected_washout_b):
            self.assertEqual(ab.get_pollutants()[i].PolWetWashoutB, expected)
        for i, expected in enumerate([5.47e-1, 5.2e-1, 5.2e-1, 8.0e-1, 5.0e-1, 3.1e-1, 3.7e-1,
                                      1.0e+0, 1.0e+0, 8.6e-1, 3.1e-1, 4.5e-1, 6.589e-1, 3.5e-1,
                                      4.922e-1, 4.43e-1, 1.462e+0, 0.802e+0]):
            self.assertEqual(ab.get_pollutants()[i].PolConvFactor, expected)
        for i, expected in enumerate([4.14e+5, 6.0e+1, 4.41e+1, 0.0e+0, 6.899e+1, 0.0e+0, 1.513e+1,
                                      5.63e+1, 8.0e+0, 1.222e+3, 0.0e+0, 0.0e+0, 0.0e+0, 0.0e+0,
                                      6.0e-1, 0.0e+0, 6.0e+0, 0.0e+0]):
            self.assertEqual(ab.get_pollutants()[i].PolBkgLevel, expected)
        for i, expected in enumerate([Constants.UNIT_PPB] * 7 + [Constants.UNIT_UGM3] * 2
                                     + [Constants.UNIT_PPB] * 9):
            self.assertEqual(ab.get_pollutants()[i].PolBkgUnits, expected)
    def test_get_sources(self):
        ab = AplBuilder({Constants.KEY_SRC: [AdmsSrc()]})
        self.assertIsNotNone(ab.get_sources())
        self.assertEqual(len(ab.get_sources()), 1)
        self.assertEqual(ab.get_sources()[0].to_string(), AdmsSrc().to_string())

    def test_get_pol_type(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_type("test"), 0)
        self.assertEqual(ab.get_pol_type(Constants.POL_PM10), 1)
        self.assertEqual(ab.get_pol_type(Constants.POL_PM25), 1)

    def test_get_pol_gas_dep_velocity_known(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_gas_dep_velocity_known("test"), 1)
        type_0 = [Constants.POL_Cl2, Constants.POL_CH3Cl, Constants.POL_ISOBUTYLENE,
                  Constants.POL_NH3, Constants.POL_HC, Constants.POL_NOX]
        for name in type_0:
            self.assertEqual(ab.get_pol_gas_dep_velocity_known(name), 0)
    def test_get_pol_gas_dep_velocity(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_gas_dep_velocity("test"), 0.0e+0)
        self.assertEqual(ab.get_pol_gas_dep_velocity(Constants.POL_NO2), 1.5e-3)
        self.assertEqual(ab.get_pol_gas_dep_velocity(Constants.POL_NO), 1.5e-3)
        self.assertEqual(ab.get_pol_gas_dep_velocity(Constants.POL_PART_SO2), 1.2e-2)
        self.assertEqual(ab.get_pol_gas_dep_velocity(Constants.POL_Cl2), 5.0e+0)

    def test_get_pol_gas_type(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_gas_type("test"), 1)
        type_0 = [Constants.POL_HCl, Constants.POL_Cl2, Constants.POL_CH3Cl,
                  Constants.POL_ISOBUTYLENE, Constants.POL_NH3, Constants.POL_HC]
        for name in type_0:
            self.assertEqual(ab.get_pol_gas_type(name), 0)
    def test_get_pol_par_dep_velocity_known(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_par_dep_velocity_known("test"), 1)
        self.assertEqual(ab.get_pol_par_dep_velocity_known(Constants.POL_PM10), 0)
        self.assertEqual(ab.get_pol_par_dep_velocity_known(Constants.POL_PM25), 0)

    def test_get_pol_par_term_velocity_known(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_par_term_velocity_known("test"), 1)
        self.assertEqual(ab.get_pol_par_term_velocity_known(Constants.POL_PM10), 0)
        self.assertEqual(ab.get_pol_par_term_velocity_known(Constants.POL_PM25), 0)

    def test_get_pol_par_diameter(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_par_diameter("test"), 1.0e-6)
        self.assertEqual(ab.get_pol_par_diameter(Constants.POL_PM10), 1.0e-5)
        self.assertEqual(ab.get_pol_par_diameter(Constants.POL_PM25), 2.5e-6)
    def test_get_pol_wet_washout_known(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_wet_washout_known("test"), 0)
        type_1 = [Constants.POL_NO, Constants.POL_PART_O3, Constants.POL_PART_SO2,
                  Constants.POL_PM10, Constants.POL_BENZENE, Constants.POL_BUTADIENE,
                  Constants.POL_Cl2, Constants.POL_CH3Cl, Constants.POL_ISOBUTYLENE,
                  Constants.POL_HC]
        for name in type_1:
            self.assertEqual(ab.get_pol_wet_washout_known(name), 1)

    def test_get_pol_wet_washout(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_wet_washout("test"), 0.0e+0)
        type_1 = [Constants.POL_Cl2, Constants.POL_CH3Cl, Constants.POL_ISOBUTYLENE,
                  Constants.POL_NH3, Constants.POL_HC]
        for name in type_1:
            self.assertEqual(ab.get_pol_wet_washout(name), 1.0e-4)
    def test_get_pol_wet_washout_a(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_wet_washout_a("test"), 1.0e-4)
        self.assertEqual(ab.get_pol_wet_washout_a(Constants.POL_PM25), 3.552e-1)
        self.assertEqual(ab.get_pol_wet_washout_a(Constants.POL_HCl), 3.0e-4)
        self.assertEqual(ab.get_pol_wet_washout_a(Constants.POL_NH3), 5.0e-3)

    def test_get_pol_wet_washout_b(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_wet_washout_b("test"), 6.4e-1)
        self.assertEqual(ab.get_pol_wet_washout_b(Constants.POL_PM25), 5.394e-1)
        self.assertEqual(ab.get_pol_wet_washout_b(Constants.POL_HCl), 6.6e-1)
    def test_get_pol_conv_factor(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_CO2), 5.47e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_NOX), 5.2e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_NO2), 5.2e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_NO), 8.0e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_PART_O3), 5.0e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_VOC), 3.1e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_PART_SO2), 3.7e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_PM10), 1.0e+0)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_PM25), 1.0e+0)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_CO), 8.6e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_BENZENE), 3.1e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_BUTADIENE), 4.5e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_HCl), 6.589e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_Cl2), 3.5e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_CH3Cl), 4.922e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_ISOBUTYLENE), 4.43e-1)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_NH3), 1.462e+0)
        self.assertEqual(ab.get_pol_conv_factor(Constants.POL_HC), 0.802e+0)
    def test_get_pol_bkg_level(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_CO2), 4.14e+5)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_NOX), 6.0e+1)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_NO2), 4.41e+1)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_NO), 0.0e+0)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_PART_O3), 6.899e+1)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_VOC), 0.0e+0)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_PART_SO2), 1.513e+1)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_PM10), 5.63e+1)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_PM25), 8.0e+0)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_CO), 1.222e+3)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_BENZENE), 0.0e+0)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_BUTADIENE), 0.0e+0)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_HCl), 0.0e+0)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_Cl2), 0.0e+0)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_CH3Cl), 6.0e-1)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_ISOBUTYLENE), 0.0e+0)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_NH3), 6.0e+0)
        self.assertEqual(ab.get_pol_bkg_level(Constants.POL_HC), 0.0e+0)
    def test_get_pol_bkg_units(self):
        ab = AplBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(ab.get_pol_bkg_units("test"), Constants.UNIT_PPB)
        self.assertEqual(ab.get_pol_bkg_units(Constants.POL_PM10), Constants.UNIT_UGM3)
        self.assertEqual(ab.get_pol_bkg_units(Constants.POL_PM25), Constants.UNIT_UGM3)
class AplShipBuilderTest(unittest.TestCase):
    def test_get_sup(self):
        asb = AdmsAplShipBuilder(helper.get_default_apl_builder_data(helper))
        tst_sup = AdmsSup()
        tst_sup.SupModelComplexTerrain = 0
        tst_sup.SupCalcChm = 0
        tst_sup.SupUseAddInput = 0
        tst_sup.SupAddInputPath = "test"
        tst_sup.SupCalcWetDep = 0
        self.assertEqual(asb.get_sup().to_string(), tst_sup.to_string())

    def test_get_met(self):
        asb = AdmsAplShipBuilder(helper.get_default_apl_builder_data(helper))
        tst_met = AdmsMet()
        tst_met.MetDataFileWellFormedPath = "test"
        tst_met.MetLatitude = 1
        self.assertEqual(asb.get_met().to_string(), tst_met.to_string())

    def test_get_hil(self):
        asb = AdmsAplShipBuilder(helper.get_default_apl_builder_data(helper))
        tst_hil = AdmsHil()
        tst_hil.HilTerrainPath = Constants.FILEPATH_HIL_HK
        self.assertEqual(asb.get_hil().to_string(), tst_hil.to_string())

    def test_get_bkg(self):
        asb = AdmsAplShipBuilder(helper.get_default_apl_builder_data(helper))
        tst_bkg = AdmsBkg()
        tst_bkg.BkgFilePath = "test"
        self.assertEqual(asb.get_bkg().to_string(), tst_bkg.to_string())
    # Renamed from a duplicate "test_get_bkg": this method exercises get_etc(), and the
    # duplicate name silently shadowed the real test_get_bkg above, so it never ran.
    def test_get_etc(self):
        asb = AdmsAplShipBuilder(helper.get_default_apl_builder_data(helper))
        tst_etc = AdmsEtc()
        tst_etc.SrcNumSources = len(helper.get_default_apl_builder_data(helper)[Constants.KEY_SRC])
        self.assertEqual(asb.get_etc().to_string(), tst_etc.to_string())
    def test_get_pol_wet_washout(self):
        asb = AdmsAplShipBuilder(helper.get_default_apl_builder_data(helper))
        type_1 = [Constants.POL_Cl2, Constants.POL_CH3Cl, Constants.POL_ISOBUTYLENE,
                  Constants.POL_NH3, Constants.POL_HC]
        for name in type_1:
            self.assertEqual(asb.get_pol_wet_washout(name), 1.0e-4)
        self.assertEqual(asb.get_pol_wet_washout("test"), 0.0e+0)
        self.assertEqual(asb.get_pol_wet_washout(Constants.POL_PART_SO2), 2.0e-4)
        self.assertEqual(asb.get_pol_wet_washout(Constants.POL_PM10), 3.0e-4)
    def test_get_pollutants(self):
        asb = AdmsAplShipBuilder(helper.get_default_apl_builder_data(helper))
        polls = asb.get_pollutants()
        self.assertIsNotNone(polls)
        self.assertEqual(len(polls), 16)
        expected_names = [Constants.POL_CO2, Constants.POL_NOX, Constants.POL_NO2,
                          Constants.POL_NO, Constants.POL_PART_O3, Constants.POL_VOC,
                          Constants.POL_PART_SO2, Constants.POL_CO, Constants.POL_BENZENE,
                          Constants.POL_BUTADIENE, Constants.POL_HCl, Constants.POL_Cl2,
                          Constants.POL_CH3Cl, Constants.POL_ISOBUTYLENE, Constants.POL_NH3,
                          Constants.POL_HC]
        for i, name in enumerate(expected_names):
            self.assertEqual(polls[i].PolName, name)
    # Pollutant values are the same as in the superclass, so those tests are not repeated here.
class AplPlantBuilderTest(unittest.TestCase):
    def test_get_sup(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        tst_sup = AdmsSup()
        tst_sup.SupModelComplexTerrain = 0
        tst_sup.SupCalcChm = 0
        tst_sup.SupCalcWetDep = 0
        tst_sup.SupCalcPlumeVisibility = 0
        self.assertEqual(apb.get_sup().to_string(), tst_sup.to_string())

    def test_get_met(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        tst_met = AdmsMet()
        tst_met.Met_DS_Roughness = 1.5e+0
        tst_met.MetDataFileWellFormedPath = "test"
        tst_met.MetLatitude = 1.09e+0
        self.assertEqual(apb.get_met().to_string(), tst_met.to_string())

    def test_get_hil(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        tst_hil = AdmsHil()
        tst_hil.HilTerrainPath = Constants.FILEPATH_HIL_SG
        self.assertEqual(apb.get_hil().to_string(), tst_hil.to_string())

    def test_get_bkg(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        tst_bkg = AdmsBkg()
        tst_bkg.BkgFilePath = Constants.FILEPATH_HIL_BGD
        tst_bkg.BkgFixedLevels = 1
        self.assertEqual(apb.get_bkg().to_string(), tst_bkg.to_string())

    def test_get_etc(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        tst_etc = AdmsEtc()
        tst_etc.SrcNumSources = 1
        tst_etc.PolNumPollutants = 18
        self.assertEqual(apb.get_etc().to_string(), tst_etc.to_string())
    def test_get_pol_wet_washout_known(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(apb.get_pol_wet_washout_known("test"), 0)
        type_1 = [Constants.POL_NO, Constants.POL_PART_O3, Constants.POL_PART_SO2,
                  Constants.POL_PM10, Constants.POL_BENZENE, Constants.POL_BUTADIENE,
                  Constants.POL_Cl2, Constants.POL_CH3Cl, Constants.POL_ISOBUTYLENE,
                  Constants.POL_HC, Constants.POL_NH3, Constants.POL_HCl, Constants.POL_CO,
                  Constants.POL_PM25, Constants.POL_NO2, Constants.POL_NOX, Constants.POL_CO2]
        for name in type_1:
            self.assertEqual(apb.get_pol_wet_washout_known(name), 1)
    def test_get_pol_bkg_level(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_CO2), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_NOX), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_NO2), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_NO), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_PART_O3), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_VOC), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_PART_SO2), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_PM10), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_PM25), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_CO), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_BENZENE), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_BUTADIENE), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_HCl), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_Cl2), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_CH3Cl), 6.0e-1)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_ISOBUTYLENE), 0.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_NH3), 6.0e+0)
        self.assertEqual(apb.get_pol_bkg_level(Constants.POL_HC), 0.0e+0)
    def test_get_pol_gas_dep_velocity_known(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(apb.get_pol_gas_dep_velocity_known("test"), 1)
        type_0 = [Constants.POL_Cl2, Constants.POL_CH3Cl, Constants.POL_ISOBUTYLENE,
                  Constants.POL_NH3, Constants.POL_HC]
        for name in type_0:
            self.assertEqual(apb.get_pol_gas_dep_velocity_known(name), 0)

    def test_get_pol_gas_dep_velocity(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(apb.get_pol_gas_dep_velocity("test"), 0.0e+0)
        self.assertEqual(apb.get_pol_gas_dep_velocity(Constants.POL_Cl2), 5.0e+0)

    def test_get_pol_par_dep_velocity_known(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(apb.get_pol_par_dep_velocity_known("test"), 1)

    def test_get_pol_par_term_velocity_known(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(apb.get_pol_par_term_velocity_known("test"), 1)

    def test_get_pol_wet_washout_a(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(apb.get_pol_wet_washout_a("test"), 1.0e-4)

    def test_get_pol_wet_washout_b(self):
        apb = AdmsAplPlantBuilder(helper.get_default_apl_builder_data(helper))
        self.assertEqual(apb.get_pol_wet_washout_b("test"), 6.4e-1)
# tests/query_test/test_tpcds_queries.py (Apache Impala, Apache-2.0)
] | null | null | null | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# Functional tests running the TPC-DS workload
#
import pytest
from tests.common.impala_test_suite import ImpalaTestSuite
from tests.common.test_dimensions import (
    create_single_exec_option_dimension,
    is_supported_insert_format)
class TestTpcdsQuery(ImpalaTestSuite):
  @classmethod
  def get_workload(cls):
    return 'tpcds'

  @classmethod
  def add_test_dimensions(cls):
    super(TestTpcdsQuery, cls).add_test_dimensions()
    cls.ImpalaTestMatrix.add_constraint(lambda v:
        v.get_value('table_format').file_format not in ['rc', 'hbase', 'kudu'] and
        v.get_value('table_format').compression_codec in ['none', 'snap'] and
        v.get_value('table_format').compression_type != 'record')
    cls.ImpalaTestMatrix.add_mandatory_exec_option('decimal_v2', 0)

    if cls.exploration_strategy() != 'exhaustive':
      # Cut down on the execution time for these tests in core by running only
      # against parquet.
      cls.ImpalaTestMatrix.add_constraint(lambda v:
          v.get_value('table_format').file_format in ['parquet'])

    cls.ImpalaTestMatrix.add_constraint(lambda v:
        v.get_value('exec_option')['batch_size'] == 0)
  @pytest.mark.execute_serially
  # Marked serially to make sure it runs first.
  def test_tpcds_count(self, vector):
    self.run_test_case('count', vector)

  def test_tpcds_q1(self, vector):
    self.run_test_case(self.get_workload() + '-q1', vector)

  def test_tpcds_q2(self, vector):
    self.run_test_case(self.get_workload() + '-q2', vector)

  def test_tpcds_q3(self, vector):
    self.run_test_case(self.get_workload() + '-q3', vector)

  def test_tpcds_q4(self, vector):
    self.run_test_case(self.get_workload() + '-q4', vector)

  def test_tpcds_q6(self, vector):
    self.run_test_case(self.get_workload() + '-q6', vector)

  def test_tpcds_q7(self, vector):
    self.run_test_case(self.get_workload() + '-q7', vector)

  def test_tpcds_q8(self, vector):
    self.run_test_case(self.get_workload() + '-q8', vector)

  def test_tpcds_q9(self, vector):
    self.run_test_case(self.get_workload() + '-q9', vector)

  def test_tpcds_q10a(self, vector):
    self.run_test_case(self.get_workload() + '-q10a', vector)

  def test_tpcds_q11(self, vector):
    self.run_test_case(self.get_workload() + '-q11', vector)

  def test_tpcds_q12(self, vector):
    self.run_test_case(self.get_workload() + '-q12', vector)

  def test_tpcds_q13(self, vector):
    self.run_test_case(self.get_workload() + '-q13', vector)

  def test_tpcds_q15(self, vector):
    self.run_test_case(self.get_workload() + '-q15', vector)

  def test_tpcds_q16(self, vector):
    self.run_test_case(self.get_workload() + '-q16', vector)

  def test_tpcds_q17(self, vector):
    self.run_test_case(self.get_workload() + '-q17', vector)

  def test_tpcds_q18a(self, vector):
    self.run_test_case(self.get_workload() + '-q18a', vector)

  def test_tpcds_q19(self, vector):
    self.run_test_case(self.get_workload() + '-q19', vector)

  def test_tpcds_q20(self, vector):
    self.run_test_case(self.get_workload() + '-q20', vector)

  def test_tpcds_q21(self, vector):
    self.run_test_case(self.get_workload() + '-q21', vector)
def test_tpcds_q23_1(self, vector):
self.run_test_case(self.get_workload() + '-q23-1', vector)
def test_tpcds_q23_2(self, vector):
self.run_test_case(self.get_workload() + '-q23-2', vector)
def test_tpcds_q24_1(self, vector):
self.run_test_case(self.get_workload() + '-q24-1', vector)
def test_tpcds_q24_2(self, vector):
self.run_test_case(self.get_workload() + '-q24-2', vector)
def test_tpcds_q25(self, vector):
self.run_test_case(self.get_workload() + '-q25', vector)
def test_tpcds_q26(self, vector):
self.run_test_case(self.get_workload() + '-q26', vector)
def test_tpcds_q29(self, vector):
self.run_test_case(self.get_workload() + '-q29', vector)
def test_tpcds_q30(self, vector):
self.run_test_case(self.get_workload() + '-q30', vector)
def test_tpcds_q32(self, vector):
self.run_test_case(self.get_workload() + '-q32', vector)
def test_tpcds_q33(self, vector):
self.run_test_case(self.get_workload() + '-q33', vector)
def test_tpcds_q34(self, vector):
self.run_test_case(self.get_workload() + '-q34', vector)
def test_tpcds_q37(self, vector):
self.run_test_case(self.get_workload() + '-q37', vector)
def test_tpcds_q39_1(self, vector):
self.run_test_case(self.get_workload() + '-q39-1', vector)
def test_tpcds_q39_2(self, vector):
self.run_test_case(self.get_workload() + '-q39-2', vector)
def test_tpcds_q40(self, vector):
self.run_test_case(self.get_workload() + '-q40', vector)
def test_tpcds_q41(self, vector):
self.run_test_case(self.get_workload() + '-q41', vector)
def test_tpcds_q42(self, vector):
self.run_test_case(self.get_workload() + '-q42', vector)
def test_tpcds_q43(self, vector):
self.run_test_case(self.get_workload() + '-q43', vector)
def test_tpcds_q44(self, vector):
self.run_test_case(self.get_workload() + '-q44', vector)
def test_tpcds_q46(self, vector):
self.run_test_case(self.get_workload() + '-q46', vector)
def test_tpcds_q47(self, vector):
self.run_test_case(self.get_workload() + '-q47', vector)
def test_tpcds_q48(self, vector):
self.run_test_case(self.get_workload() + '-q48', vector)
def test_tpcds_q50(self, vector):
self.run_test_case(self.get_workload() + '-q50', vector)
def test_tpcds_q51(self, vector):
self.run_test_case(self.get_workload() + '-q51', vector)
def test_tpcds_q51a(self, vector):
self.run_test_case(self.get_workload() + '-q51a', vector)
def test_tpcds_q52(self, vector):
self.run_test_case(self.get_workload() + '-q52', vector)
def test_tpcds_q53(self, vector):
self.run_test_case(self.get_workload() + '-q53', vector)
def test_tpcds_q54(self, vector):
self.run_test_case(self.get_workload() + '-q54', vector)
def test_tpcds_q55(self, vector):
self.run_test_case(self.get_workload() + '-q55', vector)
def test_tpcds_q56(self, vector):
self.run_test_case(self.get_workload() + '-q56', vector)
def test_tpcds_q57(self, vector):
self.run_test_case(self.get_workload() + '-q57', vector)
def test_tpcds_q58(self, vector):
self.run_test_case(self.get_workload() + '-q58', vector)
def test_tpcds_q59(self, vector):
self.run_test_case(self.get_workload() + '-q59', vector)
def test_tpcds_q60(self, vector):
self.run_test_case(self.get_workload() + '-q60', vector)
def test_tpcds_q61(self, vector):
self.run_test_case(self.get_workload() + '-q61', vector)
def test_tpcds_q62(self, vector):
self.run_test_case(self.get_workload() + '-q62', vector)
def test_tpcds_q63(self, vector):
self.run_test_case(self.get_workload() + '-q63', vector)
def test_tpcds_q64(self, vector):
self.run_test_case(self.get_workload() + '-q64', vector)
def test_tpcds_q65(self, vector):
self.run_test_case(self.get_workload() + '-q65', vector)
def test_tpcds_q67a(self, vector):
self.run_test_case(self.get_workload() + '-q67a', vector)
def test_tpcds_q68(self, vector):
self.run_test_case(self.get_workload() + '-q68', vector)
def test_tpcds_q69(self, vector):
self.run_test_case(self.get_workload() + '-q69', vector)
def test_tpcds_q70a(self, vector):
self.run_test_case(self.get_workload() + '-q70a', vector)
def test_tpcds_q71(self, vector):
self.run_test_case(self.get_workload() + '-q71', vector)
def test_tpcds_q72(self, vector):
self.run_test_case(self.get_workload() + '-q72', vector)
def test_tpcds_q73(self, vector):
self.run_test_case(self.get_workload() + '-q73', vector)
def test_tpcds_q74(self, vector):
self.run_test_case(self.get_workload() + '-q74', vector)
def test_tpcds_q75(self, vector):
self.run_test_case(self.get_workload() + '-q75', vector)
def test_tpcds_q76(self, vector):
self.run_test_case(self.get_workload() + '-q76', vector)
def test_tpcds_q77a(self, vector):
self.run_test_case(self.get_workload() + '-q77a', vector)
def test_tpcds_q78(self, vector):
self.run_test_case(self.get_workload() + '-q78', vector)
def test_tpcds_q79(self, vector):
self.run_test_case(self.get_workload() + '-q79', vector)
def test_tpcds_q80a(self, vector):
self.run_test_case(self.get_workload() + '-q80a', vector)
def test_tpcds_q81(self, vector):
self.run_test_case(self.get_workload() + '-q81', vector)
def test_tpcds_q82(self, vector):
self.run_test_case(self.get_workload() + '-q82', vector)
def test_tpcds_q83(self, vector):
self.run_test_case(self.get_workload() + '-q83', vector)
def test_tpcds_q84(self, vector):
self.run_test_case(self.get_workload() + '-q84', vector)
def test_tpcds_q85(self, vector):
self.run_test_case(self.get_workload() + '-q85', vector)
def test_tpcds_q86a(self, vector):
self.run_test_case(self.get_workload() + '-q86a', vector)
def test_tpcds_q88(self, vector):
self.run_test_case(self.get_workload() + '-q88', vector)
def test_tpcds_q89(self, vector):
self.run_test_case(self.get_workload() + '-q89', vector)
def test_tpcds_q91(self, vector):
self.run_test_case(self.get_workload() + '-q91', vector)
def test_tpcds_q92(self, vector):
self.run_test_case(self.get_workload() + '-q92', vector)
def test_tpcds_q94(self, vector):
self.run_test_case(self.get_workload() + '-q94', vector)
def test_tpcds_q95(self, vector):
self.run_test_case(self.get_workload() + '-q95', vector)
def test_tpcds_q96(self, vector):
self.run_test_case(self.get_workload() + '-q96', vector)
def test_tpcds_q97(self, vector):
self.run_test_case(self.get_workload() + '-q97', vector)
def test_tpcds_q98(self, vector):
self.run_test_case(self.get_workload() + '-q98', vector)
def test_tpcds_q99(self, vector):
self.run_test_case(self.get_workload() + '-q99', vector)

class TestTpcdsDecimalV2Query(ImpalaTestSuite):
  @classmethod
  def get_workload(cls):
    return 'tpcds'

  @classmethod
  def add_test_dimensions(cls):
    super(TestTpcdsDecimalV2Query, cls).add_test_dimensions()
    cls.ImpalaTestMatrix.add_constraint(lambda v:
        v.get_value('table_format').file_format not in ['rc', 'hbase', 'kudu'] and
        v.get_value('table_format').compression_codec in ['none', 'snap'] and
        v.get_value('table_format').compression_type != 'record')
    if cls.exploration_strategy() != 'exhaustive':
      # Cut down on the execution time for these tests in core by running only
      # against parquet.
      cls.ImpalaTestMatrix.add_constraint(lambda v:
          v.get_value('table_format').file_format in ['parquet'])
    cls.ImpalaTestMatrix.add_constraint(lambda v:
        v.get_value('exec_option')['batch_size'] == 0)

  def test_tpcds_q1(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q1', vector)

  def test_tpcds_q2(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q2', vector)

  def test_tpcds_q3(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q3', vector)

  def test_tpcds_q4(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q4', vector)

  def test_tpcds_q5(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q5', vector)

  def test_tpcds_q6(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q6', vector)

  def test_tpcds_q7(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q7', vector)

  def test_tpcds_q8(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q8', vector)

  def test_tpcds_q9(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q9', vector)

  def test_tpcds_q10a(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q10a', vector)

  def test_tpcds_q11(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q11', vector)

  def test_tpcds_q12(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q12', vector)

  def test_tpcds_q13(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q13', vector)

  def test_tpcds_q15(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q15', vector)

  def test_tpcds_q16(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q16', vector)

  def test_tpcds_q17(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q17', vector)

  def test_tpcds_q18(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q18', vector)

  def test_tpcds_q18a(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q18a', vector)

  def test_tpcds_q19(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q19', vector)

  def test_tpcds_q20(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q20', vector)

  def test_tpcds_q21(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q21', vector)

  def test_tpcds_q22(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q22', vector)

  def test_tpcds_q22a(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q22a', vector)

  def test_tpcds_q25(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q25', vector)

  def test_tpcds_q26(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q26', vector)

  def test_tpcds_q27(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q27', vector)

  def test_tpcds_q29(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q29', vector)

  def test_tpcds_q30(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q30', vector)

  def test_tpcds_q31(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q31', vector)

  def test_tpcds_q32(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q32', vector)

  def test_tpcds_q33(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q33', vector)

  def test_tpcds_q34(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q34', vector)

  def test_tpcds_q36(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q36', vector)

  def test_tpcds_q37(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q37', vector)

  def test_tpcds_q38(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q38-rewrite', vector)

  def test_tpcds_q39_1(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q39-1', vector)

  def test_tpcds_q39_2(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q39-2', vector)

  def test_tpcds_q40(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q40', vector)

  def test_tpcds_q41(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q41', vector)

  def test_tpcds_q42(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q42', vector)

  def test_tpcds_q43(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q43', vector)

  def test_tpcds_q45(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q45', vector)

  def test_tpcds_q46(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q46', vector)

  def test_tpcds_q47(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q47', vector)

  def test_tpcds_q48(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q48', vector)

  def test_tpcds_q50(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q50', vector)

  def test_tpcds_q51(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q51', vector)

  def test_tpcds_q51a(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q51a', vector)

  def test_tpcds_q52(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q52', vector)

  def test_tpcds_q53(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q53', vector)

  def test_tpcds_q54(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q54', vector)

  def test_tpcds_q55(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q55', vector)

  def test_tpcds_q56(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q56', vector)

  def test_tpcds_q57(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q57', vector)

  def test_tpcds_q58(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q58', vector)

  def test_tpcds_q59(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q59', vector)

  def test_tpcds_q60(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q60', vector)

  def test_tpcds_q61(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q61', vector)

  def test_tpcds_q62(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q62', vector)

  def test_tpcds_q63(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q63', vector)

  def test_tpcds_q64(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q64', vector)

  def test_tpcds_q65(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q65', vector)

  def test_tpcds_q67(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q67', vector)

  def test_tpcds_q67a(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q67a', vector)

  def test_tpcds_q68(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q68', vector)

  def test_tpcds_q69(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q69', vector)

  def test_tpcds_q70(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q70', vector)

  def test_tpcds_q70a(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q70a', vector)

  def test_tpcds_q71(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q71', vector)

  def test_tpcds_q72(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q72', vector)

  def test_tpcds_q73(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q73', vector)

  def test_tpcds_q74(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q74', vector)

  def test_tpcds_q75(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q75', vector)

  def test_tpcds_q76(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q76', vector)

  def test_tpcds_q77(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q77', vector)

  def test_tpcds_q77a(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q77a', vector)

  def test_tpcds_q78(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q78', vector)

  def test_tpcds_q79(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q79', vector)

  def test_tpcds_q80(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q80', vector)

  def test_tpcds_q80a(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q80a', vector)

  def test_tpcds_q81(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q81', vector)

  def test_tpcds_q82(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q82', vector)

  def test_tpcds_q83(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q83', vector)

  def test_tpcds_q84(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q84', vector)

  def test_tpcds_q85(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q85', vector)

  def test_tpcds_q86(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q86', vector)

  def test_tpcds_q86a(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q86a', vector)

  def test_tpcds_q88(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q88', vector)

  def test_tpcds_q89(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q89', vector)

  def test_tpcds_q91(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q91', vector)

  def test_tpcds_q92(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q92', vector)

  def test_tpcds_q94(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q94', vector)

  def test_tpcds_q95(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q95', vector)

  def test_tpcds_q96(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q96', vector)

  def test_tpcds_q97(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q97', vector)

  def test_tpcds_q98(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q98', vector)

  def test_tpcds_q99(self, vector):
    self.run_test_case(self.get_workload() + '-decimal_v2-q99', vector)

class TestTpcdsInsert(ImpalaTestSuite):
  @classmethod
  def get_workload(cls):
    return TestTpcdsQuery.get_workload() + '-insert'

  @classmethod
  def add_test_dimensions(cls):
    super(TestTpcdsInsert, cls).add_test_dimensions()
    cls.ImpalaTestMatrix.add_dimension(create_single_exec_option_dimension())
    cls.ImpalaTestMatrix.add_constraint(lambda v:
        is_supported_insert_format(v.get_value('table_format')))

  def test_tpcds_partitioned_insert(self, vector):
    self.run_test_case('partitioned-insert', vector)

  def test_expr_insert(self, vector):
    self.run_test_case('expr-insert', vector)

class TestTpcdsUnmodified(ImpalaTestSuite):
  @classmethod
  def get_workload(cls):
    return 'tpcds-unmodified'

  @classmethod
  def add_test_dimensions(cls):
    super(TestTpcdsUnmodified, cls).add_test_dimensions()
    cls.ImpalaTestMatrix.add_constraint(lambda v:
        v.get_value('table_format').file_format not in ['rc', 'hbase', 'kudu'] and
        v.get_value('table_format').compression_codec in ['none', 'snap'] and
        v.get_value('table_format').compression_type != 'record')
    if cls.exploration_strategy() != 'exhaustive':
      # Cut down on the execution time for these tests in core by running only
      # against parquet.
      cls.ImpalaTestMatrix.add_constraint(lambda v:
          v.get_value('table_format').file_format in ['parquet'])
    cls.ImpalaTestMatrix.add_constraint(lambda v:
        v.get_value('exec_option')['batch_size'] == 0)

  def test_tpcds_q31(self, vector):
    self.run_test_case('tpcds-q31', vector)

  def test_tpcds_q35a(self, vector):
    self.run_test_case('tpcds-q35a', vector)

  def test_tpcds_q48(self, vector):
    self.run_test_case('tpcds-q48', vector)

  def test_tpcds_q59(self, vector):
    self.run_test_case('tpcds-q59', vector)

  def test_tpcds_q89(self, vector):
    self.run_test_case('tpcds-q89', vector)

# --- gptest/__init__.py (dislabled/GoProCLI, MIT) ---
from gptest import GoProCamera
from gptest import constants

# --- codes/globo_videos_cuts/core/tests/models/__init__.py (lariodiniz/teste_meta, MIT) ---
from .programs_model_test_case import ProgramsModelTestCase
from .cutting_job_model_test_case import CuttingJobsModelTestCase
from .processed_video_model_test_case import ProcessedVideoModelTestCase

# --- test_gamecards.py (encodis/gamecards, MIT) ---
import os
from gamecards import gamecards
def test_gamecards(tmpdir):
    csv_file = tmpdir.join('cards.csv')
    csv_file.write(f'''ID,Field1
1,Field 1 text line 1
2,Field 1 text line 2
3,Field 1 text line 3
4,Field 1 text line 4
5,Field 1 text line 5
''')
    tpl_file = tmpdir.join('cards.tpl')
    tpl_file.write('''<div class='card'>
<div class='field1'>
<p>${Field1}</p>
</div>
<div class='id1'>
<p>${ID}</p>
</div>
</div>
''')
    out_file = tmpdir.join('cards.html')
    gamecards(csv_file, tpl_file, 'cards.css', out_file, rows=2, cols=2)

    # assert correct output files exist
    assert os.path.exists(out_file)

    # assert the generated HTML matches the expected card layout
    expect = '''<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<meta name="generator" content="gamecards" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" />
<title>Game Cards</title>
<style>
@media print {
.page { page-break-after: always; }
}
</style>
<link rel="stylesheet" href="cards.css"/>
</head>
<body>
<div id="gamecards">
<table class="page"><tr><td><div class='card'>
<div class='field1'>
<p>Field 1 text line 1</p>
</div>
<div class='id1'>
<p>1</p>
</div>
</div>
</td><td><div class='card'>
<div class='field1'>
<p>Field 1 text line 2</p>
</div>
<div class='id1'>
<p>2</p>
</div>
</div>
</td></tr><tr><td><div class='card'>
<div class='field1'>
<p>Field 1 text line 3</p>
</div>
<div class='id1'>
<p>3</p>
</div>
</div>
</td><td><div class='card'>
<div class='field1'>
<p>Field 1 text line 4</p>
</div>
<div class='id1'>
<p>4</p>
</div>
</div>
</td></tr></table><table class="page"><tr><td><div class='card'>
<div class='field1'>
<p>Field 1 text line 5</p>
</div>
<div class='id1'>
<p>5</p>
</div>
</div>
</td></tr></table>
</div>
</body>
</html>
'''
    with open(out_file, 'r', encoding='utf8') as fh:
        actual = fh.read()

    assert expect == actual
def test_gamecards_single(tmpdir):
    csv_file = tmpdir.join('cards.csv')
    csv_file.write(f'''ID,Field1,Field2
1,Field 1 text,"Field 2, with comma"
''')
    tpl_file = tmpdir.join('cards.tpl')
    tpl_file.write('''<div class='card'>
<div class='field1'>
<p>${Field1}</p>
</div>
<div class='field2'>
<p>${Field2}</p>
</div>
<div class='id1'>
<p>${ID}</p>
</div>
</div>
''')
    out_file = tmpdir.join('cards.html')
    gamecards(csv_file, tpl_file, 'cards.css', out_file)

    # assert correct output files exist
    assert os.path.exists(out_file)

    # assert the generated HTML matches the expected card layout
    expect = '''<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<meta name="generator" content="gamecards" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" />
<title>Game Cards</title>
<style>
@media print {
.page { page-break-after: always; }
}
</style>
<link rel="stylesheet" href="cards.css"/>
</head>
<body>
<div id="gamecards">
<table class="page"><tr><td><div class='card'>
<div class='field1'>
<p>Field 1 text</p>
</div>
<div class='field2'>
<p>Field 2, with comma</p>
</div>
<div class='id1'>
<p>1</p>
</div>
</div>
</td></tr></table>
</div>
</body>
</html>
'''
    with open(out_file, 'r', encoding='utf8') as fh:
        actual = fh.read()

    assert expect == actual
def test_gamecards_multiple_styles(tmpdir):
    csv_file = tmpdir.join('cards.csv')
    csv_file.write(f'''ID,Field1,Field2
1,Field 1 text,"Field 2, with comma"
''')
    tpl_file = tmpdir.join('cards.tpl')
    tpl_file.write('''<div class='card'>
<div class='field1'>
<p>${Field1}</p>
</div>
<div class='field2'>
<p>${Field2}</p>
</div>
<div class='id1'>
<p>${ID}</p>
</div>
</div>
''')
    out_file = tmpdir.join('cards.html')
    gamecards(csv_file, tpl_file, 'cards1.css,cards2.css', out_file)

    # assert correct output files exist
    assert os.path.exists(out_file)

    # assert the generated HTML matches the expected card layout
    expect = '''<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<meta name="generator" content="gamecards" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" />
<title>Game Cards</title>
<style>
@media print {
.page { page-break-after: always; }
}
</style>
<link rel="stylesheet" href="cards1.css"/>
<link rel="stylesheet" href="cards2.css"/>
</head>
<body>
<div id="gamecards">
<table class="page"><tr><td><div class='card'>
<div class='field1'>
<p>Field 1 text</p>
</div>
<div class='field2'>
<p>Field 2, with comma</p>
</div>
<div class='id1'>
<p>1</p>
</div>
</div>
</td></tr></table>
</div>
</body>
</html>
'''
    with open(out_file, 'r', encoding='utf8') as fh:
        actual = fh.read()

    assert expect == actual
def test_gamecards_no_styles(tmpdir):
    csv_file = tmpdir.join('cards.csv')
    csv_file.write(f'''ID,Field1,Field2
1,Field 1 text,"Field 2, with comma"
''')
    tpl_file = tmpdir.join('cards.tpl')
    tpl_file.write('''<div class='card'>
<div class='field1'>
<p>${Field1}</p>
</div>
<div class='field2'>
<p>${Field2}</p>
</div>
<div class='id1'>
<p>${ID}</p>
</div>
</div>
''')
    out_file = tmpdir.join('cards.html')
    gamecards(csv_file, tpl_file, '', out_file)

    # assert correct output files exist
    assert os.path.exists(out_file)

    # assert the generated HTML matches the expected card layout
    expect = '''<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<meta name="generator" content="gamecards" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" />
<title>Game Cards</title>
<style>
@media print {
.page { page-break-after: always; }
}
</style>
</head>
<body>
<div id="gamecards">
<table class="page"><tr><td><div class='card'>
<div class='field1'>
<p>Field 1 text</p>
</div>
<div class='field2'>
<p>Field 2, with comma</p>
</div>
<div class='id1'>
<p>1</p>
</div>
</div>
</td></tr></table>
</div>
</body>
</html>
'''
    with open(out_file, 'r', encoding='utf8') as fh:
        actual = fh.read()

    assert expect == actual

# --- ast-transformations-core/src/test/resources/org/jetbrains/research/ml/ast/transformations/expressionUnification/data/in_1_basic.py (JetBrains-Research/ast-transformations, MIT) ---
expr = (4 + 1) + (3 + 0)
a = 3 + 2 * 1
exp = 3 | 2 ^ 1
e = (4 & 1) | (3 & 0)
s = f'hello, {3 + 2 + 1 + 4}'

# --- tests/mock_graphql_payloads.py (hmrc/github_rename_utils, Apache-2.0) ---
repo_list_multi_page_1 = """
{
"data": {
"organization": {
"id": "erhtjeKJHF4WjkhjkkEEbnncxerhtje=",
"name": "my-org",
"team": {
"name": "my-team",
"repositories": {
"totalCount": 111,
"pageInfo": {
"endCursor": "fd3kle2jkKLfdsklswHTjk==",
"hasNextPage": true
},
"edges": [
{
"permission": "WRITE",
"node": {
"isArchived": false,
"id": "fdFJKDSFJDSFDKLSjreldjs9c9dmsk=",
"name": "repo1",
"defaultBranchRef": {
"id": "fhddjksfhdfhdsfhhjew3RENCKCIDSHFHDdjksfhdfhd",
"name": "old-branch"
},
"branchProtectionRules": {
"edges": [
{
"node": {
"matchingRefs": {
"totalCount": 1,
"nodes": [
{
"name": "old-branch"
}
]
},
"requiredStatusCheckContexts": [
"some-check-pr-builder"
]
}
}
]
},
"pullRequests": {
"totalCount": 8
},
"ref": {
"name": "old-branch"
}
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"id": "djklsfjkdfjdklsfjdklsjfksdlsfj=",
"name": "repo2",
"defaultBranchRef": {
"id": "fdjksfjdkfjdklsjfdklsjfkdlsfjkldsfjdksllfdff",
"name": "old-branch"
},
"branchProtectionRules": {
"edges": [
{
"node": {
"matchingRefs": {
"totalCount": 1,
"nodes": [
{
"name": "old-branch"
}
]
},
"requiredStatusCheckContexts": []
}
}
]
},
"pullRequests": {
"totalCount": 4
},
"ref": {
"name": "old-branch"
}
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"id": "djklsfjkdfjdklsfjdklsjfksdlsfj=",
"name": "repo3",
"defaultBranchRef": {
"id": "fdjksfjdkfjdklsjfdklsjfkdlsfjkldsfjdksllfdff",
"name": "old-branch"
},
"branchProtectionRules": {
"edges": [
{
"node": {
"matchingRefs": {
"totalCount": 1,
"nodes": [
{
"name": "old-branch"
}
]
},
"requiredStatusCheckContexts": []
}
}
]
},
"pullRequests": {
"totalCount": 4
},
"ref": {
"name": "old-branch"
}
}
}
]
}
}
}
}
}
"""
repo_list_multi_page_2 = """
{
"data": {
"organization": {
"id": "erhtjeKJHF4WjkhjkkEEbnncxerhtje=",
"name": "my-org",
"team": {
"name": "my-team",
"repositories": {
"totalCount": 111,
"pageInfo": {
"endCursor": "Xd3kle2jkKLfdsklswHTjk==",
"hasNextPage": false
},
"edges": [
{
"permission": "WRITE",
"node": {
"isArchived": true,
"id": "fdhdjkshk6778T6HJGUf4wffdhsFkfh=",
"name": "repo-fhddjksfhk",
"defaultBranchRef": {
"id": "fdjksfjdkfjdklsjfdklsjfkdlsfjkldsfjdksllfdff",
"name": "main"
},
"branchProtectionRules": {
"edges": [
{
"node": {
"matchingRefs": {
"totalCount": 1,
"nodes": [
{
"name": "main"
}
]
},
"requiredStatusCheckContexts": []
}
}
]
},
"pullRequests": {
"totalCount": 0
},
"ref": null
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"id": "fdFJKDSFJDSFDKLSjreldjs9c9dmsk=",
"name": "repo-saderwkrhs",
"defaultBranchRef": {
"id": "fhddjksfhdfhdsfhhjew3RENCKCIDSHFHDdjksfhdf==",
"name": "main"
},
"branchProtectionRules": {
"edges": [
{
"node": {
"matchingRefs": {
"totalCount": 1,
"nodes": [
{
"name": "main"
}
]
},
"requiredStatusCheckContexts": []
}
}
]
},
"pullRequests": {
"totalCount": 0
},
"ref": null
}
},
{
"permission": "READ",
"node": {
"isArchived": false,
"id": "fdFJKDSFJDSFDKLSjreldjs9c9dmsk=",
"name": "repo-thjrecvix",
"defaultBranchRef": {
"id": "fhddjksfhdfhdsfhhjew3RENCKCIDSHFHDdjksfhdfhd",
"name": "old-branch"
},
"branchProtectionRules": {
"edges": [
{
"node": {
"matchingRefs": {
"totalCount": 1,
"nodes": [
{
"name": "old-branch"
}
]
},
"requiredStatusCheckContexts": []
}
}
]
},
"pullRequests": {
"totalCount": 0
},
"ref": {
"name": "old-branch"
}
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"id": "fd789DSHFsvcxjkfrketsancdsnjKfe=",
"name": "repo-fdhiob",
"defaultBranchRef": {
"id": "mkBF68fsdbk7LCFDBNFREbfd4bxsjkfBHFDJGFdrbk==",
"name": "main"
},
"branchProtectionRules": {
"edges": [
{
"node": {
"matchingRefs": {
"totalCount": 1,
"nodes": [
{
"name": "main"
}
]
},
"requiredStatusCheckContexts": []
}
}
]
},
"pullRequests": {
"totalCount": 0
},
"ref": null
}
}
]
}
}
}
}
}
"""
team_name_list_single_page = """
{
"data": {
"rateLimit": {
"limit": 5000,
"cost": 1,
"remaining": 4989,
"resetAt": "2021-06-18T13:41:33Z"
},
"organization": {
"name": "My Org",
"teams": {
"totalCount": 223,
"pageInfo": {
"hasNextPage": false,
"endCursor": ""
},
"nodes": [
{
"slug": "my-team"
},
{
"slug": "my-admin-team"
},
{
"slug": "justice-league"
}
]
}
}
}
}
"""
team_name_list_multi_page_1 = """
{
"data": {
"rateLimit": {
"limit": 5000,
"cost": 1,
"remaining": 4989,
"resetAt": "2021-06-18T13:41:33Z"
},
"organization": {
"name": "My Org",
"teams": {
"totalCount": 223,
"pageInfo": {
"hasNextPage": true,
"endCursor": "fd3kle2jkKLfdsklswHTjk=="
},
"nodes": [
{
"slug": "my-team"
},
{
"slug": "my-admin-team"
},
{
"slug": "justice-league"
}
]
}
}
}
}
"""
team_name_list_multi_page_2 = """
{
"data": {
"rateLimit": {
"limit": 5000,
"cost": 1,
"remaining": 4989,
"resetAt": "2021-06-18T13:41:33Z"
},
"organization": {
"name": "my-org",
"teams": {
"totalCount": 223,
"pageInfo": {
"hasNextPage": false,
"endCursor": ""
},
"nodes": [
{
"slug": "my-additional-team"
}
]
}
}
}
}
"""
team_repo_list_single_page = """
{
"data": {
"organization": {
"id": "erhtjeKJHF4WjkhjkkEEbnncxerhtje=",
"name": "my-org",
"team": {
"name": "my-team",
"repositories": {
"totalCount": 111,
"pageInfo": {
"endCursor": "",
"hasNextPage": false
},
"edges": [
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo1"
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo2"
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo3"
}
}
]
}
}
}
}
}
"""
admin_team_repo_list_single_page = """
{
"data": {
"organization": {
"id": "erhtjeKJHF4WjkhjkkEEbnncxerhtje=",
"name": "my-org",
"team": {
"name": "my-admin-team",
"repositories": {
"totalCount": 111,
"pageInfo": {
"endCursor": "",
"hasNextPage": false
},
"edges": [
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo1"
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo2"
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo3"
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo4"
}
}
]
}
}
}
}
}
"""
team_repo_list_multi_page_1 = """
{
"data": {
"organization": {
"id": "erhtjeKJHF4WjkhjkkEEbnncxerhtje=",
"name": "my-org",
"team": {
"name": "my-team",
"repositories": {
"totalCount": 111,
"pageInfo": {
"endCursor": "fd3kle2jkKLfdsklswHTjk==",
"hasNextPage": true
},
"edges": [
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo1"
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo2"
}
},
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo3"
}
}
]
}
}
}
}
}
"""
team_repo_list_multi_page_2 = """
{
"data": {
"organization": {
"id": "erhtjeKJHF4WjkhjkkEEbnncxerhtje=",
"name": "my-org",
"team": {
"name": "my-team",
"repositories": {
"totalCount": 111,
"pageInfo": {
"endCursor": "",
"hasNextPage": false
},
"edges": [
{
"permission": "WRITE",
"node": {
"isArchived": false,
"name": "repo4"
}
}
]
}
}
}
}
}
"""
| 26.326007 | 73 | 0.287324 | 575 | 14,374 | 7.111304 | 0.146087 | 0.061629 | 0.078992 | 0.120567 | 0.886525 | 0.883835 | 0.883835 | 0.844461 | 0.820983 | 0.820983 | 0 | 0.028017 | 0.587797 | 14,374 | 545 | 74 | 26.374312 | 0.66211 | 0 | 0 | 0.535912 | 0 | 0 | 0.97732 | 0.094128 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b98dc3350ffb5be77ce05e0e9526bb1a6435db9e | 2,531 | py | Python | tests/rules/test_pacman_not_found.py | juzim/thefuck | a3b2e6872b9e75b8a259375b9440246fdd181565 | [
"MIT"
] | 5 | 2016-04-12T21:59:07.000Z | 2016-06-21T17:56:19.000Z | tests/rules/test_pacman_not_found.py | juzim/thefuck | a3b2e6872b9e75b8a259375b9440246fdd181565 | [
"MIT"
] | 4 | 2020-12-23T15:44:08.000Z | 2020-12-23T16:48:59.000Z | tests/rules/test_pacman_not_found.py | juzim/thefuck | a3b2e6872b9e75b8a259375b9440246fdd181565 | [
"MIT"
] | 2 | 2018-04-05T00:01:28.000Z | 2018-04-17T21:39:53.000Z | import pytest
from mock import patch
from thefuck.rules import pacman_not_found
from thefuck.rules.pacman_not_found import match, get_new_command
from tests.utils import Command
PKGFILE_OUTPUT_LLC = '''extra/llvm 3.6.0-5 /usr/bin/llc
extra/llvm35 3.5.2-13 /usr/bin/llc'''
@pytest.mark.skipif(not getattr(pacman_not_found, 'enabled_by_default', True),
reason='Skip if pacman is not available')
@pytest.mark.parametrize('command', [
Command(script='yaourt -S llc', stderr='error: target not found: llc'),
Command(script='pacman llc', stderr='error: target not found: llc'),
Command(script='sudo pacman llc', stderr='error: target not found: llc')])
def test_match(command):
assert match(command)
@pytest.mark.parametrize('command', [
Command(script='yaourt -S llc', stderr='error: target not found: llc'),
Command(script='pacman llc', stderr='error: target not found: llc'),
Command(script='sudo pacman llc', stderr='error: target not found: llc')])
@patch('thefuck.specific.archlinux.subprocess')
def test_match_mocked(subp_mock, command):
subp_mock.check_output.return_value = PKGFILE_OUTPUT_LLC
assert match(command)
@pytest.mark.skipif(not getattr(pacman_not_found, 'enabled_by_default', True),
reason='Skip if pacman is not available')
@pytest.mark.parametrize('command, fixed', [
(Command(script='yaourt -S llc', stderr='error: target not found: llc'), ['yaourt -S extra/llvm', 'yaourt -S extra/llvm35']),
(Command(script='pacman -S llc', stderr='error: target not found: llc'), ['pacman -S extra/llvm', 'pacman -S extra/llvm35']),
(Command(script='sudo pacman -S llc', stderr='error: target not found: llc'), ['sudo pacman -S extra/llvm', 'sudo pacman -S extra/llvm35'])])
def test_get_new_command(command, fixed):
assert get_new_command(command) == fixed
@pytest.mark.parametrize('command, fixed', [
(Command(script='yaourt -S llc', stderr='error: target not found: llc'), ['yaourt -S extra/llvm', 'yaourt -S extra/llvm35']),
(Command(script='pacman -S llc', stderr='error: target not found: llc'), ['pacman -S extra/llvm', 'pacman -S extra/llvm35']),
(Command(script='sudo pacman -S llc', stderr='error: target not found: llc'), ['sudo pacman -S extra/llvm', 'sudo pacman -S extra/llvm35'])])
@patch('thefuck.specific.archlinux.subprocess')
def test_get_new_command_mocked(subp_mock, command, fixed):
subp_mock.check_output.return_value = PKGFILE_OUTPUT_LLC
assert get_new_command(command) == fixed
| 51.653061 | 145 | 0.711576 | 364 | 2,531 | 4.832418 | 0.17033 | 0.072769 | 0.095509 | 0.136441 | 0.839682 | 0.797044 | 0.765208 | 0.712905 | 0.712905 | 0.712905 | 0 | 0.010546 | 0.138285 | 2,531 | 48 | 146 | 52.729167 | 0.795965 | 0 | 0 | 0.717949 | 0 | 0 | 0.417226 | 0.029237 | 0 | 0 | 0 | 0 | 0.102564 | 1 | 0.102564 | false | 0 | 0.128205 | 0 | 0.230769 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b9d7795f067a94a26597d395e472e5edbdecd13a | 92 | py | Python | model/__init__.py | galsuchetzky/PytorchTemplate | 4d80bfadf2012275e84ce757730e62155e4a78c5 | [
"MIT"
] | 1 | 2021-01-16T09:29:07.000Z | 2021-01-16T09:29:07.000Z | model/__init__.py | galsuchetzky/PytorchTemplate | 4d80bfadf2012275e84ce757730e62155e4a78c5 | [
"MIT"
] | null | null | null | model/__init__.py | galsuchetzky/PytorchTemplate | 4d80bfadf2012275e84ce757730e62155e4a78c5 | [
"MIT"
] | null | null | null | # from .MNIST_model import *
# from .BREAK_model import *
# from .BREAK_model_base import *
| 23 | 33 | 0.73913 | 13 | 92 | 4.923077 | 0.461538 | 0.34375 | 0.46875 | 0.625 | 0.703125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163043 | 92 | 3 | 34 | 30.666667 | 0.831169 | 0.923913 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
dbe3e835ccc437187b21e87dcc1ab37bb435a520 | 104 | py | Python | snippets/20160803_3.py | tchapeaux/cpython-playgrounds-snippet | 28e307529d67e42711bbf2681879f4a09e5c7350 | [
"WTFPL"
] | null | null | null | snippets/20160803_3.py | tchapeaux/cpython-playgrounds-snippet | 28e307529d67e42711bbf2681879f4a09e5c7350 | [
"WTFPL"
] | null | null | null | snippets/20160803_3.py | tchapeaux/cpython-playgrounds-snippet | 28e307529d67e42711bbf2681879f4a09e5c7350 | [
"WTFPL"
] | null | null | null | a = ['a', 'b', 'c', 'd', 'e', 'f', 'g']
print a
print a[3:]
print a[:3]
print a[1:-2]
print a[:len(a)]
| 13 | 39 | 0.442308 | 24 | 104 | 1.916667 | 0.5 | 0.652174 | 0.304348 | 0.521739 | 0.434783 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047619 | 0.192308 | 104 | 7 | 40 | 14.857143 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0.067308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.833333 | 1 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
e0536332346f44ee2fadd6b3e61f0649d3e807f3 | 199 | py | Python | overwrite_accounting/models/__init__.py | xpheragroup/FJAgosto21 | aad1ba8104a6ea1dfcc39fc250897465872ea930 | [
"MIT"
] | 1 | 2022-01-21T16:23:20.000Z | 2022-01-21T16:23:20.000Z | overwrite_accounting/models/__init__.py | xpheragroup/FJAgosto21 | aad1ba8104a6ea1dfcc39fc250897465872ea930 | [
"MIT"
] | null | null | null | overwrite_accounting/models/__init__.py | xpheragroup/FJAgosto21 | aad1ba8104a6ea1dfcc39fc250897465872ea930 | [
"MIT"
] | 1 | 2022-03-02T21:41:46.000Z | 2022-03-02T21:41:46.000Z | # -*- coding: utf-8 -*-
from . import account
from . import account_general_ledger
from . import account_move
from . import account_payment
from . import account_report
from . import button_confirm
| 22.111111 | 36 | 0.773869 | 27 | 199 | 5.481481 | 0.481481 | 0.405405 | 0.574324 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005917 | 0.150754 | 199 | 8 | 37 | 24.875 | 0.869822 | 0.105528 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
e07451ac7b4cfc5db2a67b0b8a32b4059a2507d1 | 157 | py | Python | src/extended_webdrivers/android.py | dillonm197/extended-webdrivers | 9cb4cdb75f37c66ee1ac7fa13b947ae3bcb17863 | [
"MIT"
] | null | null | null | src/extended_webdrivers/android.py | dillonm197/extended-webdrivers | 9cb4cdb75f37c66ee1ac7fa13b947ae3bcb17863 | [
"MIT"
] | null | null | null | src/extended_webdrivers/android.py | dillonm197/extended-webdrivers | 9cb4cdb75f37c66ee1ac7fa13b947ae3bcb17863 | [
"MIT"
] | 1 | 2019-08-07T01:48:36.000Z | 2019-08-07T01:48:36.000Z | from selenium.webdriver import Android as _Android
from .extended_webdriver import ExtendedWebdriver
class Android(ExtendedWebdriver, _Android):
pass
| 19.625 | 50 | 0.828025 | 17 | 157 | 7.470588 | 0.588235 | 0.23622 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133758 | 157 | 7 | 51 | 22.428571 | 0.933824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.25 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
e0777b9f89da533227caa678aec442220d885b8e | 176 | py | Python | temboo/core/Library/Twitter/Help/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | 7 | 2016-03-07T02:07:21.000Z | 2022-01-21T02:22:41.000Z | temboo/core/Library/Twitter/Help/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | null | null | null | temboo/core/Library/Twitter/Help/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | 8 | 2016-06-14T06:01:11.000Z | 2020-04-22T09:21:44.000Z | from temboo.Library.Twitter.Help.GetRateLimitStatus import GetRateLimitStatus, GetRateLimitStatusInputSet, GetRateLimitStatusResultSet, GetRateLimitStatusChoreographyExecution
| 88 | 175 | 0.920455 | 11 | 176 | 14.727273 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.039773 | 176 | 1 | 176 | 176 | 0.95858 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
0eca0d6d829abdd32ab78caba497fa399f3fa42c | 10,140 | py | Python | photongui/data.py | Mohamed501258/photongui | bb7543d3efbdb42646992f875f5febc5b75c2225 | [
"BSD-3-Clause"
] | 7 | 2021-08-20T00:52:46.000Z | 2021-11-24T11:34:27.000Z | photongui/data.py | Mohamed501258/photongui | bb7543d3efbdb42646992f875f5febc5b75c2225 | [
"BSD-3-Clause"
] | null | null | null | photongui/data.py | Mohamed501258/photongui | bb7543d3efbdb42646992f875f5febc5b75c2225 | [
"BSD-3-Clause"
] | 1 | 2021-09-19T14:30:16.000Z | 2021-09-19T14:30:16.000Z | defaultIcon = "iVBORw0KGgoAAAANSUhEUgAAAgAAAAIACAYAAAD0eNT6AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAOxAAADsQBlSsOGwAAABl0RVh0U29mdHdhcmUAd3d3Lmlua3NjYXBlLm9yZ5vuPBoAAB0kSURBVHic7d17kJ1nfdjx33POWe1KtqybZXwFmztDm2RCMBA3XJIpJe5Qx8ZyjGMoNGkYrsZ2HDJp0uqPTiY1EAcn04RhUhpiQiqwAmlKEs8A4tImDU24JICNHezgG7allfaivZ73ffqHbSLbuqyk3fPsnufzmdGMLO2e3zOz1p7vPuc87xsBAAAAAAAAAAAAAAAAAAAAAAAAAAAArKC00gN27NixrtPpPCu63Re0OT+3E7E1ct6UU9oYEb2Vng8AR7PY778wN81Ym3O/zXkh9/tTi/3+1OLi4lTk3K7w+H6b0lQnYiLnPJ4j7sidzu1P27LlHz7+8Y8vrOTgFQmAHVdd9UMp55+MiB+PlH40ct6wEnMAYCUtLizEwmO/FhcXBzY3R8ykiP8dEZ9NOf/Z5z73ua8t94xlC4DXv/71pzcR/zYe/fXPl+txAWA1aJom5mZnY3Z2Ntp2pTcGnihHfL2T0u/npvnInj179i7HY550AFz6hjecMdI0b4ucr42I05ZhTQCwqs3NzcXB6elommbQow/mlH6vm/ONn/3sZ+8/mQc64QB405veNDazsPBLkfN7ImLsZBYBAGtNzjnmZmdj+uDByAPeEYiI2ZTSr+emuXHPnj1zJ/IAJxQAV1555U+0KX0wIp51Ip8PAMOibduYmpyM+fn5EuPvzCm95fOf+cznjvcTjysAduzY0U0jI78aOf9KRHSPdxgADKu5ubmYmpyMnPOgR+eIuHH7tm3/4eMf//iSX5NYcgBcddVVWxZz3p0iXnkiqwOAYbe4uBgTExPRDv69AZEjPpva9nV79uw5sJSPX1IA7Nix48zU6/15RPzgSa0OAIZc0zRxYP/+Em8QjIj4RrTta/bs2XPfsT7wmAGwY8eOp6de7/MRcf5yrAwAhl1u2xjfvz+afr/A8Hx3OzLyii/cdtu9R/uwztH+8tI3vnHbYz/5n7+cawOAYZY6ndiyZUt0ugXeLpfSBZ1+/zOvfvWrzzjahx0xAN70pjeNjSwufjoiXrDsiwOAIdfpdGLzli2ROkf9WXulPGehaT71yle+8ojH9I+4qtm5uQ9ExIUrsiwAqECv243TTit0jbycX5q73fcf6a8Puzdxxetff0Wk9OsrtyoAqEOv14uc80DvJfC4FPHiC84//8577rnn7w7zd0906RvecMZIv397RGwZyOoAoAL79u4tdTJg/7pe7/m33Xbbw4f+4VNeAhjp998bnvwBYFltLPVSQMSWxX7/Kbv6T9gBuOJnfuaiaNsvPvnPAYCTNzExEfNzJ3Tp/pOVc0oXff4zn/nLx//giTsAbfufwpM/AKyIU045pdTolNr2Px76B98PgB1XXXVhRPzLgS8JACrR6/Vi3ehomeEpveYVP/ETL378PzuH/Oa6MisCgHps2LCh2OxOztc+/vsUEbFjx45Nqdd7MCLWF1sVAFSi4ImAuWjbs/bs2XOgExHR6fWuDE/+ADAQo2NHvEDfShuLbvd1EY+9BJAjLi21EgCozVi5AIic82UREWnHjh3rUq83HhHF3poIADXJEbHvkUeibdsS46enJia2drrd7kvCkz8ADEyKiJF160qNP3XTpk0v7jQpvfjYHwsALKeRXq/Y7Dbiwk4n5+cVWwEAVKpbMAAi4nmdnNLzS64AAGpUPAAi4pySKwCAGnU7T7kf3yCd14mIYrcnAoBapVT01jsbOxFxaskVAECtUqFdgByxsRelrgCYUmx6+X8uMhoAIiJydyzGH/pOLC4O/ha9bdtGjP9WRHt
w4LNTxIZy70DIEb1NFxQbDwAREe3UbLQLgw+AiIjodMvMjUPuBggA1EMAAECFBAAAVEgAAECFBAAAVEgAAECFBAAAVEgAAECFBAAAVKhcAJS9CQIAVM0OAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIWKBUByKWAAKMYOAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIWKBUDOpSYDAHYAAKBCAgAAKiQAAKBCAgAAKiQAAKBCAgAAKiQAAKBCAgAAKiQAAKBCAgAAKlQuAFIqNhoAamcHAAAqJAAAoEICAAAqJAAAoEICAAAqJAAAoEICAAAqJAAAoEICAAAqVCwAkisBAkAxdgAAoEICAAAqJAAAoEICAAAqJAAAoEICAAAqJAAAoEICAAAqJAAAoEICAAAqVCwAci41GQCwAwAAFRIAAFAhAQAAFRIAAFAhAQAAFRIAAFAhAQAAFRIAAFAhAQAAFRIAAFChYgGQUio1GgCqZwcAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQuUCwKWAAaAYOwAAUCEBAAAVEgAAUCEBAAAVEgAAUCEBAAAVEgAAUCEBAAAVEgAAUKFyAZCLTQaA6tkBAIAKCQAAqJAAAIAKCQAAqJAAAIAKCQAAqJAAAIAKCQAAqJAAAIAKCQAAqFC5AEip2GgAqJ0dAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAADqlksvoAwBAAAVEgAAUCEBAAAVEgAAUCEBAAAVEgAAUKGCAZDKjQaAytkBAIAKCQAAqJAAAIAKCQAAqJAAAIAKCQAAqJAAAIAKCQAAqJAAAIAKlQuA5EqAAFCKHQAAqJAAAIAKCQAAqJAAAIAKCQAAqJAAAIAKCQAAqJAAAIAKCQAAqJAAAIAKFQuAlEtNBgDsAABAhQQAAFRIAABAhQQAAFRIAABAhQQAAFRIAABAhQQAAFRIAABAhQQAAFSoWADklEqNBoDq2QEAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACoULkAcClgACjGDgAAVEgAAECFBAAAVEgAAECFBAAAVEgAAECFBAAAVEgAAECFBAAAVKhYAKRcajIAYAcAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQsUCIKdUajQAVM8OAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIWKBUByKWAAKMYOAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIUEAABUSAAAQIXKBUAuNhkAqmcHAAAqJAAAoEICAAAqJAAAoEICAAAqJAAAoEICAAAqJAAAoEICAAAqVCwAckqlRgNA9ewAAECFBAAAVEgAAECFBAAAVEgAAECFBAAAVEgAAECFBAAAVEgAAECFBAAAVKhYACSXAgaAYuwAAECFBAAAVEgAAECFBAAAVEgAAECFBAAAVEgAAECFBAAAVEgAAECFeqUG58XZmPnmx0qNB4DI3dHoTo1HWpwd+Oy2bSPahYHPfVy5AGjmY/au/1lqPABERET3sV8lpGYhcqHZXgIAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgAoJAACokAAAgEI6nXJPw70fu+ilt4+NjT4rpRgptgoAqNBDDz4UTdMMfG7btAfTl//vF/LAJwMAcf8DD0bTtEVmewkAACokAACgQgIAACo
kAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACokAACgQgIAACrUK70AYDC63V5s3HharN9waoytXx+dzqP937ZtzM3OxszMdExPTUTTNIVXCgyCAIAhd+rG0+LMM8+NTZu3RkrpqB+bc46JA+PxvQfvi+npyQGtEChBAMCQWr9+Q5xz7vmxecu2JX9OSik2b9kWm7dsiwP798V9994Tc3MzK7hKoBQBAENo67btcf4Fz/3+Nv+J2LxlW2zavCXuufuu2Lf3oWVcHbAaCAAYMueed0Gceda5y/JYKXXigmc+N9av3xD33Xv3sjwmsDo4BQBD5Oxznr5sT/6HOvOsc+Oss5++7I8LlCMAYEhs2XJ6nH3OM1bs8c859xmxdev2FXt8YLAEAAyB3shInP/M5674nGdc8Ozo9UZWfA6w8gQADIGzz356dLvdFZ/T7fbi7HO8FADDQADAGjc6OhbbzzhrYPO2n3FWjI6ODWwesDIEAKxx288485gX+FlOKaU4ffvTBjYPWBkCANa4zVtOH/jMLVsHPxNYXgIA1rB169bF2Nj6gc8dG9sQIyPrBj4XWD4CANawsfWnFJu9fv2GYrOBkycAYA1bt2603GxvBIQ1TQDAGjaIo3+rcTZw8gQArGE552Kz27YtNhs4eQIA1rB+f7HY7KbfLzYbOHkCANawubnZgrNnis0GTp4AgDVsduZgtG0z8LlN08TsrACAtUwAwBqWc47JyYmBz52aPFD0/QfAyRMAsMbt2/tQgZkPD3wmsLwEAKxxB/bvi9nZgwObNztzMPbv3zuwecDKEACwxuWc47577xnYvPvuvXtgs4CVIwBgCEwcGI8DB8ZXfM6B/ftiYmL/is8BVp4AgCFx9z/cvqIvBczNzcTd3/n2ij0+MFgCAIZE0zRx153fisXFhWV/7MXFhbjzjm9E07j4DwwLAQBDZH5uNr71ja/EwYPTy/aYMzMH41vf/GrMz88t22MC5QkAGDILCwtxx7e+Fnsf+d5JP9Yjj3wvbv/mV2Nhfn4ZVgasJr3SCwCWX9u2cc/dd8ZD37s/zj7nGbFl6+nH9fmTkwfivu9+J2ZmBne8EBgsAQBDbHZ2Jv7hrm/F+vUbYuu27bHxtM2xYcOp0ek8cfOvbduYmZmOyYkDsX/8EZf5hQr0IqIfQgCG2uzsTNx/3z9GxD9GRMS6daPR7XYj4tE3Dy4s2OKHEgpeULvfi4jpiNhcbg3AoHnCh9Uht8USYLITjwYAADBAOUfJm2pNdSKyi3oDwIC1efC38j7Evk5EurPkCgCgRouLBS+slfK3OznH7eVWAAB16pcMgDbd0YlO/vtyKwCAOpXcAUiRvtEZXeztiaInEQCgPnPlLq+du+vaL3R+8KKLHo6Ib5ZaBQDUpmmbkjsAf3fJJVc91ImIyDn9ealVAEBt5mYLXovjsef8R68H2uncUm4lAFCXgwfLXW47p/ajEY8FwIUXXvTViPh6sdUAQCWatom5QnfYzJG+evnlV3094pDbAacUv1tkNQBQkenpcnfZ7ET+4D/9/jGbtkz/t0jxQJklAcDwa3OOqXIB8ODE1Nx/f/w/vh8Az3nOxfMp8m8UWRIAVGB6ajrapi0zPOUb3/zmN3//7OETbgq+acvB346IOwa+KAAYcm3bxORUsfvv3TU5Of+El/qfEADPec7F8ynndw12TQAw/Pbvn4y2LfPTfyendx3603/EkwIgIuJHXvqK2yLiYwNbFQAMubm5+Tg4U+joX8ofvXTHlX/25D9+SgBERGyYaX4+vBQAACetbZvYN76/1Pi7Zmfbtx3uLw4bAC981aumUxtXRUSxCxUDwDDYu3d/NE1TYvRsp213XH311ZOH+8vDBkBExI+87OV/G23+6YgoeL9CAFi
7xvcfKHXRnybldPWlV/zMV4/0AUcMgIiIF7/sFX+SUxx26wAAOLKJicliF/3Jkd592Y4rdx/tY44aABERF1748g/lnP5d2AkAgCWZnJyKicmpEqObFPHWyy+/8reP9YFpqY/45b/6wiWR4o8iYuyklgYAQ2x8/4FSP/nP58hvvPzyq3Yt5YOXHAAREX/zV198UZvy/4iIZ53Q0gBgSDVNG+Pj4zE7V+Q1/+9GSj/9utdd+VdL/YTjCoCIiK985XOb+/O934uULzvezwWAYTQ3Nx/7xsejKXKZ33Rrpzv7c5de+uYDx/VZJzrur//6i6/t5Hxzjjj/RB8DANaytm3jwIHJmD5YZMv//pTTL1+248qPnMgnn3AARER87Wt/ccr87PrrU4prImLryTwWAKwVOeeYmp6OycnpEpf3HY8cv7lu7NT3v/a1rz3hywueVAA87ktf+tLG0ZH2rSnirXYEABhWTdvE9PRMTE0VeeK/J1L+nV7vlN+55JJLTvqIwbIEwONyzunLX/7iyyPH1SnFxZHj7OV8fAAYtKZtYm52PmZmZgb/Br8cD+QU/yulzi2XXXbFF1NKebkeelkD4Mn+5i8//4LcTa/MOX4gIj0vRfvcHOmMiBhZybkAcCKapo1+vx/9xcVYWFyM+fmFWFhcHMToxYh4OHLckVN8OyJ9PaL93OWXX3X7Sg1c0QA4kjvv/PTo+Pi6Uzud0U0l5sPJePDBh7fdf/8DfxmRu6XXQkREas455+yXnXXWGftKr4S1ad++AxtnZg4u5NwM/PxeSv2JsbHt0xdffPHgZw96IKx1N930vjdE5BN61y0rI6V4w7vffcMtpdcBa8kxLwUMPFn7k6VXwFP4msBxEgBwHHbt2tWNSK8uvQ6eKOd4zaNfG2CpBAAch/vv/8cLI2Jb6XXwFFvvvffeF5deBKwlAgCOj63mVSqlxtcGjoMAgOPjSWaVSin52sBxcAoAlujmm39te9OMfC+E82rV9vtx1g033PBw6YXAWuAbGSxR06x7Tfg3s5p1RkbCGzRhiXwzgyVz/G8N8DWCJRIAsASO/60NjgPC0gkAWALH/9YMxwFhiQQALI2t5TXCcUBYGgEAS+NJZY1wHBCWxjFAOAbH/9YcxwFhCXxDg2Nw/G/NcRwQlsA3NTgmx//WIF8zOAYBAEfh+N/a5DggHJsAgKN47EiZ439rj+OAcAwCAI7CkbK1y9cOjk4AwFGklC4uvQZOjOOAcHSOAcIROP635jkOCEfhGxscgeN/a57jgHAUvrnBETn+NwR8DeEIBAAchuN/w8FxQDgyAQCH4fjf0HAcEI5AAMBhOEI2PHwt4fAEAByGI2TDw9cSDs8xQHgSx/+GjuOAcBi+wcGTOP43dBwHhMPwTQ6eJKW4rPQaWHZeBoAn8RIAHGLXrl3dbqd5KJwAWHY557j/gQejbdsS48fPOecZZ1xxxRVNieGwGtkBgEN0u63jfyskpRRjo6OlxjsOCE8iAOAQKbv630oaWz9WbLbjgPBEAgAOkcPd/1bS+rGSAeA4IBxKAMBjdu3atT0ifrj0OoZZt9uNkZGRUuNf9N73vveMUsNhtREA8Jhut3H8bwAK7gI4DgiH8M0OHpOyo2KDMFbwZYBwHBC+TwBAPHr8L4efDgdhbGw0Op0y33rcHRD+iQCAiOhF/8Jw/G9gRkfXlRrtOCA8RgBAROSud4gP0vqx9cVmOw4IjxIAEBHh9f+BWl/0egBiDyIEAMSuXbvODMf/Bqr0ccAbb7zxzFLDYbUQAFSv02kvD/8WBm7DhmIvA3RGRuJ1pYbDauGbHtVLKf906TXUaMP6DQWnpx0Fh8OqIACo2q23fvQZkeNHS6+jRiMjvVi3rthpgB97//vff16p4bAaCADq1nbfGf4dFLPx1FNLje50Ou3bSw2H1SCVXgCU8qlPfWpjf3Hm3ojYVHotNbv/gQejaZoSow+sW7fhvLe//e3TJYZDaX7yoVr9/sGfD0/+xW3
cWGwXYPPCwszPlRoOpdkBoEp/+tGPbpkf7dwZrv5XXM45HnjwoWiafonx4zl3n3PdddeNlxgOJdkBoEoLo+lXwpP/qpBSis2bTis1fmtKzS+WGg4l2QGgOrt3/9ELcpu/EhGjpdfCP3no4Ydjfn5h4HNTirmm6fzg9ddf/+2BD4eC7ABQlZ07d3Zymz8UnvxXnW1bt0RKg/+ZJOcY63ab33eXQGojAKjKD7zweTdExEWl18FT9XojsanQSwE5p5fed993311kOBTiJQCq8YlPfOwlKeLz4af/Ve2RR/bG7NxcidHzKaUfe/e7f+HLJYbDoNkBoAq7d3/kjBTxifDkv+ptO31r9HpFduNHc8633nzzr20vMRwGTQAw9Hbt2rUu55FdEXFu6bVwbJ3UidO3nV7k/QARcV6/3/vYBz/4wWK3KoRB8aYXhtrOnTs7Zz5t6y0R8a9Lr4Wl63a7MTY6GjMzswOfnVJ65uLi3LNf9rKL/njPnj154AuAAbEDwFD7gX/23N+ICHf7W4NGR0dj29athaan12/efMpvFRoOAyEAGEo557T7E3/4mznSNaXXwonbsGF9bNtW7Hjg22666b3vzzl7szRDyf/YDJ1du3Z1u53mgxHxs6XXwvKYnZ2Lvfv2Rc5FduQ/MjFx8Gd37txZ5FrFsFIEAENl165dp3Y77R9E5J8qvRaW1/z8fDyyd1+0bTvw2TnHraeccvCNb3nLzpmBD4cVIgAYGp/85MfOa5r0x5Hzi0qvhZXR7/fjkb37YnFxscT4r/d68VPvfOcNd5cYDstNADAUdu/+w9fknP4gcpxeei2srLZtY3z//iInBCLikZzz1ddd94u3lRgOy0kAsKZ9+MMfHjvttLGdkeOG8KbWqszMzsT4vgPR5oG/JJAj8of6/XTdDTfccHDQw2G5CADWrN27/vAVuZN+NyKeX3otlNHvNzG+f3/MFbh0cM7xzW43veWaa37hSwMfDstAALDm3HrrH5wVufdfIuLq8P8w8egpgfH9B6JpBv9G/ZzTn+ac3nb99dffO/DhcBJ882TN2L3797flvO66yPmdEWlj6fWwuuScY3r6YExOTUXTNIMeP5Vz/kBE76brrrtufNDD4UQIAFa9T33qY2f3F+IdkfI7PPFzLI+HwNT0dPT7A98RmMw535zSyH+99tprHxz0cDgeAoBVaefOnZ0XvvB5r+qk9NaIfElE9EqvibVndnYupg9Ox+zswN8jsBiRPhmRf3di4uCenTt3Dv7iBXAMAoBV49Of/vTo3NyBl7dtuixF/FREnFl6TQyHpm1jbnY2ZmZnY25uIfJgTw48GBGfjIjd3e7oF9/1rnfND3I4HIkAoIhbbrnltPXre8/OuX12SumHI8e/iIgfiYjR0mtj+C0sLMT8/HzMLyxEv99Ef7E/kOOEKcVczvH/Ukpfatv8lZTyXQsL+c73vOc9Uys+HJ5EABRw0003XpRSXNq2neellJ8WFZxf73S627rdztNyzt1OSt1wK2pWme/fZSDnyPHoBYcGeNnhJufcRKTxlOJ7ETHwdzEW0OacHup02jtS6u6+5prr/0/pBdVGAAzQzTff+Ky27Xwo5/yq0msBWGU+G9H799dee+13Si+kFgJgQD7wgfe9qG3zX0TEttJrAVil9kbkf3Xttb/4t6UXUgMBMADve9/7Tu9281cj4pzSawFY5e4bGVn8oXe845f3lV7IsBv6155Xg04n/2p48gdYinMXF3u/WnoRNbADsMJuvvnm0aaZfyQiXMAGYGmmJyYObt+5c+fgb/JQETsAK6xp5l8WnvwBjsepmzefemHpRQw7AbDi0nmlVwCw9uSnl17BsBMAK+zRs70AHI+2jcHf2rEyAmCFdbude0qvAWCtydn3zpUmAFbY/v1Tfx0RjrMALN3e884778ulFzHsBMAK27lzZz8ifqv0OgDWkJuvuOIKL5+uMAEwAP1+vC/n+GbpdQCsAX/f78dvlF5EDVwHYEB
uuummZ0b0b4uIZ5VeC8AqdVevF69+5ztvuLv0QmpgB2BArr322u+MjCy+JKX4cNRxpy+ApWpyjt/LufsST/6DYweggA984NefnnP33+Scnh8RZ5ReD0AhD6eUb0+p+ZNrrvml75ZeDAAAAAAAAAAAAAAAAAAAAAAAAAAAAMAy+v+gWFNBngjy6AAAAABJRU5ErkJggg==" | 10,140 | 10,140 | 0.963807 | 335 | 10,140 | 29.173134 | 0.98806 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136713 | 0.000197 | 10,140 | 1 | 10,140 | 10,140 | 0.827283 | 0 | 0 | 0 | 0 | 1 | 0.998324 | 0.998324 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
1632c14f5272b0d0beb9366e110e6ece1091b186 | 11,041 | py | Python | users/migrations/0007_auto_20190430_0126.py | Isaacli0520/msnmatch | 228c6d546e16bd54dc8c7e0803f0f8c408cb0219 | [
"MIT"
] | null | null | null | users/migrations/0007_auto_20190430_0126.py | Isaacli0520/msnmatch | 228c6d546e16bd54dc8c7e0803f0f8c408cb0219 | [
"MIT"
] | 18 | 2020-03-11T18:57:27.000Z | 2022-02-26T11:14:38.000Z | users/migrations/0007_auto_20190430_0126.py | Isaacli0520/msnmatch | 228c6d546e16bd54dc8c7e0803f0f8c408cb0219 | [
"MIT"
] | null | null | null | # Generated by Django 2.1.5 on 2019-04-30 05:26
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('users', '0006_profile_role'),
]
operations = [
migrations.AlterField(
model_name='profile',
name='major',
field=models.CharField(choices=[('Accelerated B.A./Master of Public Policy (MPP) Program', 'Accelerated B.A./Master of Public Policy (MPP) Program'), ('Accounting', 'Accounting'), ('Aerospace Engineering', 'Aerospace Engineering'), ('African American & African Studies', 'African American & African Studies'), ('American Studies', 'American Studies'), ('Anthropology', 'Anthropology'), ('Architectural History', 'Architectural History'), ('Architecture', 'Architecture'), ('Art History', 'Art History'), ('Art, Studio', 'Art, Studio'), ('Astronomy', 'Astronomy'), ('Astronomy-Physics', 'Astronomy-Physics'), ('B.A. in Public Policy and Leadership', 'B.A. in Public Policy and Leadership'), ('B.S./M.S. in Teaching', 'B.S./M.S. in Teaching'), ('Bachelor of Science in Nursing (BSN)', 'Bachelor of Science in Nursing (BSN)'), ('Biology', 'Biology'), ('Biomedical Engineering', 'Biomedical Engineering'), ('Chemical Engineering', 'Chemical Engineering'), ('Chemistry', 'Chemistry'), ('Chinese Language & Literature', 'Chinese Language & Literature'), ('Civil Engineering', 'Civil Engineering'), ('Classics', 'Classics'), ('Cognitive Science', 'Cognitive Science'), ('Commerce', 'Commerce'), ('Comparative Literature', 'Comparative Literature'), ('Computer Engineering', 'Computer Engineering'), ('Computer Science', 'Computer Science'), ('Drama', 'Drama'), ('East Asian Studies', 'East Asian Studies'), ('Economics', 'Economics'), ('Electrical Engineering', 'Electrical Engineering'), ('Engineering Science', 'Engineering Science'), ('English', 'English'), ('Environmental Sciences', 'Environmental Sciences'), ('Environmental Thought & Practice', 'Environmental Thought & Practice'), ('Finance', 'Finance'), ('French', 'French'), ('German', 'German'), ('Global Studies', 'Global Studies'), ('History', 'History'), ('Human Biology', 'Human Biology'), ('Information Technology', 'Information Technology'), ('Italian', 'Italian'), ('Japanese Language & Literature', 'Japanese Language & Literature'), 
('Jewish Studies', 'Jewish Studies'), ('Kinesiology', 'Kinesiology'), ('Latin American Studies', 'Latin American Studies'), ('Linguistics', 'Linguistics'), ('Management', 'Management'), ('Marketing', 'Marketing'), ('Mathematics', 'Mathematics'), ('Mechanical Engineering', 'Mechanical Engineering'), ('Media Studies', 'Media Studies'), ('Medieval Studies', 'Medieval Studies'), ('Middle Eastern and South Asian Languages and Cultures', 'Middle Eastern and South Asian Languages and Cultures'), ('Music', 'Music'), ('Neuroscience', 'Neuroscience'), ('Philosophy', 'Philosophy'), ('Physics', 'Physics'), ('Political Philosophy, Policy, and Law', 'Political Philosophy, Policy, and Law'), ('Political and Social Thought', 'Political and Social Thought'), ('Politics', 'Politics'), ('Psychology', 'Psychology'), ('RN to BSN', 'RN to BSN'), ('Religious Studies', 'Religious Studies'), ('Slavic Languages and Literatures', 'Slavic Languages and Literatures'), ('Sociology', 'Sociology'), ('South Asian Studies', 'South Asian Studies'), ('Spanish', 'Spanish'), ('Speech Communications Disorders Major', 'Speech Communications Disorders Major'), ('Statistics', 'Statistics'), ('Systems Engineering', 'Systems Engineering'), ('Teacher Education', 'Teacher Education'), ('Urban & Environmental Planning', 'Urban & Environmental Planning'), ('Women, Gender, and Sexuality', 'Women, Gender, and Sexuality'), ('Youth & Social Innovation Major', 'Youth & Social Innovation Major')], max_length=255),
),
migrations.AlterField(
model_name='profile',
name='major_two',
field=models.CharField(blank=True, choices=[('Accelerated B.A./Master of Public Policy (MPP) Program', 'Accelerated B.A./Master of Public Policy (MPP) Program'), ('Accounting', 'Accounting'), ('Aerospace Engineering', 'Aerospace Engineering'), ('African American & African Studies', 'African American & African Studies'), ('American Studies', 'American Studies'), ('Anthropology', 'Anthropology'), ('Architectural History', 'Architectural History'), ('Architecture', 'Architecture'), ('Art History', 'Art History'), ('Art, Studio', 'Art, Studio'), ('Astronomy', 'Astronomy'), ('Astronomy-Physics', 'Astronomy-Physics'), ('B.A. in Public Policy and Leadership', 'B.A. in Public Policy and Leadership'), ('B.S./M.S. in Teaching', 'B.S./M.S. in Teaching'), ('Bachelor of Science in Nursing (BSN)', 'Bachelor of Science in Nursing (BSN)'), ('Biology', 'Biology'), ('Biomedical Engineering', 'Biomedical Engineering'), ('Chemical Engineering', 'Chemical Engineering'), ('Chemistry', 'Chemistry'), ('Chinese Language & Literature', 'Chinese Language & Literature'), ('Civil Engineering', 'Civil Engineering'), ('Classics', 'Classics'), ('Cognitive Science', 'Cognitive Science'), ('Commerce', 'Commerce'), ('Comparative Literature', 'Comparative Literature'), ('Computer Engineering', 'Computer Engineering'), ('Computer Science', 'Computer Science'), ('Drama', 'Drama'), ('East Asian Studies', 'East Asian Studies'), ('Economics', 'Economics'), ('Electrical Engineering', 'Electrical Engineering'), ('Engineering Science', 'Engineering Science'), ('English', 'English'), ('Environmental Sciences', 'Environmental Sciences'), ('Environmental Thought & Practice', 'Environmental Thought & Practice'), ('Finance', 'Finance'), ('French', 'French'), ('German', 'German'), ('Global Studies', 'Global Studies'), ('History', 'History'), ('Human Biology', 'Human Biology'), ('Information Technology', 'Information Technology'), ('Italian', 'Italian'), ('Japanese Language & Literature', 'Japanese Language & Literature'), ('Jewish Studies', 'Jewish Studies'), ('Kinesiology', 'Kinesiology'), ('Latin American Studies', 'Latin American Studies'), ('Linguistics', 'Linguistics'), ('Management', 'Management'), ('Marketing', 'Marketing'), ('Mathematics', 'Mathematics'), ('Mechanical Engineering', 'Mechanical Engineering'), ('Media Studies', 'Media Studies'), ('Medieval Studies', 'Medieval Studies'), ('Middle Eastern and South Asian Languages and Cultures', 'Middle Eastern and South Asian Languages and Cultures'), ('Music', 'Music'), ('Neuroscience', 'Neuroscience'), ('Philosophy', 'Philosophy'), ('Physics', 'Physics'), ('Political Philosophy, Policy, and Law', 'Political Philosophy, Policy, and Law'), ('Political and Social Thought', 'Political and Social Thought'), ('Politics', 'Politics'), ('Psychology', 'Psychology'), ('RN to BSN', 'RN to BSN'), ('Religious Studies', 'Religious Studies'), ('Slavic Languages and Literatures', 'Slavic Languages and Literatures'), ('Sociology', 'Sociology'), ('South Asian Studies', 'South Asian Studies'), ('Spanish', 'Spanish'), ('Speech Communications Disorders Major', 'Speech Communications Disorders Major'), ('Statistics', 'Statistics'), ('Systems Engineering', 'Systems Engineering'), ('Teacher Education', 'Teacher Education'), ('Urban & Environmental Planning', 'Urban & Environmental Planning'), ('Women, Gender, and Sexuality', 'Women, Gender, and Sexuality'), ('Youth & Social Innovation Major', 'Youth & Social Innovation Major')], max_length=255),
),
migrations.AlterField(
model_name='profile',
name='minor',
field=models.CharField(blank=True, choices=[('Accelerated B.A./Master of Public Policy (MPP) Program', 'Accelerated B.A./Master of Public Policy (MPP) Program'), ('Accounting', 'Accounting'), ('Aerospace Engineering', 'Aerospace Engineering'), ('African American & African Studies', 'African American & African Studies'), ('American Studies', 'American Studies'), ('Anthropology', 'Anthropology'), ('Architectural History', 'Architectural History'), ('Architecture', 'Architecture'), ('Art History', 'Art History'), ('Art, Studio', 'Art, Studio'), ('Astronomy', 'Astronomy'), ('Astronomy-Physics', 'Astronomy-Physics'), ('B.A. in Public Policy and Leadership', 'B.A. in Public Policy and Leadership'), ('B.S./M.S. in Teaching', 'B.S./M.S. in Teaching'), ('Bachelor of Science in Nursing (BSN)', 'Bachelor of Science in Nursing (BSN)'), ('Biology', 'Biology'), ('Biomedical Engineering', 'Biomedical Engineering'), ('Chemical Engineering', 'Chemical Engineering'), ('Chemistry', 'Chemistry'), ('Chinese Language & Literature', 'Chinese Language & Literature'), ('Civil Engineering', 'Civil Engineering'), ('Classics', 'Classics'), ('Cognitive Science', 'Cognitive Science'), ('Commerce', 'Commerce'), ('Comparative Literature', 'Comparative Literature'), ('Computer Engineering', 'Computer Engineering'), ('Computer Science', 'Computer Science'), ('Drama', 'Drama'), ('East Asian Studies', 'East Asian Studies'), ('Economics', 'Economics'), ('Electrical Engineering', 'Electrical Engineering'), ('Engineering Science', 'Engineering Science'), ('English', 'English'), ('Environmental Sciences', 'Environmental Sciences'), ('Environmental Thought & Practice', 'Environmental Thought & Practice'), ('Finance', 'Finance'), ('French', 'French'), ('German', 'German'), ('Global Studies', 'Global Studies'), ('History', 'History'), ('Human Biology', 'Human Biology'), ('Information Technology', 'Information Technology'), ('Italian', 'Italian'), ('Japanese Language & Literature', 'Japanese Language & Literature'), ('Jewish Studies', 'Jewish Studies'), ('Kinesiology', 'Kinesiology'), ('Latin American Studies', 'Latin American Studies'), ('Linguistics', 'Linguistics'), ('Management', 'Management'), ('Marketing', 'Marketing'), ('Mathematics', 'Mathematics'), ('Mechanical Engineering', 'Mechanical Engineering'), ('Media Studies', 'Media Studies'), ('Medieval Studies', 'Medieval Studies'), ('Middle Eastern and South Asian Languages and Cultures', 'Middle Eastern and South Asian Languages and Cultures'), ('Music', 'Music'), ('Neuroscience', 'Neuroscience'), ('Philosophy', 'Philosophy'), ('Physics', 'Physics'), ('Political Philosophy, Policy, and Law', 'Political Philosophy, Policy, and Law'), ('Political and Social Thought', 'Political and Social Thought'), ('Politics', 'Politics'), ('Psychology', 'Psychology'), ('RN to BSN', 'RN to BSN'), ('Religious Studies', 'Religious Studies'), ('Slavic Languages and Literatures', 'Slavic Languages and Literatures'), ('Sociology', 'Sociology'), ('South Asian Studies', 'South Asian Studies'), ('Spanish', 'Spanish'), ('Speech Communications Disorders Major', 'Speech Communications Disorders Major'), ('Statistics', 'Statistics'), ('Systems Engineering', 'Systems Engineering'), ('Teacher Education', 'Teacher Education'), ('Urban & Environmental Planning', 'Urban & Environmental Planning'), ('Women, Gender, and Sexuality', 'Women, Gender, and Sexuality'), ('Youth & Social Innovation Major', 'Youth & Social Innovation Major')], max_length=255),
),
]
# coding: utf-8
"""
Hydrogen Nucleus API
The Hydrogen Nucleus API # noqa: E501
OpenAPI spec version: 1.9.4
Contact: info@hydrogenplatform.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from nucleus_api.api_client import ApiClient
class AggregationAccountApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_aggregation_account_balance_bulk_using_post(self, aggregation_account_balance, **kwargs): # noqa: E501
"""Create a bulk aggregation account balance # noqa: E501
Create balance records under an aggregation account.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_balance_bulk_using_post(aggregation_account_balance, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[AggregationAccountBalance] aggregation_account_balance: aggregationAccountBalance (required)
:return: list[AggregationAccountBalance]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_aggregation_account_balance_bulk_using_post_with_http_info(aggregation_account_balance, **kwargs) # noqa: E501
else:
(data) = self.create_aggregation_account_balance_bulk_using_post_with_http_info(aggregation_account_balance, **kwargs) # noqa: E501
return data
def create_aggregation_account_balance_bulk_using_post_with_http_info(self, aggregation_account_balance, **kwargs): # noqa: E501
"""Create a bulk aggregation account balance # noqa: E501
Create balance records under an aggregation account.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_balance_bulk_using_post_with_http_info(aggregation_account_balance, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[AggregationAccountBalance] aggregation_account_balance: aggregationAccountBalance (required)
:return: list[AggregationAccountBalance]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_account_balance'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_aggregation_account_balance_bulk_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_account_balance' is set
if self.api_client.client_side_validation and ('aggregation_account_balance' not in params or
params['aggregation_account_balance'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_balance` when calling `create_aggregation_account_balance_bulk_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_account_balance' in params:
body_params = params['aggregation_account_balance']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/bulk_aggregation_account_balance', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[AggregationAccountBalance]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
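# Usage sketch (illustrative only, not part of the generated client). It shows
# the synchronous and asynchronous calling conventions documented in the
# docstrings above; the `Configuration` import path and the `payload` variable
# are assumptions and may need adjusting for this SDK:
#
#   from nucleus_api.api_client import ApiClient
#   from nucleus_api.configuration import Configuration
#
#   api = AggregationAccountApi(ApiClient(Configuration()))
#   payload = [...]  # list of AggregationAccountBalance objects
#
#   # Synchronous call: blocks and returns the deserialized response.
#   balances = api.create_aggregation_account_balance_bulk_using_post(payload)
#
#   # Asynchronous call: returns a thread; .get() blocks for the result.
#   thread = api.create_aggregation_account_balance_bulk_using_post(
#       payload, async_req=True)
#   balances = thread.get()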
def create_aggregation_account_balance_using_post(self, aggregation_account_balance, **kwargs): # noqa: E501
"""Create an aggregation account balance # noqa: E501
Create a balance record under an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_balance_using_post(aggregation_account_balance, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AggregationAccountBalance aggregation_account_balance: aggregationAccountBalance (required)
:return: AggregationAccountBalance
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_aggregation_account_balance_using_post_with_http_info(aggregation_account_balance, **kwargs) # noqa: E501
else:
(data) = self.create_aggregation_account_balance_using_post_with_http_info(aggregation_account_balance, **kwargs) # noqa: E501
return data
def create_aggregation_account_balance_using_post_with_http_info(self, aggregation_account_balance, **kwargs): # noqa: E501
"""Create an aggregation account balance # noqa: E501
Create a balance record under an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_balance_using_post_with_http_info(aggregation_account_balance, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AggregationAccountBalance aggregation_account_balance: aggregationAccountBalance (required)
:return: AggregationAccountBalance
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_account_balance'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_aggregation_account_balance_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_account_balance' is set
if self.api_client.client_side_validation and ('aggregation_account_balance' not in params or
params['aggregation_account_balance'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_balance` when calling `create_aggregation_account_balance_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_account_balance' in params:
body_params = params['aggregation_account_balance']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/aggregation_account_balance', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AggregationAccountBalance', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_aggregation_account_bulk_using_post(self, aggregation_account_list, **kwargs): # noqa: E501
"""Create a bulk aggregation account # noqa: E501
Create a bulk aggregation account under a client. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_bulk_using_post(aggregation_account_list, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[AggregationAccount] aggregation_account_list: aggregationAccountList (required)
:return: list[AggregationAccount]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_aggregation_account_bulk_using_post_with_http_info(aggregation_account_list, **kwargs) # noqa: E501
else:
(data) = self.create_aggregation_account_bulk_using_post_with_http_info(aggregation_account_list, **kwargs) # noqa: E501
return data
def create_aggregation_account_bulk_using_post_with_http_info(self, aggregation_account_list, **kwargs): # noqa: E501
"""Create a bulk aggregation account # noqa: E501
Create a bulk aggregation account under a client. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_bulk_using_post_with_http_info(aggregation_account_list, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[AggregationAccount] aggregation_account_list: aggregationAccountList (required)
:return: list[AggregationAccount]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_account_list'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_aggregation_account_bulk_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_account_list' is set
if self.api_client.client_side_validation and ('aggregation_account_list' not in params or
params['aggregation_account_list'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_list` when calling `create_aggregation_account_bulk_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_account_list' in params:
body_params = params['aggregation_account_list']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/bulk_aggregation_account', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[AggregationAccount]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_aggregation_account_holding_bulk_using_post(self, aggregation_transaction, **kwargs): # noqa: E501
"""Create a bulk aggregation account holding # noqa: E501
Create a bulk aggregation account holding. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_holding_bulk_using_post(aggregation_transaction, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[AggregationAccountHolding] aggregation_transaction: aggregationTransaction (required)
:return: list[AggregationAccountHolding]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_aggregation_account_holding_bulk_using_post_with_http_info(aggregation_transaction, **kwargs) # noqa: E501
else:
(data) = self.create_aggregation_account_holding_bulk_using_post_with_http_info(aggregation_transaction, **kwargs) # noqa: E501
return data
def create_aggregation_account_holding_bulk_using_post_with_http_info(self, aggregation_transaction, **kwargs): # noqa: E501
"""Create a bulk aggregation account holding # noqa: E501
Create a bulk aggregation account holding. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_holding_bulk_using_post_with_http_info(aggregation_transaction, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[AggregationAccountHolding] aggregation_transaction: aggregationTransaction (required)
:return: list[AggregationAccountHolding]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_transaction'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_aggregation_account_holding_bulk_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_transaction' is set
if self.api_client.client_side_validation and ('aggregation_transaction' not in params or
params['aggregation_transaction'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_transaction` when calling `create_aggregation_account_holding_bulk_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_transaction' in params:
body_params = params['aggregation_transaction']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/bulk_aggregation_account_holding', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[AggregationAccountHolding]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_aggregation_account_holding_using_post(self, aggregation_account_holding, **kwargs): # noqa: E501
"""Create an aggregation account holding # noqa: E501
Create a holding record under an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_holding_using_post(aggregation_account_holding, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AggregationAccountHolding aggregation_account_holding: aggregationAccountHolding (required)
:return: AggregationAccountHolding
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_aggregation_account_holding_using_post_with_http_info(aggregation_account_holding, **kwargs) # noqa: E501
else:
(data) = self.create_aggregation_account_holding_using_post_with_http_info(aggregation_account_holding, **kwargs) # noqa: E501
return data
def create_aggregation_account_holding_using_post_with_http_info(self, aggregation_account_holding, **kwargs): # noqa: E501
"""Create an aggregation account holding # noqa: E501
Create a holding record under an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_holding_using_post_with_http_info(aggregation_account_holding, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AggregationAccountHolding aggregation_account_holding: aggregationAccountHolding (required)
:return: AggregationAccountHolding
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_account_holding'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_aggregation_account_holding_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_account_holding' is set
if self.api_client.client_side_validation and ('aggregation_account_holding' not in params or
params['aggregation_account_holding'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_holding` when calling `create_aggregation_account_holding_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_account_holding' in params:
body_params = params['aggregation_account_holding']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/aggregation_account_holding', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AggregationAccountHolding', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_aggregation_account_transaction_bulk_using_post(self, aggregation_account_transactions, **kwargs): # noqa: E501
"""Create a bulk aggregation account transaction # noqa: E501
Create a bulk transaction record under an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_transaction_bulk_using_post(aggregation_account_transactions, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[AggregationAccountTransaction] aggregation_account_transactions: aggregationAccountTransactions (required)
:return: list[AggregationAccountTransaction]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_aggregation_account_transaction_bulk_using_post_with_http_info(aggregation_account_transactions, **kwargs) # noqa: E501
else:
(data) = self.create_aggregation_account_transaction_bulk_using_post_with_http_info(aggregation_account_transactions, **kwargs) # noqa: E501
return data
def create_aggregation_account_transaction_bulk_using_post_with_http_info(self, aggregation_account_transactions, **kwargs): # noqa: E501
"""Create a bulk aggregation account transaction # noqa: E501
Create a bulk transaction record under an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_transaction_bulk_using_post_with_http_info(aggregation_account_transactions, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[AggregationAccountTransaction] aggregation_account_transactions: aggregationAccountTransactions (required)
:return: list[AggregationAccountTransaction]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_account_transactions'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_aggregation_account_transaction_bulk_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_account_transactions' is set
if self.api_client.client_side_validation and ('aggregation_account_transactions' not in params or
params['aggregation_account_transactions'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_transactions` when calling `create_aggregation_account_transaction_bulk_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_account_transactions' in params:
body_params = params['aggregation_account_transactions']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/bulk_aggregation_account_transaction', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[AggregationAccountTransaction]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_aggregation_account_transaction_using_post(self, aggregation_account_transaction, **kwargs): # noqa: E501
"""Create an aggregation account transaction # noqa: E501
Create a transaction record under an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_aggregation_account_transaction_using_post(aggregation_account_transaction, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AggregationAccountTransaction aggregation_account_transaction: aggregationAccountTransaction (required)
:return: AggregationAccountTransaction
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_aggregation_account_transaction_using_post_with_http_info(aggregation_account_transaction, **kwargs) # noqa: E501
else:
(data) = self.create_aggregation_account_transaction_using_post_with_http_info(aggregation_account_transaction, **kwargs) # noqa: E501
return data

    def create_aggregation_account_transaction_using_post_with_http_info(self, aggregation_account_transaction, **kwargs):  # noqa: E501
        """Create an aggregation account transaction  # noqa: E501

        Create a transaction record under an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_aggregation_account_transaction_using_post_with_http_info(aggregation_account_transaction, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param AggregationAccountTransaction aggregation_account_transaction: aggregationAccountTransaction (required)
        :return: AggregationAccountTransaction
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_transaction']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_aggregation_account_transaction_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_transaction' is set
        if self.api_client.client_side_validation and ('aggregation_account_transaction' not in params or
                                                       params['aggregation_account_transaction'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_transaction` when calling `create_aggregation_account_transaction_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'aggregation_account_transaction' in params:
            body_params = params['aggregation_account_transaction']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_transaction', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='AggregationAccountTransaction',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
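
    # Example (hypothetical usage sketch, not part of the generated client):
    # assuming a configured API instance `api` with OAuth2 credentials and a
    # populated `AggregationAccountTransaction` model, the endpoint above can
    # be called synchronously or asynchronously:
    #
    #     txn = AggregationAccountTransaction(...)
    #     created = api.create_aggregation_account_transaction_using_post(txn)
    #
    #     thread = api.create_aggregation_account_transaction_using_post(txn, async_req=True)
    #     created = thread.get()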

    def create_aggregation_account_using_post(self, aggregation_account, **kwargs):  # noqa: E501
        """Create an aggregation account  # noqa: E501

        Create an aggregation account under a client.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_aggregation_account_using_post(aggregation_account, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param AggregationAccount aggregation_account: aggregationAccount (required)
        :return: AggregationAccount
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_aggregation_account_using_post_with_http_info(aggregation_account, **kwargs)  # noqa: E501
        else:
            (data) = self.create_aggregation_account_using_post_with_http_info(aggregation_account, **kwargs)  # noqa: E501
            return data

    def create_aggregation_account_using_post_with_http_info(self, aggregation_account, **kwargs):  # noqa: E501
        """Create an aggregation account  # noqa: E501

        Create an aggregation account under a client.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_aggregation_account_using_post_with_http_info(aggregation_account, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param AggregationAccount aggregation_account: aggregationAccount (required)
        :return: AggregationAccount
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_aggregation_account_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account' is set
        if self.api_client.client_side_validation and ('aggregation_account' not in params or
                                                       params['aggregation_account'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account` when calling `create_aggregation_account_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'aggregation_account' in params:
            body_params = params['aggregation_account']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='AggregationAccount',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_aggregation_account_balance_using_delete(self, aggregation_account_balance_id, **kwargs):  # noqa: E501
        """Delete an aggregation account balance  # noqa: E501

        Permanently delete a balance record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_aggregation_account_balance_using_delete(aggregation_account_balance_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_balance_id: UUID aggregation_account_balance_id (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_aggregation_account_balance_using_delete_with_http_info(aggregation_account_balance_id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_aggregation_account_balance_using_delete_with_http_info(aggregation_account_balance_id, **kwargs)  # noqa: E501
            return data

    def delete_aggregation_account_balance_using_delete_with_http_info(self, aggregation_account_balance_id, **kwargs):  # noqa: E501
        """Delete an aggregation account balance  # noqa: E501

        Permanently delete a balance record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_aggregation_account_balance_using_delete_with_http_info(aggregation_account_balance_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_balance_id: UUID aggregation_account_balance_id (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_balance_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_aggregation_account_balance_using_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_balance_id' is set
        if self.api_client.client_side_validation and ('aggregation_account_balance_id' not in params or
                                                       params['aggregation_account_balance_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_balance_id` when calling `delete_aggregation_account_balance_using_delete`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'aggregation_account_balance_id' in params:
            path_params['aggregation_account_balance_id'] = params['aggregation_account_balance_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_balance/{aggregation_account_balance_id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_aggregation_account_holding_using_delete(self, aggregation_account_holding_id, **kwargs):  # noqa: E501
        """Delete an aggregation account holding  # noqa: E501

        Permanently delete a holding record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_aggregation_account_holding_using_delete(aggregation_account_holding_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_holding_id: UUID aggregation_account_holding_id (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_aggregation_account_holding_using_delete_with_http_info(aggregation_account_holding_id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_aggregation_account_holding_using_delete_with_http_info(aggregation_account_holding_id, **kwargs)  # noqa: E501
            return data

    def delete_aggregation_account_holding_using_delete_with_http_info(self, aggregation_account_holding_id, **kwargs):  # noqa: E501
        """Delete an aggregation account holding  # noqa: E501

        Permanently delete a holding record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_aggregation_account_holding_using_delete_with_http_info(aggregation_account_holding_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_holding_id: UUID aggregation_account_holding_id (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_holding_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_aggregation_account_holding_using_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_holding_id' is set
        if self.api_client.client_side_validation and ('aggregation_account_holding_id' not in params or
                                                       params['aggregation_account_holding_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_holding_id` when calling `delete_aggregation_account_holding_using_delete`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'aggregation_account_holding_id' in params:
            path_params['aggregation_account_holding_id'] = params['aggregation_account_holding_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_holding/{aggregation_account_holding_id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_aggregation_account_transaction_using_delete(self, aggregation_account_transaction_id, **kwargs):  # noqa: E501
        """Delete an aggregation account transaction  # noqa: E501

        Permanently delete a transaction record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_aggregation_account_transaction_using_delete(aggregation_account_transaction_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_transaction_id: UUID aggregation_account_transaction_id (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_aggregation_account_transaction_using_delete_with_http_info(aggregation_account_transaction_id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_aggregation_account_transaction_using_delete_with_http_info(aggregation_account_transaction_id, **kwargs)  # noqa: E501
            return data

    def delete_aggregation_account_transaction_using_delete_with_http_info(self, aggregation_account_transaction_id, **kwargs):  # noqa: E501
        """Delete an aggregation account transaction  # noqa: E501

        Permanently delete a transaction record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_aggregation_account_transaction_using_delete_with_http_info(aggregation_account_transaction_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_transaction_id: UUID aggregation_account_transaction_id (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_transaction_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_aggregation_account_transaction_using_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_transaction_id' is set
        if self.api_client.client_side_validation and ('aggregation_account_transaction_id' not in params or
                                                       params['aggregation_account_transaction_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_transaction_id` when calling `delete_aggregation_account_transaction_using_delete`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'aggregation_account_transaction_id' in params:
            path_params['aggregation_account_transaction_id'] = params['aggregation_account_transaction_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_transaction/{aggregation_account_transaction_id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_aggregation_account_using_delete(self, aggregation_account_id, **kwargs):  # noqa: E501
        """Delete an aggregation account  # noqa: E501

        Permanently delete an aggregation account under a client.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_aggregation_account_using_delete(aggregation_account_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_id: UUID aggregation_account_id (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_aggregation_account_using_delete_with_http_info(aggregation_account_id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_aggregation_account_using_delete_with_http_info(aggregation_account_id, **kwargs)  # noqa: E501
            return data

    def delete_aggregation_account_using_delete_with_http_info(self, aggregation_account_id, **kwargs):  # noqa: E501
        """Delete an aggregation account  # noqa: E501

        Permanently delete an aggregation account under a client.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_aggregation_account_using_delete_with_http_info(aggregation_account_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_id: UUID aggregation_account_id (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_aggregation_account_using_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_id' is set
        if self.api_client.client_side_validation and ('aggregation_account_id' not in params or
                                                       params['aggregation_account_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_id` when calling `delete_aggregation_account_using_delete`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'aggregation_account_id' in params:
            path_params['aggregation_account_id'] = params['aggregation_account_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account/{aggregation_account_id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_aggregation_account_aggregate_data_using_get(self, aggregation_account_id, **kwargs):  # noqa: E501
        """Retrieve aggregate data for an aggregation account  # noqa: E501

        Retrieve the aggregate data for a specific aggregation account associated with a client.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_aggregate_data_using_get(aggregation_account_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_id: UUID aggregation_account_id (required)
        :param str currency_conversion: currency_conversion (e.g. USD)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_aggregate_data_using_get_with_http_info(aggregation_account_id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_aggregate_data_using_get_with_http_info(aggregation_account_id, **kwargs)  # noqa: E501
            return data

    def get_aggregation_account_aggregate_data_using_get_with_http_info(self, aggregation_account_id, **kwargs):  # noqa: E501
        """Retrieve aggregate data for an aggregation account  # noqa: E501

        Retrieve the aggregate data for a specific aggregation account associated with a client.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_aggregate_data_using_get_with_http_info(aggregation_account_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_id: UUID aggregation_account_id (required)
        :param str currency_conversion: currency_conversion (e.g. USD)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_id', 'currency_conversion']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_aggregate_data_using_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_id' is set
        if self.api_client.client_side_validation and ('aggregation_account_id' not in params or
                                                       params['aggregation_account_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_id` when calling `get_aggregation_account_aggregate_data_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'aggregation_account_id' in params:
            path_params['aggregation_account_id'] = params['aggregation_account_id']  # noqa: E501

        query_params = []
        if 'currency_conversion' in params:
            query_params.append(('currency_conversion', params['currency_conversion']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account/{aggregation_account_id}/aggregate_data', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_aggregation_account_all_using_get(self, **kwargs):  # noqa: E501
        """List all aggregation accounts  # noqa: E501

        Get information for all aggregation accounts for all clients defined for your firm.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_all_using_get(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param bool ascending: ascending
        :param str filter: filter
        :param str order_by: order_by
        :param int page: page
        :param int size: size
        :return: PageAggregationAccount
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_all_using_get_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_all_using_get_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_aggregation_account_all_using_get_with_http_info(self, **kwargs):  # noqa: E501
        """List all aggregation accounts  # noqa: E501

        Get information for all aggregation accounts for all clients defined for your firm.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_all_using_get_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param bool ascending: ascending
        :param str filter: filter
        :param str order_by: order_by
        :param int page: page
        :param int size: size
        :return: PageAggregationAccount
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['ascending', 'filter', 'order_by', 'page', 'size']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_all_using_get" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'ascending' in params:
            query_params.append(('ascending', params['ascending']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501
        if 'order_by' in params:
            query_params.append(('order_by', params['order_by']))  # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='PageAggregationAccount',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
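
    # Example (hypothetical usage sketch, not part of the generated client):
    # paging through aggregation accounts with the query parameters documented
    # above; `page` and `size` map directly to the `page`/`size` query params:
    #
    #     first_page = api.get_aggregation_account_all_using_get(page=0, size=25)
    #     for account in first_page.content:
    #         print(account.id)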

    def get_aggregation_account_balance_all_using_get(self, **kwargs):  # noqa: E501
        """List all aggregation account balances  # noqa: E501

        Get all of the balance records for all aggregation accounts defined for your firm.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_balance_all_using_get(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param bool ascending: ascending
        :param str currency_conversion: currency_conversion
        :param str filter: filter
        :param str order_by: order_by
        :param int page: page
        :param int size: size
        :return: PageAggregationAccountBalance
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_balance_all_using_get_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_balance_all_using_get_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_aggregation_account_balance_all_using_get_with_http_info(self, **kwargs):  # noqa: E501
        """List all aggregation account balances  # noqa: E501

        Get all of the balance records for all aggregation accounts defined for your firm.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_balance_all_using_get_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param bool ascending: ascending
        :param str currency_conversion: currency_conversion
        :param str filter: filter
        :param str order_by: order_by
        :param int page: page
        :param int size: size
        :return: PageAggregationAccountBalance
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['ascending', 'currency_conversion', 'filter', 'order_by', 'page', 'size']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_balance_all_using_get" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'ascending' in params:
            query_params.append(('ascending', params['ascending']))  # noqa: E501
        if 'currency_conversion' in params:
            query_params.append(('currency_conversion', params['currency_conversion']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501
        if 'order_by' in params:
            query_params.append(('order_by', params['order_by']))  # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_balance', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='PageAggregationAccountBalance',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_aggregation_account_balance_using_get(self, aggregation_account_balance_id, **kwargs):  # noqa: E501
        """Retrieve an aggregation account balance  # noqa: E501

        Retrieve the information for a specific balance record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_balance_using_get(aggregation_account_balance_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_balance_id: UUID aggregation_account_balance_id (required)
        :param str currency_conversion: currency_conversion (e.g. USD)
        :return: AggregationAccountBalance
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_balance_using_get_with_http_info(aggregation_account_balance_id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_balance_using_get_with_http_info(aggregation_account_balance_id, **kwargs)  # noqa: E501
            return data
    def get_aggregation_account_balance_using_get_with_http_info(self, aggregation_account_balance_id, **kwargs):  # noqa: E501
        """Retrieve an aggregation account balance  # noqa: E501

        Retrieve the information for a specific balance record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_balance_using_get_with_http_info(aggregation_account_balance_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_balance_id: UUID aggregation_account_balance_id (required)
        :param str currency_conversion: USD
        :return: AggregationAccountBalance
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_balance_id', 'currency_conversion']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_balance_using_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_balance_id' is set
        if self.api_client.client_side_validation and ('aggregation_account_balance_id' not in params or params['aggregation_account_balance_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_balance_id` when calling `get_aggregation_account_balance_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'aggregation_account_balance_id' in params:
            path_params['aggregation_account_balance_id'] = params['aggregation_account_balance_id']  # noqa: E501

        query_params = []
        if 'currency_conversion' in params:
            query_params.append(('currency_conversion', params['currency_conversion']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_balance/{aggregation_account_balance_id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='AggregationAccountBalance',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
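    # Example (hypothetical sketch, not generated output): fetch one balance
    # record converted to USD. Assumes `api` is an instance of this API class
    # wired to an authenticated ApiClient; the UUID below is a placeholder.
    #
    #     balance = api.get_aggregation_account_balance_using_get(
    #         '00000000-0000-0000-0000-000000000000',
    #         currency_conversion='USD')
    #
    # Asynchronously, the same call returns a thread whose .get() yields the
    # deserialized AggregationAccountBalance:
    #
    #     thread = api.get_aggregation_account_balance_using_get(
    #         '00000000-0000-0000-0000-000000000000', async_req=True)
    #     balance = thread.get()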
    def get_aggregation_account_holding_all_using_get(self, **kwargs):  # noqa: E501
        """List all aggregation account holdings  # noqa: E501

        Get all of the holding records for all aggregation accounts defined for your firm.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_holding_all_using_get(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param bool ascending: ascending
        :param str currency_conversion: currency_conversion
        :param str filter: filter
        :param str order_by: order_by
        :param int page: page
        :param int size: size
        :return: PageAggregationAccountHolding
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_holding_all_using_get_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_holding_all_using_get_with_http_info(**kwargs)  # noqa: E501
            return data
    def get_aggregation_account_holding_all_using_get_with_http_info(self, **kwargs):  # noqa: E501
        """List all aggregation account holdings  # noqa: E501

        Get all of the holding records for all aggregation accounts defined for your firm.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_holding_all_using_get_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param bool ascending: ascending
        :param str currency_conversion: currency_conversion
        :param str filter: filter
        :param str order_by: order_by
        :param int page: page
        :param int size: size
        :return: PageAggregationAccountHolding
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['ascending', 'currency_conversion', 'filter', 'order_by', 'page', 'size']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_holding_all_using_get" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'ascending' in params:
            query_params.append(('ascending', params['ascending']))  # noqa: E501
        if 'currency_conversion' in params:
            query_params.append(('currency_conversion', params['currency_conversion']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501
        if 'order_by' in params:
            query_params.append(('order_by', params['order_by']))  # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_holding', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='PageAggregationAccountHolding',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
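    # Example (hypothetical sketch): page through all holdings, 25 per page,
    # newest first. Assumes `api` is an authenticated instance of this API
    # class and that the Page* response object exposes `content` and
    # `total_pages` attributes; the `order_by` field name is illustrative.
    #
    #     page = 0
    #     while True:
    #         result = api.get_aggregation_account_holding_all_using_get(
    #             page=page, size=25, order_by='update_date', ascending=False)
    #         for holding in result.content:
    #             print(holding.id)
    #         page += 1
    #         if page >= result.total_pages:
    #             break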
    def get_aggregation_account_holding_using_get(self, aggregation_account_holding_id, **kwargs):  # noqa: E501
        """Retrieve an aggregation account holding  # noqa: E501

        Retrieve the information for a specific holding record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_holding_using_get(aggregation_account_holding_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_holding_id: UUID aggregation_account_holding_id (required)
        :param str currency_conversion: USD
        :return: AggregationAccountHolding
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_holding_using_get_with_http_info(aggregation_account_holding_id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_holding_using_get_with_http_info(aggregation_account_holding_id, **kwargs)  # noqa: E501
            return data
    def get_aggregation_account_holding_using_get_with_http_info(self, aggregation_account_holding_id, **kwargs):  # noqa: E501
        """Retrieve an aggregation account holding  # noqa: E501

        Retrieve the information for a specific holding record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_holding_using_get_with_http_info(aggregation_account_holding_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_holding_id: UUID aggregation_account_holding_id (required)
        :param str currency_conversion: USD
        :return: AggregationAccountHolding
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_holding_id', 'currency_conversion']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_holding_using_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_holding_id' is set
        if self.api_client.client_side_validation and ('aggregation_account_holding_id' not in params or params['aggregation_account_holding_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_holding_id` when calling `get_aggregation_account_holding_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'aggregation_account_holding_id' in params:
            path_params['aggregation_account_holding_id'] = params['aggregation_account_holding_id']  # noqa: E501

        query_params = []
        if 'currency_conversion' in params:
            query_params.append(('currency_conversion', params['currency_conversion']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_holding/{aggregation_account_holding_id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='AggregationAccountHolding',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_aggregation_account_overview_by_business_id_using_get(self, business_id, **kwargs):  # noqa: E501
        """Retrieve aggregate data for an aggregation account  # noqa: E501

        Retrieve the information for a specific aggregation account with aggregate data for a business.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_overview_by_business_id_using_get(business_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str business_id: UUID business_id (required)
        :param str currency_conversion: USD
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_overview_by_business_id_using_get_with_http_info(business_id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_overview_by_business_id_using_get_with_http_info(business_id, **kwargs)  # noqa: E501
            return data
    def get_aggregation_account_overview_by_business_id_using_get_with_http_info(self, business_id, **kwargs):  # noqa: E501
        """Retrieve aggregate data for an aggregation account  # noqa: E501

        Retrieve the information for a specific aggregation account with aggregate data for a business.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_overview_by_business_id_using_get_with_http_info(business_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str business_id: UUID business_id (required)
        :param str currency_conversion: USD
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['business_id', 'currency_conversion']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_overview_by_business_id_using_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'business_id' is set
        if self.api_client.client_side_validation and ('business_id' not in params or params['business_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `business_id` when calling `get_aggregation_account_overview_by_business_id_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'business_id' in params:
            path_params['business_id'] = params['business_id']  # noqa: E501

        query_params = []
        if 'currency_conversion' in params:
            query_params.append(('currency_conversion', params['currency_conversion']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/business/{business_id}/aggregation_account_overview', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_aggregation_account_overview_using_get(self, client_id, **kwargs):  # noqa: E501
        """Retrieve aggregate data for an aggregation account  # noqa: E501

        Retrieve the information for a specific aggregation account with aggregate data for a client.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_overview_using_get(client_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str client_id: UUID client_id (required)
        :param str currency_conversion: USD
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_overview_using_get_with_http_info(client_id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_overview_using_get_with_http_info(client_id, **kwargs)  # noqa: E501
            return data
    def get_aggregation_account_overview_using_get_with_http_info(self, client_id, **kwargs):  # noqa: E501
        """Retrieve aggregate data for an aggregation account  # noqa: E501

        Retrieve the information for a specific aggregation account with aggregate data for a client.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_overview_using_get_with_http_info(client_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str client_id: UUID client_id (required)
        :param str currency_conversion: USD
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['client_id', 'currency_conversion']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_overview_using_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'client_id' is set
        if self.api_client.client_side_validation and ('client_id' not in params or params['client_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `client_id` when calling `get_aggregation_account_overview_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'client_id' in params:
            path_params['client_id'] = params['client_id']  # noqa: E501

        query_params = []
        if 'currency_conversion' in params:
            query_params.append(('currency_conversion', params['currency_conversion']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/client/{client_id}/aggregation_account_overview', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_aggregation_account_transaction_all_using_get(self, **kwargs):  # noqa: E501
        """List all aggregation account transactions  # noqa: E501

        Get all of the transaction records for all aggregation accounts defined for your firm.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_transaction_all_using_get(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param bool ascending: ascending
        :param str currency_conversion: currency_conversion
        :param str filter: filter
        :param str order_by: order_by
        :param int page: page
        :param int size: size
        :return: PageAggregationAccountTransaction
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_transaction_all_using_get_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_transaction_all_using_get_with_http_info(**kwargs)  # noqa: E501
            return data
    def get_aggregation_account_transaction_all_using_get_with_http_info(self, **kwargs):  # noqa: E501
        """List all aggregation account transactions  # noqa: E501

        Get all of the transaction records for all aggregation accounts defined for your firm.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_transaction_all_using_get_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param bool ascending: ascending
        :param str currency_conversion: currency_conversion
        :param str filter: filter
        :param str order_by: order_by
        :param int page: page
        :param int size: size
        :return: PageAggregationAccountTransaction
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['ascending', 'currency_conversion', 'filter', 'order_by', 'page', 'size']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_transaction_all_using_get" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'ascending' in params:
            query_params.append(('ascending', params['ascending']))  # noqa: E501
        if 'currency_conversion' in params:
            query_params.append(('currency_conversion', params['currency_conversion']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501
        if 'order_by' in params:
            query_params.append(('order_by', params['order_by']))  # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_transaction', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='PageAggregationAccountTransaction',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_aggregation_account_transaction_using_get(self, aggregation_account_transaction_id, **kwargs):  # noqa: E501
        """Retrieve an aggregation account transaction  # noqa: E501

        Retrieve the information for a specific transaction record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_transaction_using_get(aggregation_account_transaction_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_transaction_id: UUID aggregation_account_transaction_id (required)
        :param str currency_conversion: USD
        :return: AggregationAccountTransaction
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_transaction_using_get_with_http_info(aggregation_account_transaction_id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_transaction_using_get_with_http_info(aggregation_account_transaction_id, **kwargs)  # noqa: E501
            return data
    def get_aggregation_account_transaction_using_get_with_http_info(self, aggregation_account_transaction_id, **kwargs):  # noqa: E501
        """Retrieve an aggregation account transaction  # noqa: E501

        Retrieve the information for a specific transaction record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_transaction_using_get_with_http_info(aggregation_account_transaction_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_transaction_id: UUID aggregation_account_transaction_id (required)
        :param str currency_conversion: USD
        :return: AggregationAccountTransaction
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_transaction_id', 'currency_conversion']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_transaction_using_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_transaction_id' is set
        if self.api_client.client_side_validation and ('aggregation_account_transaction_id' not in params or params['aggregation_account_transaction_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_transaction_id` when calling `get_aggregation_account_transaction_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'aggregation_account_transaction_id' in params:
            path_params['aggregation_account_transaction_id'] = params['aggregation_account_transaction_id']  # noqa: E501

        query_params = []
        if 'currency_conversion' in params:
            query_params.append(('currency_conversion', params['currency_conversion']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_transaction/{aggregation_account_transaction_id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='AggregationAccountTransaction',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_aggregation_account_using_get(self, aggregation_account_id, **kwargs):  # noqa: E501
        """Retrieve an aggregation account  # noqa: E501

        Retrieve the information for a specific aggregation account associated with a client.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_using_get(aggregation_account_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_id: UUID aggregation_account_id (required)
        :return: AggregationAccount
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_aggregation_account_using_get_with_http_info(aggregation_account_id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_aggregation_account_using_get_with_http_info(aggregation_account_id, **kwargs)  # noqa: E501
            return data
    def get_aggregation_account_using_get_with_http_info(self, aggregation_account_id, **kwargs):  # noqa: E501
        """Retrieve an aggregation account  # noqa: E501

        Retrieve the information for a specific aggregation account associated with a client.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_aggregation_account_using_get_with_http_info(aggregation_account_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str aggregation_account_id: UUID aggregation_account_id (required)
        :return: AggregationAccount
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_aggregation_account_using_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_id' is set
        if self.api_client.client_side_validation and ('aggregation_account_id' not in params or params['aggregation_account_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_id` when calling `get_aggregation_account_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'aggregation_account_id' in params:
            path_params['aggregation_account_id'] = params['aggregation_account_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account/{aggregation_account_id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='AggregationAccount',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def update_aggregation_account_balance_using_put(self, aggregation_account_balance, aggregation_account_balance_id, **kwargs):  # noqa: E501
        """Update an aggregation account balance  # noqa: E501

        Update a balance record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_aggregation_account_balance_using_put(aggregation_account_balance, aggregation_account_balance_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param object aggregation_account_balance: aggregation_account_balance (required)
        :param str aggregation_account_balance_id: UUID aggregation_account_balance_id (required)
        :return: AggregationAccountBalance
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_aggregation_account_balance_using_put_with_http_info(aggregation_account_balance, aggregation_account_balance_id, **kwargs)  # noqa: E501
        else:
            (data) = self.update_aggregation_account_balance_using_put_with_http_info(aggregation_account_balance, aggregation_account_balance_id, **kwargs)  # noqa: E501
            return data
    def update_aggregation_account_balance_using_put_with_http_info(self, aggregation_account_balance, aggregation_account_balance_id, **kwargs):  # noqa: E501
        """Update an aggregation account balance  # noqa: E501

        Update a balance record for an aggregation account.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_aggregation_account_balance_using_put_with_http_info(aggregation_account_balance, aggregation_account_balance_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param object aggregation_account_balance: aggregation_account_balance (required)
        :param str aggregation_account_balance_id: UUID aggregation_account_balance_id (required)
        :return: AggregationAccountBalance
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['aggregation_account_balance', 'aggregation_account_balance_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_aggregation_account_balance_using_put" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'aggregation_account_balance' is set
        if self.api_client.client_side_validation and ('aggregation_account_balance' not in params or params['aggregation_account_balance'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_balance` when calling `update_aggregation_account_balance_using_put`")  # noqa: E501
        # verify the required parameter 'aggregation_account_balance_id' is set
        if self.api_client.client_side_validation and ('aggregation_account_balance_id' not in params or params['aggregation_account_balance_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `aggregation_account_balance_id` when calling `update_aggregation_account_balance_using_put`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'aggregation_account_balance_id' in params:
            path_params['aggregation_account_balance_id'] = params['aggregation_account_balance_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'aggregation_account_balance' in params:
            body_params = params['aggregation_account_balance']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/nucleus/v1/aggregation_account_balance/{aggregation_account_balance_id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='AggregationAccountBalance',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
def update_aggregation_account_bulk_using_put(self, aggregation_account_list, **kwargs): # noqa: E501
"""Update a bulk aggregation account # noqa: E501
Update a bulk aggregation account under a client. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_aggregation_account_bulk_using_put(aggregation_account_list, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[object] aggregation_account_list: aggregationAccountList (required)
:return: list[AggregationAccount]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_aggregation_account_bulk_using_put_with_http_info(aggregation_account_list, **kwargs) # noqa: E501
else:
(data) = self.update_aggregation_account_bulk_using_put_with_http_info(aggregation_account_list, **kwargs) # noqa: E501
return data
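# The wrapper above shows the SDK's `async_req` dispatch convention: with
# `async_req=True` the call returns a thread handle whose `.get()` yields the
# result, otherwise the deserialized data comes back directly. A standalone
# sketch of that convention (`FakeResult` and `update_bulk` are illustrative
# stand-ins, not part of the generated SDK, which returns a thread-pool
# `ApplyResult`):

```python
# Standalone sketch of the async_req dispatch convention used by the
# wrapper methods in this client. FakeResult stands in for the
# multiprocessing.pool.ApplyResult the real SDK returns.
class FakeResult:
    def __init__(self, value):
        self._value = value

    def get(self):
        # Blocks until the result is ready in the real SDK; here it is
        # already computed.
        return self._value


def update_bulk(payload, async_req=False):
    # Pretend each item was updated server-side.
    data = [{"id": item.get("id"), "status": "updated"} for item in payload]
    return FakeResult(data) if async_req else data


sync_data = update_bulk([{"id": "acct-1"}])
thread = update_bulk([{"id": "acct-1"}], async_req=True)
async_data = thread.get()
```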
def update_aggregation_account_bulk_using_put_with_http_info(self, aggregation_account_list, **kwargs): # noqa: E501
"""Update a bulk aggregation account # noqa: E501
Update a bulk aggregation account under a client. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_aggregation_account_bulk_using_put_with_http_info(aggregation_account_list, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[object] aggregation_account_list: aggregationAccountList (required)
:return: list[AggregationAccount]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_account_list'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_aggregation_account_bulk_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_account_list' is set
if self.api_client.client_side_validation and ('aggregation_account_list' not in params or
params['aggregation_account_list'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_list` when calling `update_aggregation_account_bulk_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_account_list' in params:
body_params = params['aggregation_account_list']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/bulk_aggregation_account', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[AggregationAccount]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
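# Each `*_with_http_info` method begins with the same keyword-argument
# validation loop. A minimal standalone sketch of that pattern (the function
# name `validate_kwargs` is illustrative, not part of the generated SDK):

```python
# Minimal sketch of the kwargs validation performed at the top of each
# *_with_http_info method: unknown keywords raise TypeError, known ones
# are collected into the params dict.
def validate_kwargs(method_name, all_params, kwargs):
    params = {}
    for key, val in kwargs.items():
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )
        params[key] = val
    return params


allowed = ['aggregation_account_list', 'async_req',
           '_return_http_data_only', '_preload_content', '_request_timeout']
ok = validate_kwargs('update_aggregation_account_bulk_using_put',
                     allowed, {'async_req': False})
try:
    validate_kwargs('update_aggregation_account_bulk_using_put',
                    allowed, {'bogus': 1})
    rejected = False
except TypeError:
    rejected = True
```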
def update_aggregation_account_holding_bulk_using_put(self, aggregation_account_holding, **kwargs): # noqa: E501
"""Update an bulk aggregation account holding # noqa: E501
Update a bulk holding record for an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_aggregation_account_holding_bulk_using_put(aggregation_account_holding, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[object] aggregation_account_holding: aggregationAccountHolding (required)
:return: list[AggregationAccountHolding]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_aggregation_account_holding_bulk_using_put_with_http_info(aggregation_account_holding, **kwargs) # noqa: E501
else:
(data) = self.update_aggregation_account_holding_bulk_using_put_with_http_info(aggregation_account_holding, **kwargs) # noqa: E501
return data
def update_aggregation_account_holding_bulk_using_put_with_http_info(self, aggregation_account_holding, **kwargs): # noqa: E501
"""Update an bulk aggregation account holding # noqa: E501
Update a bulk holding record for an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_aggregation_account_holding_bulk_using_put_with_http_info(aggregation_account_holding, async_req=True)
>>> result = thread.get()
:param async_req bool
:param list[object] aggregation_account_holding: aggregationAccountHolding (required)
:return: list[AggregationAccountHolding]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_account_holding'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_aggregation_account_holding_bulk_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_account_holding' is set
if self.api_client.client_side_validation and ('aggregation_account_holding' not in params or
params['aggregation_account_holding'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_holding` when calling `update_aggregation_account_holding_bulk_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_account_holding' in params:
body_params = params['aggregation_account_holding']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/bulk_aggregation_account_holding', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[AggregationAccountHolding]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_aggregation_account_holding_using_put(self, aggregation_account_holding, aggregation_account_holding_id, **kwargs): # noqa: E501
"""Update an aggregation account holding # noqa: E501
Update a holding record for an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_aggregation_account_holding_using_put(aggregation_account_holding, aggregation_account_holding_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param object aggregation_account_holding: aggregation_account_holding (required)
:param str aggregation_account_holding_id: UUID aggregation_account_holding_id (required)
:return: AggregationAccountHolding
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_aggregation_account_holding_using_put_with_http_info(aggregation_account_holding, aggregation_account_holding_id, **kwargs) # noqa: E501
else:
(data) = self.update_aggregation_account_holding_using_put_with_http_info(aggregation_account_holding, aggregation_account_holding_id, **kwargs) # noqa: E501
return data
def update_aggregation_account_holding_using_put_with_http_info(self, aggregation_account_holding, aggregation_account_holding_id, **kwargs): # noqa: E501
"""Update an aggregation account holding # noqa: E501
Update a holding record for an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_aggregation_account_holding_using_put_with_http_info(aggregation_account_holding, aggregation_account_holding_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param object aggregation_account_holding: aggregation_account_holding (required)
:param str aggregation_account_holding_id: UUID aggregation_account_holding_id (required)
:return: AggregationAccountHolding
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_account_holding', 'aggregation_account_holding_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_aggregation_account_holding_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_account_holding' is set
if self.api_client.client_side_validation and ('aggregation_account_holding' not in params or
params['aggregation_account_holding'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_holding` when calling `update_aggregation_account_holding_using_put`") # noqa: E501
# verify the required parameter 'aggregation_account_holding_id' is set
if self.api_client.client_side_validation and ('aggregation_account_holding_id' not in params or
params['aggregation_account_holding_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_holding_id` when calling `update_aggregation_account_holding_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'aggregation_account_holding_id' in params:
path_params['aggregation_account_holding_id'] = params['aggregation_account_holding_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_account_holding' in params:
body_params = params['aggregation_account_holding']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/aggregation_account_holding/{aggregation_account_holding_id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AggregationAccountHolding', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_aggregation_account_transaction_using_put(self, aggregation_account_transaction, aggregation_account_transaction_id, **kwargs): # noqa: E501
"""Update an aggregation account transaction # noqa: E501
Update a transaction record for an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_aggregation_account_transaction_using_put(aggregation_account_transaction, aggregation_account_transaction_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param object aggregation_account_transaction: aggregation_account_transaction (required)
:param str aggregation_account_transaction_id: UUID aggregation_account_transaction_id (required)
:return: AggregationAccountTransaction
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_aggregation_account_transaction_using_put_with_http_info(aggregation_account_transaction, aggregation_account_transaction_id, **kwargs) # noqa: E501
else:
(data) = self.update_aggregation_account_transaction_using_put_with_http_info(aggregation_account_transaction, aggregation_account_transaction_id, **kwargs) # noqa: E501
return data
def update_aggregation_account_transaction_using_put_with_http_info(self, aggregation_account_transaction, aggregation_account_transaction_id, **kwargs): # noqa: E501
"""Update an aggregation account transaction # noqa: E501
Update a transaction record for an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_aggregation_account_transaction_using_put_with_http_info(aggregation_account_transaction, aggregation_account_transaction_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param object aggregation_account_transaction: aggregation_account_transaction (required)
:param str aggregation_account_transaction_id: UUID aggregation_account_transaction_id (required)
:return: AggregationAccountTransaction
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_account_transaction', 'aggregation_account_transaction_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_aggregation_account_transaction_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_account_transaction' is set
if self.api_client.client_side_validation and ('aggregation_account_transaction' not in params or
params['aggregation_account_transaction'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_transaction` when calling `update_aggregation_account_transaction_using_put`") # noqa: E501
# verify the required parameter 'aggregation_account_transaction_id' is set
if self.api_client.client_side_validation and ('aggregation_account_transaction_id' not in params or
params['aggregation_account_transaction_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_transaction_id` when calling `update_aggregation_account_transaction_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'aggregation_account_transaction_id' in params:
path_params['aggregation_account_transaction_id'] = params['aggregation_account_transaction_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_account_transaction' in params:
body_params = params['aggregation_account_transaction']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/aggregation_account_transaction/{aggregation_account_transaction_id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AggregationAccountTransaction', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
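# The `path_params` dict assembled above fills the templated resource path
# handed to `call_api`. A simplified standalone illustration (the real client
# also URL-encodes values and applies collection formats; the UUID below is
# hypothetical):

```python
# Simplified illustration of how path_params substitute into the templated
# path passed to call_api.
path = ('/nucleus/v1/aggregation_account_transaction/'
        '{aggregation_account_transaction_id}')
path_params = {
    # hypothetical UUID, for illustration only
    'aggregation_account_transaction_id':
        '3f2b8a10-9c4d-4e21-8f6a-0d5e7c1b2a34',
}
for key, val in path_params.items():
    path = path.replace('{%s}' % key, str(val))
```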
def update_aggregation_account_using_put(self, aggregation_account, aggregation_account_id, **kwargs): # noqa: E501
"""Update an aggregation account # noqa: E501
Update the information for an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_aggregation_account_using_put(aggregation_account, aggregation_account_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param object aggregation_account: aggregation_account (required)
:param str aggregation_account_id: UUID aggregation_account_id (required)
:return: AggregationAccount
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_aggregation_account_using_put_with_http_info(aggregation_account, aggregation_account_id, **kwargs) # noqa: E501
else:
(data) = self.update_aggregation_account_using_put_with_http_info(aggregation_account, aggregation_account_id, **kwargs) # noqa: E501
return data
def update_aggregation_account_using_put_with_http_info(self, aggregation_account, aggregation_account_id, **kwargs): # noqa: E501
"""Update an aggregation account # noqa: E501
Update the information for an aggregation account. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_aggregation_account_using_put_with_http_info(aggregation_account, aggregation_account_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param object aggregation_account: aggregation_account (required)
:param str aggregation_account_id: UUID aggregation_account_id (required)
:return: AggregationAccount
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['aggregation_account', 'aggregation_account_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_aggregation_account_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'aggregation_account' is set
if self.api_client.client_side_validation and ('aggregation_account' not in params or
params['aggregation_account'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account` when calling `update_aggregation_account_using_put`") # noqa: E501
# verify the required parameter 'aggregation_account_id' is set
if self.api_client.client_side_validation and ('aggregation_account_id' not in params or
params['aggregation_account_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `aggregation_account_id` when calling `update_aggregation_account_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'aggregation_account_id' in params:
path_params['aggregation_account_id'] = params['aggregation_account_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'aggregation_account' in params:
body_params = params['aggregation_account']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/nucleus/v1/aggregation_account/{aggregation_account_id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AggregationAccount', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
class ConfigGenerator:
    def generate(self):
        raise NotImplementedError()

    def reload(self):
        raise NotImplementedError()
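# A minimal concrete subclass, assuming ConfigGenerator above is intended as
# an abstract base whose subclasses implement generate() and reload()
# (StaticConfigGenerator is hypothetical, not part of pympg):

```python
# Self-contained sketch: the base class is restated here so the example
# runs on its own.
class ConfigGenerator:
    def generate(self):
        raise NotImplementedError()

    def reload(self):
        raise NotImplementedError()


class StaticConfigGenerator(ConfigGenerator):
    """Serves a fixed, in-memory configuration."""

    def __init__(self, data):
        self._data = dict(data)

    def generate(self):
        return dict(self._data)

    def reload(self):
        # Nothing to refresh for a static config.
        pass


config = StaticConfigGenerator({"debug": True}).generate()
```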
"""This module contains the general information for MemoryPersistentMemoryConfiguration ManagedObject."""
from ...imcmo import ManagedObject
from ...imccoremeta import MoPropertyMeta, MoMeta
from ...imcmeta import VersionMeta
class MemoryPersistentMemoryConfigurationConsts:
NUM_OF_DIMMS_UNSPECIFIED = "unspecified"
NUM_OF_REGIONS_UNSPECIFIED = "unspecified"
class MemoryPersistentMemoryConfiguration(ManagedObject):
"""This is MemoryPersistentMemoryConfiguration class."""
consts = MemoryPersistentMemoryConfigurationConsts()
naming_props = set([])
mo_meta = {
"classic": MoMeta("MemoryPersistentMemoryConfiguration", "memoryPersistentMemoryConfiguration", "pmemory-config", VersionMeta.Version404b, "OutputOnly", 0xf, [], ["admin", "read-only", "user"], ['computeBoard'], ['memoryPersistentMemoryBackup', 'memoryPersistentMemoryConfigResult', 'memoryPersistentMemoryImporter', 'memoryPersistentMemoryRegion'], [None]),
"modular": MoMeta("MemoryPersistentMemoryConfiguration", "memoryPersistentMemoryConfiguration", "pmemory-config", VersionMeta.Version404b, "OutputOnly", 0xf, [], ["admin", "read-only", "user"], ['computeBoard'], ['memoryPersistentMemoryBackup', 'memoryPersistentMemoryConfigResult', 'memoryPersistentMemoryImporter', 'memoryPersistentMemoryRegion'], [None])
}
prop_meta = {
"classic": {
"child_action": MoPropertyMeta("child_action", "childAction", "string", VersionMeta.Version404b, MoPropertyMeta.INTERNAL, None, None, None, None, [], []),
"config_state": MoPropertyMeta("config_state", "configState", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, 0, 510, None, [], []),
"dn": MoPropertyMeta("dn", "dn", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, 0x2, 0, 255, None, [], []),
"memory_capacity": MoPropertyMeta("memory_capacity", "memoryCapacity", "long", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, [], []),
"num_of_dimms": MoPropertyMeta("num_of_dimms", "numOfDimms", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, ["unspecified"], ["0-4294967295"]),
"num_of_regions": MoPropertyMeta("num_of_regions", "numOfRegions", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, ["unspecified"], ["0-4294967295"]),
"persistent_memory_capacity": MoPropertyMeta("persistent_memory_capacity", "persistentMemoryCapacity", "long", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, [], []),
"reserved_capacity": MoPropertyMeta("reserved_capacity", "reservedCapacity", "long", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, [], []),
"rn": MoPropertyMeta("rn", "rn", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, 0x4, 0, 255, None, [], []),
"security_state": MoPropertyMeta("security_state", "securityState", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, 0, 510, None, [], []),
"status": MoPropertyMeta("status", "status", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, 0x8, None, None, r"""((removed|created|modified|deleted),){0,3}(removed|created|modified|deleted){0,1}""", [], []),
"total_capacity": MoPropertyMeta("total_capacity", "totalCapacity", "long", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, [], []),
},
"modular": {
"child_action": MoPropertyMeta("child_action", "childAction", "string", VersionMeta.Version404b, MoPropertyMeta.INTERNAL, None, None, None, None, [], []),
"config_state": MoPropertyMeta("config_state", "configState", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, 0, 510, None, [], []),
"dn": MoPropertyMeta("dn", "dn", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, 0x2, 0, 255, None, [], []),
"memory_capacity": MoPropertyMeta("memory_capacity", "memoryCapacity", "long", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, [], []),
"num_of_dimms": MoPropertyMeta("num_of_dimms", "numOfDimms", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, ["unspecified"], ["0-4294967295"]),
"num_of_regions": MoPropertyMeta("num_of_regions", "numOfRegions", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, ["unspecified"], ["0-4294967295"]),
"persistent_memory_capacity": MoPropertyMeta("persistent_memory_capacity", "persistentMemoryCapacity", "long", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, [], []),
"reserved_capacity": MoPropertyMeta("reserved_capacity", "reservedCapacity", "long", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, [], []),
"rn": MoPropertyMeta("rn", "rn", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, 0x4, 0, 255, None, [], []),
"security_state": MoPropertyMeta("security_state", "securityState", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, 0, 510, None, [], []),
"status": MoPropertyMeta("status", "status", "string", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, 0x8, None, None, r"""((removed|created|modified|deleted),){0,3}(removed|created|modified|deleted){0,1}""", [], []),
"total_capacity": MoPropertyMeta("total_capacity", "totalCapacity", "long", VersionMeta.Version404b, MoPropertyMeta.READ_ONLY, None, None, None, None, [], []),
},
}
prop_map = {
"classic": {
"childAction": "child_action",
"configState": "config_state",
"dn": "dn",
"memoryCapacity": "memory_capacity",
"numOfDimms": "num_of_dimms",
"numOfRegions": "num_of_regions",
"persistentMemoryCapacity": "persistent_memory_capacity",
"reservedCapacity": "reserved_capacity",
"rn": "rn",
"securityState": "security_state",
"status": "status",
"totalCapacity": "total_capacity",
},
"modular": {
"childAction": "child_action",
"configState": "config_state",
"dn": "dn",
"memoryCapacity": "memory_capacity",
"numOfDimms": "num_of_dimms",
"numOfRegions": "num_of_regions",
"persistentMemoryCapacity": "persistent_memory_capacity",
"reservedCapacity": "reserved_capacity",
"rn": "rn",
"securityState": "security_state",
"status": "status",
"totalCapacity": "total_capacity",
},
}
def __init__(self, parent_mo_or_dn, **kwargs):
self._dirty_mask = 0
self.child_action = None
self.config_state = None
self.memory_capacity = None
self.num_of_dimms = None
self.num_of_regions = None
self.persistent_memory_capacity = None
self.reserved_capacity = None
self.security_state = None
self.status = None
self.total_capacity = None
ManagedObject.__init__(self, "MemoryPersistentMemoryConfiguration", parent_mo_or_dn, **kwargs)
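# The prop_map tables above translate wire-format (XML attribute) names into
# the snake_case Python attribute names set in __init__. An illustrative
# standalone sketch of that translation (this mock bypasses the real imcsdk
# ManagedObject machinery):

```python
# Mock of the prop_map lookup: map a subset of XML attribute names to the
# Python attribute names used on the managed object.
prop_map = {
    "configState": "config_state",
    "memoryCapacity": "memory_capacity",
    "totalCapacity": "total_capacity",
}


def from_xml_attrs(attrs):
    # Keep only known attributes, renamed to their Python equivalents.
    return {prop_map[k]: v for k, v in attrs.items() if k in prop_map}


parsed = from_xml_attrs({"configState": "configured",
                         "totalCapacity": "1024"})
```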
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['FirehoseDeliveryStreamArgs', 'FirehoseDeliveryStream']
@pulumi.input_type
class FirehoseDeliveryStreamArgs:
def __init__(__self__, *,
destination: pulumi.Input[str],
arn: Optional[pulumi.Input[str]] = None,
destination_id: Optional[pulumi.Input[str]] = None,
elasticsearch_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamElasticsearchConfigurationArgs']] = None,
extended_s3_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']] = None,
http_endpoint_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']] = None,
kinesis_source_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']] = None,
name: Optional[pulumi.Input[str]] = None,
redshift_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamRedshiftConfigurationArgs']] = None,
s3_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamS3ConfigurationArgs']] = None,
server_side_encryption: Optional[pulumi.Input['FirehoseDeliveryStreamServerSideEncryptionArgs']] = None,
splunk_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamSplunkConfigurationArgs']] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
version_id: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a FirehoseDeliveryStream resource.
:param pulumi.Input[str] destination: The destination where the data is delivered. The only options are `s3` (deprecated; use `extended_s3` instead), `extended_s3`, `redshift`, `elasticsearch`, `splunk`, and `http_endpoint`.
:param pulumi.Input[str] arn: The Amazon Resource Name (ARN) specifying the Stream
:param pulumi.Input['FirehoseDeliveryStreamElasticsearchConfigurationArgs'] elasticsearch_configuration: Configuration options if elasticsearch is the destination. More details are given below.
:param pulumi.Input['FirehoseDeliveryStreamExtendedS3ConfigurationArgs'] extended_s3_configuration: Enhanced configuration options for the s3 destination. More details are given below.
:param pulumi.Input['FirehoseDeliveryStreamHttpEndpointConfigurationArgs'] http_endpoint_configuration: Configuration options if http_endpoint is the destination. Requires the user to also specify an `s3_configuration` block. More details are given below.
:param pulumi.Input['FirehoseDeliveryStreamKinesisSourceConfigurationArgs'] kinesis_source_configuration: Specifies the Kinesis stream used as the source of the Firehose delivery stream.
:param pulumi.Input[str] name: A name to identify the stream. This is unique to the
AWS account and region the Stream is created in.
:param pulumi.Input['FirehoseDeliveryStreamRedshiftConfigurationArgs'] redshift_configuration: Configuration options if redshift is the destination.
Using `redshift_configuration` requires the user to also specify a
`s3_configuration` block. More details are given below.
:param pulumi.Input['FirehoseDeliveryStreamS3ConfigurationArgs'] s3_configuration: Required for non-S3 destinations. For the S3 destination, use `extended_s3_configuration` instead. Configuration options for the S3 destination (or the intermediate bucket if the destination
is redshift). More details are given below.
:param pulumi.Input['FirehoseDeliveryStreamServerSideEncryptionArgs'] server_side_encryption: Encryption at rest options.
Server-side encryption should not be enabled when a kinesis stream is configured as the source of the firehose delivery stream.
:param pulumi.Input['FirehoseDeliveryStreamSplunkConfigurationArgs'] splunk_configuration: Configuration options if splunk is the destination. More details are given below.
:param pulumi.Input[str] version_id: Specifies the table version for the output data schema. Defaults to `LATEST`.
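
A minimal sketch of constructing these args and passing them to the resource (the role and bucket ARNs below are placeholders for illustration; any real deployment needs valid ARNs):

```python
import pulumi_aws as aws

args = aws.kinesis.FirehoseDeliveryStreamArgs(
    destination="extended_s3",
    extended_s3_configuration=aws.kinesis.FirehoseDeliveryStreamExtendedS3ConfigurationArgs(
        role_arn="arn:aws:iam::123456789012:role/firehose_role",  # placeholder
        bucket_arn="arn:aws:s3:::my-bucket",  # placeholder
    ),
)
stream = aws.kinesis.FirehoseDeliveryStream("example", args=args)
```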
"""
pulumi.set(__self__, "destination", destination)
if arn is not None:
pulumi.set(__self__, "arn", arn)
if destination_id is not None:
pulumi.set(__self__, "destination_id", destination_id)
if elasticsearch_configuration is not None:
pulumi.set(__self__, "elasticsearch_configuration", elasticsearch_configuration)
if extended_s3_configuration is not None:
pulumi.set(__self__, "extended_s3_configuration", extended_s3_configuration)
if http_endpoint_configuration is not None:
pulumi.set(__self__, "http_endpoint_configuration", http_endpoint_configuration)
if kinesis_source_configuration is not None:
pulumi.set(__self__, "kinesis_source_configuration", kinesis_source_configuration)
if name is not None:
pulumi.set(__self__, "name", name)
if redshift_configuration is not None:
pulumi.set(__self__, "redshift_configuration", redshift_configuration)
if s3_configuration is not None:
pulumi.set(__self__, "s3_configuration", s3_configuration)
if server_side_encryption is not None:
pulumi.set(__self__, "server_side_encryption", server_side_encryption)
if splunk_configuration is not None:
pulumi.set(__self__, "splunk_configuration", splunk_configuration)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if version_id is not None:
pulumi.set(__self__, "version_id", version_id)
@property
@pulumi.getter
def destination(self) -> pulumi.Input[str]:
"""
The destination where the data is delivered. The only options are `s3` (deprecated; use `extended_s3` instead), `extended_s3`, `redshift`, `elasticsearch`, `splunk`, and `http_endpoint`.
"""
return pulumi.get(self, "destination")
@destination.setter
def destination(self, value: pulumi.Input[str]):
pulumi.set(self, "destination", value)
@property
@pulumi.getter
def arn(self) -> Optional[pulumi.Input[str]]:
"""
The Amazon Resource Name (ARN) specifying the Stream
"""
return pulumi.get(self, "arn")
@arn.setter
def arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "arn", value)
@property
@pulumi.getter(name="destinationId")
def destination_id(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "destination_id")
@destination_id.setter
def destination_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_id", value)
@property
@pulumi.getter(name="elasticsearchConfiguration")
def elasticsearch_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamElasticsearchConfigurationArgs']]:
"""
Configuration options if elasticsearch is the destination. More details are given below.
"""
return pulumi.get(self, "elasticsearch_configuration")
@elasticsearch_configuration.setter
def elasticsearch_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamElasticsearchConfigurationArgs']]):
pulumi.set(self, "elasticsearch_configuration", value)
@property
@pulumi.getter(name="extendedS3Configuration")
def extended_s3_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']]:
"""
Enhanced configuration options for the s3 destination. More details are given below.
"""
return pulumi.get(self, "extended_s3_configuration")
@extended_s3_configuration.setter
def extended_s3_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']]):
pulumi.set(self, "extended_s3_configuration", value)
@property
@pulumi.getter(name="httpEndpointConfiguration")
def http_endpoint_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']]:
"""
Configuration options if http_endpoint is the destination. Requires the user to also specify an `s3_configuration` block. More details are given below.
"""
return pulumi.get(self, "http_endpoint_configuration")
@http_endpoint_configuration.setter
def http_endpoint_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']]):
pulumi.set(self, "http_endpoint_configuration", value)
@property
@pulumi.getter(name="kinesisSourceConfiguration")
def kinesis_source_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']]:
"""
Specifies the Kinesis stream used as the source of the Firehose delivery stream.
"""
return pulumi.get(self, "kinesis_source_configuration")
@kinesis_source_configuration.setter
def kinesis_source_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']]):
pulumi.set(self, "kinesis_source_configuration", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
A name to identify the stream. This is unique to the
AWS account and region the Stream is created in.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="redshiftConfiguration")
def redshift_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamRedshiftConfigurationArgs']]:
"""
Configuration options if redshift is the destination.
Using `redshift_configuration` requires the user to also specify a
`s3_configuration` block. More details are given below.
"""
return pulumi.get(self, "redshift_configuration")
@redshift_configuration.setter
def redshift_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamRedshiftConfigurationArgs']]):
pulumi.set(self, "redshift_configuration", value)
@property
@pulumi.getter(name="s3Configuration")
def s3_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamS3ConfigurationArgs']]:
"""
Required for non-S3 destinations. For the S3 destination, use `extended_s3_configuration` instead. Configuration options for the S3 destination (or the intermediate bucket if the destination
is redshift). More details are given below.
"""
return pulumi.get(self, "s3_configuration")
@s3_configuration.setter
def s3_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamS3ConfigurationArgs']]):
pulumi.set(self, "s3_configuration", value)
@property
@pulumi.getter(name="serverSideEncryption")
def server_side_encryption(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamServerSideEncryptionArgs']]:
"""
Encryption at rest options.
Server-side encryption should not be enabled when a kinesis stream is configured as the source of the firehose delivery stream.
"""
return pulumi.get(self, "server_side_encryption")
@server_side_encryption.setter
def server_side_encryption(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamServerSideEncryptionArgs']]):
pulumi.set(self, "server_side_encryption", value)
@property
@pulumi.getter(name="splunkConfiguration")
def splunk_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamSplunkConfigurationArgs']]:
"""
Configuration options if splunk is the destination. More details are given below.
"""
return pulumi.get(self, "splunk_configuration")
@splunk_configuration.setter
def splunk_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamSplunkConfigurationArgs']]):
pulumi.set(self, "splunk_configuration", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="versionId")
def version_id(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the table version for the output data schema. Defaults to `LATEST`.
"""
return pulumi.get(self, "version_id")
@version_id.setter
def version_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "version_id", value)
@pulumi.input_type
class _FirehoseDeliveryStreamState:
def __init__(__self__, *,
arn: Optional[pulumi.Input[str]] = None,
destination: Optional[pulumi.Input[str]] = None,
destination_id: Optional[pulumi.Input[str]] = None,
elasticsearch_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamElasticsearchConfigurationArgs']] = None,
extended_s3_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']] = None,
http_endpoint_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']] = None,
kinesis_source_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']] = None,
name: Optional[pulumi.Input[str]] = None,
redshift_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamRedshiftConfigurationArgs']] = None,
s3_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamS3ConfigurationArgs']] = None,
server_side_encryption: Optional[pulumi.Input['FirehoseDeliveryStreamServerSideEncryptionArgs']] = None,
splunk_configuration: Optional[pulumi.Input['FirehoseDeliveryStreamSplunkConfigurationArgs']] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
tags_all: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
version_id: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering FirehoseDeliveryStream resources.
:param pulumi.Input[str] arn: The Amazon Resource Name (ARN) specifying the Stream
:param pulumi.Input[str] destination: The destination where the data is delivered. The only options are `s3` (deprecated; use `extended_s3` instead), `extended_s3`, `redshift`, `elasticsearch`, `splunk`, and `http_endpoint`.
:param pulumi.Input['FirehoseDeliveryStreamElasticsearchConfigurationArgs'] elasticsearch_configuration: Configuration options if elasticsearch is the destination. More details are given below.
:param pulumi.Input['FirehoseDeliveryStreamExtendedS3ConfigurationArgs'] extended_s3_configuration: Enhanced configuration options for the s3 destination. More details are given below.
:param pulumi.Input['FirehoseDeliveryStreamHttpEndpointConfigurationArgs'] http_endpoint_configuration: Configuration options if http_endpoint is the destination. Requires the user to also specify an `s3_configuration` block. More details are given below.
:param pulumi.Input['FirehoseDeliveryStreamKinesisSourceConfigurationArgs'] kinesis_source_configuration: Specifies the Kinesis stream used as the source of the Firehose delivery stream.
:param pulumi.Input[str] name: A name to identify the stream. This is unique to the
AWS account and region the Stream is created in.
:param pulumi.Input['FirehoseDeliveryStreamRedshiftConfigurationArgs'] redshift_configuration: Configuration options if redshift is the destination.
Using `redshift_configuration` requires the user to also specify a
`s3_configuration` block. More details are given below.
:param pulumi.Input['FirehoseDeliveryStreamS3ConfigurationArgs'] s3_configuration: Required for non-S3 destinations. For the S3 destination, use `extended_s3_configuration` instead. Configuration options for the S3 destination (or the intermediate bucket if the destination
is redshift). More details are given below.
:param pulumi.Input['FirehoseDeliveryStreamServerSideEncryptionArgs'] server_side_encryption: Encryption at rest options.
Server-side encryption should not be enabled when a kinesis stream is configured as the source of the firehose delivery stream.
:param pulumi.Input['FirehoseDeliveryStreamSplunkConfigurationArgs'] splunk_configuration: Configuration options if splunk is the destination. More details are given below.
:param pulumi.Input[str] version_id: Specifies the table version for the output data schema. Defaults to `LATEST`.
"""
if arn is not None:
pulumi.set(__self__, "arn", arn)
if destination is not None:
pulumi.set(__self__, "destination", destination)
if destination_id is not None:
pulumi.set(__self__, "destination_id", destination_id)
if elasticsearch_configuration is not None:
pulumi.set(__self__, "elasticsearch_configuration", elasticsearch_configuration)
if extended_s3_configuration is not None:
pulumi.set(__self__, "extended_s3_configuration", extended_s3_configuration)
if http_endpoint_configuration is not None:
pulumi.set(__self__, "http_endpoint_configuration", http_endpoint_configuration)
if kinesis_source_configuration is not None:
pulumi.set(__self__, "kinesis_source_configuration", kinesis_source_configuration)
if name is not None:
pulumi.set(__self__, "name", name)
if redshift_configuration is not None:
pulumi.set(__self__, "redshift_configuration", redshift_configuration)
if s3_configuration is not None:
pulumi.set(__self__, "s3_configuration", s3_configuration)
if server_side_encryption is not None:
pulumi.set(__self__, "server_side_encryption", server_side_encryption)
if splunk_configuration is not None:
pulumi.set(__self__, "splunk_configuration", splunk_configuration)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if tags_all is not None:
pulumi.set(__self__, "tags_all", tags_all)
if version_id is not None:
pulumi.set(__self__, "version_id", version_id)
@property
@pulumi.getter
def arn(self) -> Optional[pulumi.Input[str]]:
"""
The Amazon Resource Name (ARN) specifying the Stream
"""
return pulumi.get(self, "arn")
@arn.setter
def arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "arn", value)
@property
@pulumi.getter
def destination(self) -> Optional[pulumi.Input[str]]:
"""
The destination where the data is delivered. The only options are `s3` (deprecated; use `extended_s3` instead), `extended_s3`, `redshift`, `elasticsearch`, `splunk`, and `http_endpoint`.
"""
return pulumi.get(self, "destination")
@destination.setter
def destination(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination", value)
@property
@pulumi.getter(name="destinationId")
def destination_id(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "destination_id")
@destination_id.setter
def destination_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_id", value)
@property
@pulumi.getter(name="elasticsearchConfiguration")
def elasticsearch_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamElasticsearchConfigurationArgs']]:
"""
Configuration options if elasticsearch is the destination. More details are given below.
"""
return pulumi.get(self, "elasticsearch_configuration")
@elasticsearch_configuration.setter
def elasticsearch_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamElasticsearchConfigurationArgs']]):
pulumi.set(self, "elasticsearch_configuration", value)
@property
@pulumi.getter(name="extendedS3Configuration")
def extended_s3_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']]:
"""
Enhanced configuration options for the s3 destination. More details are given below.
"""
return pulumi.get(self, "extended_s3_configuration")
@extended_s3_configuration.setter
def extended_s3_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']]):
pulumi.set(self, "extended_s3_configuration", value)
@property
@pulumi.getter(name="httpEndpointConfiguration")
def http_endpoint_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']]:
"""
Configuration options if http_endpoint is the destination. Requires the user to also specify an `s3_configuration` block. More details are given below.
"""
return pulumi.get(self, "http_endpoint_configuration")
@http_endpoint_configuration.setter
def http_endpoint_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']]):
pulumi.set(self, "http_endpoint_configuration", value)
@property
@pulumi.getter(name="kinesisSourceConfiguration")
def kinesis_source_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']]:
"""
Specifies the Kinesis stream used as the source of the Firehose delivery stream.
"""
return pulumi.get(self, "kinesis_source_configuration")
@kinesis_source_configuration.setter
def kinesis_source_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']]):
pulumi.set(self, "kinesis_source_configuration", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
A name to identify the stream. This is unique to the
AWS account and region the Stream is created in.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="redshiftConfiguration")
def redshift_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamRedshiftConfigurationArgs']]:
"""
Configuration options if redshift is the destination.
Using `redshift_configuration` requires the user to also specify a
`s3_configuration` block. More details are given below.
"""
return pulumi.get(self, "redshift_configuration")
@redshift_configuration.setter
def redshift_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamRedshiftConfigurationArgs']]):
pulumi.set(self, "redshift_configuration", value)
@property
@pulumi.getter(name="s3Configuration")
def s3_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamS3ConfigurationArgs']]:
"""
Required for non-S3 destinations. For the S3 destination, use `extended_s3_configuration` instead. Configuration options for the S3 destination (or the intermediate bucket if the destination
is redshift). More details are given below.
"""
return pulumi.get(self, "s3_configuration")
@s3_configuration.setter
def s3_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamS3ConfigurationArgs']]):
pulumi.set(self, "s3_configuration", value)
@property
@pulumi.getter(name="serverSideEncryption")
def server_side_encryption(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamServerSideEncryptionArgs']]:
"""
Encryption at rest options.
Server-side encryption should not be enabled when a kinesis stream is configured as the source of the firehose delivery stream.
"""
return pulumi.get(self, "server_side_encryption")
@server_side_encryption.setter
def server_side_encryption(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamServerSideEncryptionArgs']]):
pulumi.set(self, "server_side_encryption", value)
@property
@pulumi.getter(name="splunkConfiguration")
def splunk_configuration(self) -> Optional[pulumi.Input['FirehoseDeliveryStreamSplunkConfigurationArgs']]:
"""
Configuration options if splunk is the destination. More details are given below.
"""
return pulumi.get(self, "splunk_configuration")
@splunk_configuration.setter
def splunk_configuration(self, value: Optional[pulumi.Input['FirehoseDeliveryStreamSplunkConfigurationArgs']]):
pulumi.set(self, "splunk_configuration", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="tagsAll")
def tags_all(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
return pulumi.get(self, "tags_all")
@tags_all.setter
def tags_all(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags_all", value)
@property
@pulumi.getter(name="versionId")
def version_id(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the table version for the output data schema. Defaults to `LATEST`.
"""
return pulumi.get(self, "version_id")
@version_id.setter
def version_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "version_id", value)
class FirehoseDeliveryStream(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
arn: Optional[pulumi.Input[str]] = None,
destination: Optional[pulumi.Input[str]] = None,
destination_id: Optional[pulumi.Input[str]] = None,
elasticsearch_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamElasticsearchConfigurationArgs']]] = None,
extended_s3_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']]] = None,
http_endpoint_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']]] = None,
kinesis_source_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']]] = None,
name: Optional[pulumi.Input[str]] = None,
redshift_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamRedshiftConfigurationArgs']]] = None,
s3_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamS3ConfigurationArgs']]] = None,
server_side_encryption: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamServerSideEncryptionArgs']]] = None,
splunk_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamSplunkConfigurationArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
version_id: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Provides a Kinesis Firehose Delivery Stream resource. Amazon Kinesis Firehose is a fully managed, elastic service for delivering real-time data streams to destinations such as Amazon S3 and Amazon Redshift.
For more details, see the [Amazon Kinesis Firehose Documentation](https://aws.amazon.com/documentation/firehose/).
## Example Usage
### Extended S3 Destination
```python
import pulumi
import pulumi_aws as aws
bucket = aws.s3.Bucket("bucket", acl="private")
firehose_role = aws.iam.Role("firehoseRole", assume_role_policy=\"\"\"{
"Version": "2012-10-17",
"Statement": [
{
"Action": "sts:AssumeRole",
"Principal": {
"Service": "firehose.amazonaws.com"
},
"Effect": "Allow",
"Sid": ""
}
]
}
\"\"\")
lambda_iam = aws.iam.Role("lambdaIam", assume_role_policy=\"\"\"{
"Version": "2012-10-17",
"Statement": [
{
"Action": "sts:AssumeRole",
"Principal": {
"Service": "lambda.amazonaws.com"
},
"Effect": "Allow",
"Sid": ""
}
]
}
\"\"\")
lambda_processor = aws.lambda_.Function("lambdaProcessor",
code=pulumi.FileArchive("lambda.zip"),
role=lambda_iam.arn,
handler="exports.handler",
runtime="nodejs12.x")
extended_s3_stream = aws.kinesis.FirehoseDeliveryStream("extendedS3Stream",
destination="extended_s3",
extended_s3_configuration=aws.kinesis.FirehoseDeliveryStreamExtendedS3ConfigurationArgs(
role_arn=firehose_role.arn,
bucket_arn=bucket.arn,
processing_configuration=aws.kinesis.FirehoseDeliveryStreamExtendedS3ConfigurationProcessingConfigurationArgs(
enabled=True,
processors=[aws.kinesis.FirehoseDeliveryStreamExtendedS3ConfigurationProcessingConfigurationProcessorArgs(
type="Lambda",
parameters=[aws.kinesis.FirehoseDeliveryStreamExtendedS3ConfigurationProcessingConfigurationProcessorParameterArgs(
parameter_name="LambdaArn",
parameter_value=lambda_processor.arn.apply(lambda arn: f"{arn}:$LATEST"),
)],
)],
),
))
```
### S3 Destination (deprecated)
```python
import pulumi
import pulumi_aws as aws
bucket = aws.s3.Bucket("bucket", acl="private")
firehose_role = aws.iam.Role("firehoseRole", assume_role_policy=\"\"\"{
"Version": "2012-10-17",
"Statement": [
{
"Action": "sts:AssumeRole",
"Principal": {
"Service": "firehose.amazonaws.com"
},
"Effect": "Allow",
"Sid": ""
}
]
}
\"\"\")
test_stream = aws.kinesis.FirehoseDeliveryStream("testStream",
destination="s3",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=firehose_role.arn,
bucket_arn=bucket.arn,
))
```
### Redshift Destination
```python
import pulumi
import pulumi_aws as aws
test_cluster = aws.redshift.Cluster("testCluster",
cluster_identifier="tf-redshift-cluster",
database_name="test",
master_username="testuser",
master_password="T3stPass",
node_type="dc1.large",
cluster_type="single-node")
test_stream = aws.kinesis.FirehoseDeliveryStream("testStream",
destination="redshift",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=aws_iam_role["firehose_role"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
buffer_size=10,
buffer_interval=400,
compression_format="GZIP",
),
redshift_configuration=aws.kinesis.FirehoseDeliveryStreamRedshiftConfigurationArgs(
role_arn=aws_iam_role["firehose_role"]["arn"],
cluster_jdbcurl=pulumi.Output.all(test_cluster.endpoint, test_cluster.database_name).apply(lambda args: f"jdbc:redshift://{args[0]}/{args[1]}"),
username="testuser",
password="T3stPass",
data_table_name="test-table",
copy_options="delimiter '|'",
data_table_columns="test-col",
s3_backup_mode="Enabled",
s3_backup_configuration=aws.kinesis.FirehoseDeliveryStreamRedshiftConfigurationS3BackupConfigurationArgs(
role_arn=aws_iam_role["firehose_role"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
buffer_size=15,
buffer_interval=300,
compression_format="GZIP",
),
))
```
### Elasticsearch Destination
```python
import pulumi
import pulumi_aws as aws
test_cluster = aws.elasticsearch.Domain("testCluster")
test_stream = aws.kinesis.FirehoseDeliveryStream("testStream",
destination="elasticsearch",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=aws_iam_role["firehose_role"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
buffer_size=10,
buffer_interval=400,
compression_format="GZIP",
),
elasticsearch_configuration=aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationArgs(
domain_arn=test_cluster.arn,
role_arn=aws_iam_role["firehose_role"]["arn"],
index_name="test",
type_name="test",
processing_configuration=aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationProcessingConfigurationArgs(
enabled=True,
processors=[aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationProcessingConfigurationProcessorArgs(
type="Lambda",
parameters=[aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationProcessingConfigurationProcessorParameterArgs(
parameter_name="LambdaArn",
parameter_value=f"{aws_lambda_function['lambda_processor']['arn']}:$LATEST",
)],
)],
),
))
```
### Elasticsearch Destination With VPC
```python
import pulumi
import pulumi_aws as aws
test_cluster = aws.elasticsearch.Domain("testCluster",
cluster_config=aws.elasticsearch.DomainClusterConfigArgs(
instance_count=2,
zone_awareness_enabled=True,
instance_type="t2.small.elasticsearch",
),
ebs_options=aws.elasticsearch.DomainEbsOptionsArgs(
ebs_enabled=True,
volume_size=10,
),
vpc_options=aws.elasticsearch.DomainVpcOptionsArgs(
security_group_ids=[aws_security_group["first"]["id"]],
subnet_ids=[
aws_subnet["first"]["id"],
aws_subnet["second"]["id"],
],
))
firehose_elasticsearch = aws.iam.RolePolicy("firehose-elasticsearch",
role=aws_iam_role["firehose"]["id"],
policy=test_cluster.arn.apply(lambda arn: f\"\"\"{{
"Version": "2012-10-17",
"Statement": [
{{
"Effect": "Allow",
"Action": [
"es:*"
],
"Resource": [
"{arn}",
"{arn}/*"
]
]
}},
{{
"Effect": "Allow",
"Action": [
"ec2:DescribeVpcs",
"ec2:DescribeVpcAttribute",
"ec2:DescribeSubnets",
"ec2:DescribeSecurityGroups",
"ec2:DescribeNetworkInterfaces",
"ec2:CreateNetworkInterface",
"ec2:CreateNetworkInterfacePermission",
"ec2:DeleteNetworkInterface"
],
"Resource": [
"*"
]
}}
]
}}
\"\"\"))
test = aws.kinesis.FirehoseDeliveryStream("test",
destination="elasticsearch",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=aws_iam_role["firehose"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
),
elasticsearch_configuration=aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationArgs(
domain_arn=test_cluster.arn,
role_arn=aws_iam_role["firehose"]["arn"],
index_name="test",
type_name="test",
vpc_config=aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationVpcConfigArgs(
subnet_ids=[
aws_subnet["first"]["id"],
aws_subnet["second"]["id"],
],
security_group_ids=[aws_security_group["first"]["id"]],
role_arn=aws_iam_role["firehose"]["arn"],
),
),
opts=pulumi.ResourceOptions(depends_on=[firehose_elasticsearch]))
```
### Splunk Destination
```python
import pulumi
import pulumi_aws as aws
test_stream = aws.kinesis.FirehoseDeliveryStream("testStream",
destination="splunk",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=aws_iam_role["firehose"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
buffer_size=10,
buffer_interval=400,
compression_format="GZIP",
),
splunk_configuration=aws.kinesis.FirehoseDeliveryStreamSplunkConfigurationArgs(
hec_endpoint="https://http-inputs-mydomain.splunkcloud.com:443",
hec_token="51D4DA16-C61B-4F5F-8EC7-ED4301342A4A",
hec_acknowledgment_timeout=600,
hec_endpoint_type="Event",
s3_backup_mode="FailedEventsOnly",
))
```
### HTTP Endpoint (e.g. New Relic) Destination
```python
import pulumi
import pulumi_aws as aws
test_stream = aws.kinesis.FirehoseDeliveryStream("testStream",
destination="http_endpoint",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=aws_iam_role["firehose"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
buffer_size=10,
buffer_interval=400,
compression_format="GZIP",
),
http_endpoint_configuration=aws.kinesis.FirehoseDeliveryStreamHttpEndpointConfigurationArgs(
url="https://aws-api.newrelic.com/firehose/v1",
name="New Relic",
access_key="my-key",
buffering_size=15,
buffering_interval=600,
role_arn=aws_iam_role["firehose"]["arn"],
s3_backup_mode="FailedDataOnly",
request_configuration=aws.kinesis.FirehoseDeliveryStreamHttpEndpointConfigurationRequestConfigurationArgs(
content_encoding="GZIP",
common_attributes=[
aws.kinesis.FirehoseDeliveryStreamHttpEndpointConfigurationRequestConfigurationCommonAttributeArgs(
name="testname",
value="testvalue",
),
aws.kinesis.FirehoseDeliveryStreamHttpEndpointConfigurationRequestConfigurationCommonAttributeArgs(
name="testname2",
value="testvalue2",
),
],
),
))
```
## Import
Kinesis Firehose Delivery streams can be imported using the stream ARN, e.g.
```sh
$ pulumi import aws:kinesis/firehoseDeliveryStream:FirehoseDeliveryStream foo arn:aws:firehose:us-east-1:XXX:deliverystream/example
```
Note: Import does not work for stream destination `s3`. Consider using `extended_s3`, since the `s3` destination is deprecated.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] arn: The Amazon Resource Name (ARN) specifying the Stream
:param pulumi.Input[str] destination: The destination where the data is delivered. The only options are `s3` (deprecated; use `extended_s3` instead), `extended_s3`, `redshift`, `elasticsearch`, `splunk`, and `http_endpoint`.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamElasticsearchConfigurationArgs']] elasticsearch_configuration: Configuration options if elasticsearch is the destination. More details are given below.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']] extended_s3_configuration: Enhanced configuration options for the s3 destination. More details are given below.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']] http_endpoint_configuration: Configuration options if http_endpoint is the destination. Requires the user to also specify an `s3_configuration` block. More details are given below.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']] kinesis_source_configuration: Allows specifying the Kinesis stream used as the source of the Firehose delivery stream.
:param pulumi.Input[str] name: A name to identify the stream. This is unique to the
AWS account and region the Stream is created in.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamRedshiftConfigurationArgs']] redshift_configuration: Configuration options if redshift is the destination.
Using `redshift_configuration` requires the user to also specify an
`s3_configuration` block. More details are given below.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamS3ConfigurationArgs']] s3_configuration: Required for non-S3 destinations. For S3 destination, use `extended_s3_configuration` instead. Configuration options for the s3 destination (or the intermediate bucket if the destination
is redshift). More details are given below.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamServerSideEncryptionArgs']] server_side_encryption: Encryption-at-rest options.
Server-side encryption should not be enabled when a kinesis stream is configured as the source of the firehose delivery stream.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamSplunkConfigurationArgs']] splunk_configuration: Configuration options if splunk is the destination. More details are given below.
:param pulumi.Input[str] version_id: Specifies the table version for the output data schema. Defaults to `LATEST`.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: FirehoseDeliveryStreamArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides a Kinesis Firehose Delivery Stream resource. Amazon Kinesis Firehose is a fully managed, elastic service to easily deliver real-time data streams to destinations such as Amazon S3 and Amazon Redshift.
For more details, see the [Amazon Kinesis Firehose Documentation](https://aws.amazon.com/documentation/firehose/).
## Example Usage
### Extended S3 Destination
```python
import pulumi
import pulumi_aws as aws
bucket = aws.s3.Bucket("bucket", acl="private")
firehose_role = aws.iam.Role("firehoseRole", assume_role_policy=\"\"\"{
"Version": "2012-10-17",
"Statement": [
{
"Action": "sts:AssumeRole",
"Principal": {
"Service": "firehose.amazonaws.com"
},
"Effect": "Allow",
"Sid": ""
}
]
}
\"\"\")
lambda_iam = aws.iam.Role("lambdaIam", assume_role_policy=\"\"\"{
"Version": "2012-10-17",
"Statement": [
{
"Action": "sts:AssumeRole",
"Principal": {
"Service": "lambda.amazonaws.com"
},
"Effect": "Allow",
"Sid": ""
}
]
}
\"\"\")
lambda_processor = aws.lambda_.Function("lambdaProcessor",
code=pulumi.FileArchive("lambda.zip"),
role=lambda_iam.arn,
handler="exports.handler",
runtime="nodejs12.x")
extended_s3_stream = aws.kinesis.FirehoseDeliveryStream("extendedS3Stream",
destination="extended_s3",
extended_s3_configuration=aws.kinesis.FirehoseDeliveryStreamExtendedS3ConfigurationArgs(
role_arn=firehose_role.arn,
bucket_arn=bucket.arn,
processing_configuration=aws.kinesis.FirehoseDeliveryStreamExtendedS3ConfigurationProcessingConfigurationArgs(
enabled=True,
processors=[aws.kinesis.FirehoseDeliveryStreamExtendedS3ConfigurationProcessingConfigurationProcessorArgs(
type="Lambda",
parameters=[aws.kinesis.FirehoseDeliveryStreamExtendedS3ConfigurationProcessingConfigurationProcessorParameterArgs(
parameter_name="LambdaArn",
parameter_value=lambda_processor.arn.apply(lambda arn: f"{arn}:$LATEST"),
)],
)],
),
))
```
### S3 Destination (deprecated)
```python
import pulumi
import pulumi_aws as aws
bucket = aws.s3.Bucket("bucket", acl="private")
firehose_role = aws.iam.Role("firehoseRole", assume_role_policy=\"\"\"{
"Version": "2012-10-17",
"Statement": [
{
"Action": "sts:AssumeRole",
"Principal": {
"Service": "firehose.amazonaws.com"
},
"Effect": "Allow",
"Sid": ""
}
]
}
\"\"\")
test_stream = aws.kinesis.FirehoseDeliveryStream("testStream",
destination="s3",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=firehose_role.arn,
bucket_arn=bucket.arn,
))
```
### Redshift Destination
```python
import pulumi
import pulumi_aws as aws
test_cluster = aws.redshift.Cluster("testCluster",
cluster_identifier="tf-redshift-cluster",
database_name="test",
master_username="testuser",
master_password="T3stPass",
node_type="dc1.large",
cluster_type="single-node")
test_stream = aws.kinesis.FirehoseDeliveryStream("testStream",
destination="redshift",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=aws_iam_role["firehose_role"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
buffer_size=10,
buffer_interval=400,
compression_format="GZIP",
),
redshift_configuration=aws.kinesis.FirehoseDeliveryStreamRedshiftConfigurationArgs(
role_arn=aws_iam_role["firehose_role"]["arn"],
cluster_jdbcurl=pulumi.Output.all(test_cluster.endpoint, test_cluster.database_name).apply(lambda endpoint, database_name: f"jdbc:redshift://{endpoint}/{database_name}"),
username="testuser",
password="T3stPass",
data_table_name="test-table",
copy_options="delimiter '|'",
data_table_columns="test-col",
s3_backup_mode="Enabled",
s3_backup_configuration=aws.kinesis.FirehoseDeliveryStreamRedshiftConfigurationS3BackupConfigurationArgs(
role_arn=aws_iam_role["firehose_role"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
buffer_size=15,
buffer_interval=300,
compression_format="GZIP",
),
))
```
### Elasticsearch Destination
```python
import pulumi
import pulumi_aws as aws
test_cluster = aws.elasticsearch.Domain("testCluster")
test_stream = aws.kinesis.FirehoseDeliveryStream("testStream",
destination="elasticsearch",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=aws_iam_role["firehose_role"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
buffer_size=10,
buffer_interval=400,
compression_format="GZIP",
),
elasticsearch_configuration=aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationArgs(
domain_arn=test_cluster.arn,
role_arn=aws_iam_role["firehose_role"]["arn"],
index_name="test",
type_name="test",
processing_configuration=aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationProcessingConfigurationArgs(
enabled=True,
processors=[aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationProcessingConfigurationProcessorArgs(
type="Lambda",
parameters=[aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationProcessingConfigurationProcessorParameterArgs(
parameter_name="LambdaArn",
parameter_value=f"{aws_lambda_function['lambda_processor']['arn']}:$LATEST",
)],
)],
),
))
```
### Elasticsearch Destination With VPC
```python
import pulumi
import pulumi_aws as aws
test_cluster = aws.elasticsearch.Domain("testCluster",
cluster_config=aws.elasticsearch.DomainClusterConfigArgs(
instance_count=2,
zone_awareness_enabled=True,
instance_type="t2.small.elasticsearch",
),
ebs_options=aws.elasticsearch.DomainEbsOptionsArgs(
ebs_enabled=True,
volume_size=10,
),
vpc_options=aws.elasticsearch.DomainVpcOptionsArgs(
security_group_ids=[aws_security_group["first"]["id"]],
subnet_ids=[
aws_subnet["first"]["id"],
aws_subnet["second"]["id"],
],
))
firehose_elasticsearch = aws.iam.RolePolicy("firehose-elasticsearch",
role=aws_iam_role["firehose"]["id"],
policy=test_cluster.arn.apply(lambda arn: f\"\"\"{{
"Version": "2012-10-17",
"Statement": [
{{
"Effect": "Allow",
"Action": [
"es:*"
],
"Resource": [
"{arn}",
"{arn}/*"
]
]
}},
{{
"Effect": "Allow",
"Action": [
"ec2:DescribeVpcs",
"ec2:DescribeVpcAttribute",
"ec2:DescribeSubnets",
"ec2:DescribeSecurityGroups",
"ec2:DescribeNetworkInterfaces",
"ec2:CreateNetworkInterface",
"ec2:CreateNetworkInterfacePermission",
"ec2:DeleteNetworkInterface"
],
"Resource": [
"*"
]
}}
]
}}
\"\"\"))
test = aws.kinesis.FirehoseDeliveryStream("test",
destination="elasticsearch",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=aws_iam_role["firehose"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
),
elasticsearch_configuration=aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationArgs(
domain_arn=test_cluster.arn,
role_arn=aws_iam_role["firehose"]["arn"],
index_name="test",
type_name="test",
vpc_config=aws.kinesis.FirehoseDeliveryStreamElasticsearchConfigurationVpcConfigArgs(
subnet_ids=[
aws_subnet["first"]["id"],
aws_subnet["second"]["id"],
],
security_group_ids=[aws_security_group["first"]["id"]],
role_arn=aws_iam_role["firehose"]["arn"],
),
),
opts=pulumi.ResourceOptions(depends_on=[firehose_elasticsearch]))
```
### Splunk Destination
```python
import pulumi
import pulumi_aws as aws
test_stream = aws.kinesis.FirehoseDeliveryStream("testStream",
destination="splunk",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=aws_iam_role["firehose"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
buffer_size=10,
buffer_interval=400,
compression_format="GZIP",
),
splunk_configuration=aws.kinesis.FirehoseDeliveryStreamSplunkConfigurationArgs(
hec_endpoint="https://http-inputs-mydomain.splunkcloud.com:443",
hec_token="51D4DA16-C61B-4F5F-8EC7-ED4301342A4A",
hec_acknowledgment_timeout=600,
hec_endpoint_type="Event",
s3_backup_mode="FailedEventsOnly",
))
```
### HTTP Endpoint (e.g. New Relic) Destination
```python
import pulumi
import pulumi_aws as aws
test_stream = aws.kinesis.FirehoseDeliveryStream("testStream",
destination="http_endpoint",
s3_configuration=aws.kinesis.FirehoseDeliveryStreamS3ConfigurationArgs(
role_arn=aws_iam_role["firehose"]["arn"],
bucket_arn=aws_s3_bucket["bucket"]["arn"],
buffer_size=10,
buffer_interval=400,
compression_format="GZIP",
),
http_endpoint_configuration=aws.kinesis.FirehoseDeliveryStreamHttpEndpointConfigurationArgs(
url="https://aws-api.newrelic.com/firehose/v1",
name="New Relic",
access_key="my-key",
buffering_size=15,
buffering_interval=600,
role_arn=aws_iam_role["firehose"]["arn"],
s3_backup_mode="FailedDataOnly",
request_configuration=aws.kinesis.FirehoseDeliveryStreamHttpEndpointConfigurationRequestConfigurationArgs(
content_encoding="GZIP",
common_attributes=[
aws.kinesis.FirehoseDeliveryStreamHttpEndpointConfigurationRequestConfigurationCommonAttributeArgs(
name="testname",
value="testvalue",
),
aws.kinesis.FirehoseDeliveryStreamHttpEndpointConfigurationRequestConfigurationCommonAttributeArgs(
name="testname2",
value="testvalue2",
),
],
),
))
```
## Import
Kinesis Firehose Delivery streams can be imported using the stream ARN, e.g.
```sh
$ pulumi import aws:kinesis/firehoseDeliveryStream:FirehoseDeliveryStream foo arn:aws:firehose:us-east-1:XXX:deliverystream/example
```
Note: Import does not work for stream destination `s3`. Consider using `extended_s3`, since the `s3` destination is deprecated.
:param str resource_name: The name of the resource.
:param FirehoseDeliveryStreamArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(FirehoseDeliveryStreamArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
arn: Optional[pulumi.Input[str]] = None,
destination: Optional[pulumi.Input[str]] = None,
destination_id: Optional[pulumi.Input[str]] = None,
elasticsearch_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamElasticsearchConfigurationArgs']]] = None,
extended_s3_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']]] = None,
http_endpoint_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']]] = None,
kinesis_source_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']]] = None,
name: Optional[pulumi.Input[str]] = None,
redshift_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamRedshiftConfigurationArgs']]] = None,
s3_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamS3ConfigurationArgs']]] = None,
server_side_encryption: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamServerSideEncryptionArgs']]] = None,
splunk_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamSplunkConfigurationArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
version_id: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = FirehoseDeliveryStreamArgs.__new__(FirehoseDeliveryStreamArgs)
__props__.__dict__["arn"] = arn
if destination is None and not opts.urn:
raise TypeError("Missing required property 'destination'")
__props__.__dict__["destination"] = destination
__props__.__dict__["destination_id"] = destination_id
__props__.__dict__["elasticsearch_configuration"] = elasticsearch_configuration
__props__.__dict__["extended_s3_configuration"] = extended_s3_configuration
__props__.__dict__["http_endpoint_configuration"] = http_endpoint_configuration
__props__.__dict__["kinesis_source_configuration"] = kinesis_source_configuration
__props__.__dict__["name"] = name
__props__.__dict__["redshift_configuration"] = redshift_configuration
__props__.__dict__["s3_configuration"] = s3_configuration
__props__.__dict__["server_side_encryption"] = server_side_encryption
__props__.__dict__["splunk_configuration"] = splunk_configuration
__props__.__dict__["tags"] = tags
__props__.__dict__["version_id"] = version_id
__props__.__dict__["tags_all"] = None
super(FirehoseDeliveryStream, __self__).__init__(
'aws:kinesis/firehoseDeliveryStream:FirehoseDeliveryStream',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
arn: Optional[pulumi.Input[str]] = None,
destination: Optional[pulumi.Input[str]] = None,
destination_id: Optional[pulumi.Input[str]] = None,
elasticsearch_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamElasticsearchConfigurationArgs']]] = None,
extended_s3_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']]] = None,
http_endpoint_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']]] = None,
kinesis_source_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']]] = None,
name: Optional[pulumi.Input[str]] = None,
redshift_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamRedshiftConfigurationArgs']]] = None,
s3_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamS3ConfigurationArgs']]] = None,
server_side_encryption: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamServerSideEncryptionArgs']]] = None,
splunk_configuration: Optional[pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamSplunkConfigurationArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
tags_all: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
version_id: Optional[pulumi.Input[str]] = None) -> 'FirehoseDeliveryStream':
"""
Get an existing FirehoseDeliveryStream resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] arn: The Amazon Resource Name (ARN) specifying the Stream
:param pulumi.Input[str] destination: The destination where the data is delivered. The only options are `s3` (deprecated; use `extended_s3` instead), `extended_s3`, `redshift`, `elasticsearch`, `splunk`, and `http_endpoint`.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamElasticsearchConfigurationArgs']] elasticsearch_configuration: Configuration options if elasticsearch is the destination. More details are given below.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamExtendedS3ConfigurationArgs']] extended_s3_configuration: Enhanced configuration options for the s3 destination. More details are given below.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamHttpEndpointConfigurationArgs']] http_endpoint_configuration: Configuration options if http_endpoint is the destination. Requires the user to also specify an `s3_configuration` block. More details are given below.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamKinesisSourceConfigurationArgs']] kinesis_source_configuration: Allows specifying the Kinesis stream used as the source of the Firehose delivery stream.
:param pulumi.Input[str] name: A name to identify the stream. This is unique to the
AWS account and region the Stream is created in.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamRedshiftConfigurationArgs']] redshift_configuration: Configuration options if redshift is the destination.
Using `redshift_configuration` requires the user to also specify an
`s3_configuration` block. More details are given below.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamS3ConfigurationArgs']] s3_configuration: Required for non-S3 destinations. For S3 destination, use `extended_s3_configuration` instead. Configuration options for the s3 destination (or the intermediate bucket if the destination
is redshift). More details are given below.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamServerSideEncryptionArgs']] server_side_encryption: Encryption-at-rest options.
Server-side encryption should not be enabled when a kinesis stream is configured as the source of the firehose delivery stream.
:param pulumi.Input[pulumi.InputType['FirehoseDeliveryStreamSplunkConfigurationArgs']] splunk_configuration: Configuration options if splunk is the destination. More details are given below.
:param pulumi.Input[str] version_id: Specifies the table version for the output data schema. Defaults to `LATEST`.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _FirehoseDeliveryStreamState.__new__(_FirehoseDeliveryStreamState)
__props__.__dict__["arn"] = arn
__props__.__dict__["destination"] = destination
__props__.__dict__["destination_id"] = destination_id
__props__.__dict__["elasticsearch_configuration"] = elasticsearch_configuration
__props__.__dict__["extended_s3_configuration"] = extended_s3_configuration
__props__.__dict__["http_endpoint_configuration"] = http_endpoint_configuration
__props__.__dict__["kinesis_source_configuration"] = kinesis_source_configuration
__props__.__dict__["name"] = name
__props__.__dict__["redshift_configuration"] = redshift_configuration
__props__.__dict__["s3_configuration"] = s3_configuration
__props__.__dict__["server_side_encryption"] = server_side_encryption
__props__.__dict__["splunk_configuration"] = splunk_configuration
__props__.__dict__["tags"] = tags
__props__.__dict__["tags_all"] = tags_all
__props__.__dict__["version_id"] = version_id
return FirehoseDeliveryStream(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def arn(self) -> pulumi.Output[str]:
"""
The Amazon Resource Name (ARN) specifying the Stream
"""
return pulumi.get(self, "arn")
@property
@pulumi.getter
def destination(self) -> pulumi.Output[str]:
"""
The destination where the data is delivered. The only options are `s3` (deprecated; use `extended_s3` instead), `extended_s3`, `redshift`, `elasticsearch`, `splunk`, and `http_endpoint`.
"""
return pulumi.get(self, "destination")
@property
@pulumi.getter(name="destinationId")
def destination_id(self) -> pulumi.Output[str]:
return pulumi.get(self, "destination_id")
@property
@pulumi.getter(name="elasticsearchConfiguration")
def elasticsearch_configuration(self) -> pulumi.Output[Optional['outputs.FirehoseDeliveryStreamElasticsearchConfiguration']]:
"""
Configuration options if elasticsearch is the destination. More details are given below.
"""
return pulumi.get(self, "elasticsearch_configuration")
@property
@pulumi.getter(name="extendedS3Configuration")
def extended_s3_configuration(self) -> pulumi.Output[Optional['outputs.FirehoseDeliveryStreamExtendedS3Configuration']]:
"""
Enhanced configuration options for the s3 destination. More details are given below.
"""
return pulumi.get(self, "extended_s3_configuration")
@property
@pulumi.getter(name="httpEndpointConfiguration")
def http_endpoint_configuration(self) -> pulumi.Output[Optional['outputs.FirehoseDeliveryStreamHttpEndpointConfiguration']]:
"""
Configuration options if http_endpoint is the destination. Requires the user to also specify an `s3_configuration` block. More details are given below.
"""
return pulumi.get(self, "http_endpoint_configuration")
@property
@pulumi.getter(name="kinesisSourceConfiguration")
def kinesis_source_configuration(self) -> pulumi.Output[Optional['outputs.FirehoseDeliveryStreamKinesisSourceConfiguration']]:
"""
Allows specifying the Kinesis stream used as the source of the Firehose delivery stream.
"""
return pulumi.get(self, "kinesis_source_configuration")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
A name to identify the stream. This is unique to the
AWS account and region the Stream is created in.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="redshiftConfiguration")
def redshift_configuration(self) -> pulumi.Output[Optional['outputs.FirehoseDeliveryStreamRedshiftConfiguration']]:
"""
Configuration options if redshift is the destination.
Using `redshift_configuration` requires the user to also specify an
`s3_configuration` block. More details are given below.
"""
return pulumi.get(self, "redshift_configuration")
@property
@pulumi.getter(name="s3Configuration")
def s3_configuration(self) -> pulumi.Output[Optional['outputs.FirehoseDeliveryStreamS3Configuration']]:
"""
Required for non-S3 destinations. For S3 destination, use `extended_s3_configuration` instead. Configuration options for the s3 destination (or the intermediate bucket if the destination
is redshift). More details are given below.
"""
return pulumi.get(self, "s3_configuration")
@property
@pulumi.getter(name="serverSideEncryption")
def server_side_encryption(self) -> pulumi.Output[Optional['outputs.FirehoseDeliveryStreamServerSideEncryption']]:
"""
Encryption-at-rest options.
Server-side encryption should not be enabled when a kinesis stream is configured as the source of the firehose delivery stream.
"""
return pulumi.get(self, "server_side_encryption")
@property
@pulumi.getter(name="splunkConfiguration")
def splunk_configuration(self) -> pulumi.Output[Optional['outputs.FirehoseDeliveryStreamSplunkConfiguration']]:
"""
Configuration options if splunk is the destination. More details are given below.
"""
return pulumi.get(self, "splunk_configuration")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
return pulumi.get(self, "tags")
@property
@pulumi.getter(name="tagsAll")
def tags_all(self) -> pulumi.Output[Mapping[str, str]]:
return pulumi.get(self, "tags_all")
@property
@pulumi.getter(name="versionId")
def version_id(self) -> pulumi.Output[str]:
"""
Specifies the table version for the output data schema. Defaults to `LATEST`.
"""
return pulumi.get(self, "version_id")
# File: api/radiam/api/tests/permissionstests/__init__.py
# (repo: usask-rc/radiam, license: MIT)
from .userpermissionstests import *
from .projectpermissionstests import *
from .researchgrouppermissionstests import *
from .groupmemberpermissionstests import *
from .groupviewgrantpermissionstests import *
from .grouphierarchytests import *
from .datasetdatacollectionmethodpermissionstests import *
from .datasetsensitivitypermissionstests import *
from .projectstatisticspermissiontests import *
# File: dbt/flags.py
# (repo: cwkrebs/dbt, license: Apache-2.0)
STRICT_MODE = False
NON_DESTRUCTIVE = False
FULL_REFRESH = False
LOG_CACHE_EVENTS = False
USE_CACHE = True
def reset():
    # USE_CACHE must be listed in the global statement too; otherwise the
    # assignment below would create a local variable instead of resetting
    # the module-level flag.
    global STRICT_MODE, NON_DESTRUCTIVE, FULL_REFRESH, LOG_CACHE_EVENTS, USE_CACHE
    STRICT_MODE = False
    NON_DESTRUCTIVE = False
    FULL_REFRESH = False
    LOG_CACHE_EVENTS = False
    USE_CACHE = True
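Note that every flag assigned inside `reset()` must also appear in its `global` statement; otherwise the assignment creates a function-local variable and the module-level flag is left unchanged. A minimal self-contained sketch of that pitfall (the names below are illustrative, not part of dbt):

```python
USE_CACHE = True

def reset_without_global():
    USE_CACHE = False  # creates a local; the module-level flag is untouched

def reset_with_global():
    global USE_CACHE
    USE_CACHE = False  # rebinds the module-level name

reset_without_global()
print(USE_CACHE)  # still True

reset_with_global()
print(USE_CACHE)  # now False
```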
# File: Chapter10_On_Policy_Control_with_Approximation/Mountain_Car_1_Step_Semi_Gradient_Q_Learning_RBF_Linear_Appr.py
# (repo: quangnguyendang/Reinforcement_Learning, license: MIT)
# Example 10.1 page 198 in Reinforcement Learning: An Introduction Book
# PhD Student: Nguyen Dang Quang, Computer Science Department, KyungHee University, Korea.
# 1-step Semi Gradient Q-learning Implementation for MountainCar-v0 environment
# RBF Feature Vector Generated for Linear Approximation
# Reference from http://scikit-learn.org/stable/modules/kernel_approximation.html#rbf-kernel-approx
import gym
import numpy as np
import matplotlib.pyplot as plt
from sklearn.kernel_approximation import RBFSampler
class QLearningAgent:
def __init__(self, environment=gym.make('MountainCar-v0')):
self.env = environment
self.state = self.env.reset()
self.state_low_bound = self.env.observation_space.low
self.state_high_bound = self.env.observation_space.high
        self.n_action = self.env.action_space.n
self.action_space = gym.spaces.Discrete(self.n_action)
self.d = 100
self.w = np.random.rand(self.d)
self.feature = RBFSampler(gamma=1, random_state=1)
X = []
for _ in range(100000):
            s = self.env.observation_space.sample()
sa = np.append(s, np.random.randint(self.n_action))
X.append(sa)
self.feature.fit(X)
def feature_x(self, s, a):
# print('state = ', s, ' & action = ', a)
feature_sa = self.feature.transform([[s[0], s[1], a]])
# print(feature_sa)
return feature_sa
    def is_state_valid(self, s):
        # A state is valid only if every component lies within the observation bounds.
        valid = True
        for i in range(s.shape[0]):
            if (s[i] < self.state_low_bound[i]) or (s[i] > self.state_high_bound[i]):
                valid = False
        return valid
    def Q_hat(self, s, a):
        # Linear action-value estimate w . x(s, a); returns None for out-of-range
        # states, and callers catch the TypeError that comparing None then raises.
        if self.is_state_valid(s):
            return np.dot(self.feature_x(s, a), np.transpose(self.w))
def reset(self):
self.state = self.env.reset()
def A_max(self, state, epsilon):
if np.random.rand() < epsilon:
# Exploration
return np.random.randint(self.n_action)
else:
# Exploitation
max_a = []
maxQ = -np.inf
for a in range(0, self.n_action):
if self.Q_hat(state, a) > maxQ:
max_a = [a]
maxQ = self.Q_hat(state, a)
elif self.Q_hat(state, a) == maxQ:
max_a.append(a)
            if max_a:
return max_a[np.random.randint(0, len(max_a))]
else:
return np.random.randint(self.n_action)
def train(self, n_episode=5000, learning_rate=0.01, gamma=0.99, epsilon=0.01):
num_steps_of_episode = []
for i_episode in range(n_episode):
self.reset()
n_trajectory = 0
while True:
while True:
try:
a = self.A_max(state=self.state, epsilon=epsilon)
s_, r_, done, _ = self.env.step(a)
# env.render()
break
except (RuntimeError, TypeError, NameError):
print("Action {} at state {} is invalid!".format(a, self.state))
                # Semi-gradient Q-learning update: w += lr * (r + gamma * max_a' Q(s', a') - Q(s, a)) * x(s, a)
                self.w = self.w + learning_rate * (r_ + gamma * self.Q_hat(s_, self.A_max(state=s_, epsilon=0)) - self.Q_hat(self.state, a)) * self.feature_x(self.state, a)
self.state = s_
n_trajectory += 1
if done:
num_steps_of_episode.append(n_trajectory)
                    print("Episode = {}, took {} steps to reach the goal.".format(i_episode, n_trajectory))
break
return num_steps_of_episode
def get_w(self):
return self.w
class SarsaAgent:
def __init__(self, environment=gym.make('MountainCar-v0')):
self.env = environment
self.state = self.env.reset()
self.state_low_bound = self.env.observation_space.low
self.state_high_bound = self.env.observation_space.high
        self.n_action = self.env.action_space.n
self.action_space = gym.spaces.Discrete(self.n_action)
self.d = 100
self.w = np.random.rand(self.d)
self.feature = RBFSampler(gamma=1, random_state=1)
X = []
for _ in range(100000):
            s = self.env.observation_space.sample()
sa = np.append(s, np.random.randint(self.n_action))
X.append(sa)
self.feature.fit(X)
def feature_x(self, s, a):
# print('state = ', s, ' & action = ', a)
feature_sa = self.feature.transform([[s[0], s[1], a]])
# print(feature_sa)
return feature_sa
    def is_state_valid(self, s):
        # A state is valid only if every component lies within the observation bounds.
        valid = True
        for i in range(s.shape[0]):
            if (s[i] < self.state_low_bound[i]) or (s[i] > self.state_high_bound[i]):
                valid = False
        return valid
    def Q_hat(self, s, a):
        # Linear action-value estimate w . x(s, a); returns None for out-of-range
        # states, and callers catch the TypeError that comparing None then raises.
        if self.is_state_valid(s):
            return np.dot(self.feature_x(s, a), np.transpose(self.w))
def reset(self):
self.state = self.env.reset()
def A_max(self, state, epsilon):
if np.random.rand() < epsilon:
# Exploration
return np.random.randint(self.n_action)
else:
# Exploitation
max_a = []
maxQ = -np.inf
for a in range(0, self.n_action):
if self.Q_hat(state, a) > maxQ:
max_a = [a]
maxQ = self.Q_hat(state, a)
elif self.Q_hat(state, a) == maxQ:
max_a.append(a)
            if max_a:
return max_a[np.random.randint(0, len(max_a))]
else:
return np.random.randint(self.n_action)
def train(self, n_episode=5000, learning_rate=0.01, gamma=0.99, epsilon=0.01):
num_steps_of_episode = []
for i_episode in range(n_episode):
self.reset()
n_trajectory = 0
a = self.A_max(state=self.state, epsilon=epsilon)
while True:
while True:
try:
s_, r_, done, _ = self.env.step(a)
a_ = self.A_max(state=s_, epsilon=epsilon)
# env.render()
break
except (RuntimeError, TypeError, NameError):
print("Action {} at state {} is invalid!".format(a, self.state))
                # Semi-gradient SARSA update: w += lr * (r + gamma * Q(s', a') - Q(s, a)) * x(s, a)
                self.w = self.w + learning_rate * (r_ + gamma * self.Q_hat(s_, a_) - self.Q_hat(self.state, a)) * self.feature_x(self.state, a)
self.state = s_
a = a_
n_trajectory += 1
if done:
num_steps_of_episode.append(n_trajectory)
                    print("Episode = {}, took {} steps to reach the goal.".format(i_episode, n_trajectory))
break
return num_steps_of_episode
def get_w(self):
return self.w
env = gym.make('MountainCar-v0')
agent1 = QLearningAgent(env)
steps_of_episode_QLearning = agent1.train(n_episode=10000, learning_rate=0.0001, gamma=0.99, epsilon=0.001)
env.close()
env = gym.make('MountainCar-v0')
agent2 = SarsaAgent(env)
steps_of_episode_SARSA = agent2.train(n_episode=10000, learning_rate=0.0001, gamma=0.99, epsilon=0.001)
env.close()
plt.plot(steps_of_episode_SARSA, label="SARSA Linear Approximation")
plt.plot(steps_of_episode_QLearning, label="Q-Learning Linear Approximation")
plt.xlabel('episode')
plt.ylabel('number of needed steps')
plt.legend(loc='best')
plt.show()
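# The weight update both agents apply above is the semi-gradient TD rule
# w <- w + lr * (r + gamma * Q_hat(s', a') - Q_hat(s, a)) * x(s, a); with a linear
# Q_hat the gradient with respect to w is just the feature vector x(s, a).
# A minimal numeric sketch with made-up values (not taken from the environment):

```python
import numpy as np

# Illustrative quantities only.
w = np.zeros(4)
x_sa = np.array([1.0, 0.0, 0.5, 0.0])    # features of (s, a)
x_s2a2 = np.array([0.0, 1.0, 0.0, 0.5])  # features of (s', a')
alpha, gamma, r = 0.1, 0.99, -1.0        # step size, discount, reward

td_target = r + gamma * np.dot(w, x_s2a2)
td_error = td_target - np.dot(w, x_sa)
w = w + alpha * td_error * x_sa  # gradient of a linear Q_hat is x(s, a)
print(w)
```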
| 36.853659 | 172 | 0.561482 | 999 | 7,555 | 4.071071 | 0.164164 | 0.05311 | 0.032456 | 0.02803 | 0.825916 | 0.793705 | 0.783378 | 0.774527 | 0.774527 | 0.756331 | 0 | 0.022083 | 0.3227 | 7,555 | 204 | 173 | 37.034314 | 0.772718 | 0.076638 | 0 | 0.875 | 0 | 0 | 0.04196 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.025 | 0.0125 | 0.2375 | 0.025 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4c046b1d4cbdde63bf90ddd1e4498c6d26828e58 | 36,649 | py | Python | saleor/graphql/account/tests/test_account_depreceated_service_account.py | angeles-ricardo-89/saleor | 5fab7a883d025bff83320fbdd557ed7afa2923a9 | [
"BSD-3-Clause"
] | 4 | 2021-03-27T16:38:48.000Z | 2021-10-18T12:54:15.000Z | saleor/graphql/account/tests/test_account_depreceated_service_account.py | DuongHieuMAI/saleor | e20b6283182f3a2886fe36fcdef8e47e4fcf7a14 | [
"CC-BY-4.0"
] | 11 | 2021-03-30T14:26:57.000Z | 2022-03-12T00:51:07.000Z | saleor/graphql/account/tests/test_account_depreceated_service_account.py | DuongHieuMAI/saleor | e20b6283182f3a2886fe36fcdef8e47e4fcf7a14 | [
"CC-BY-4.0"
] | 12 | 2019-03-21T03:24:58.000Z | 2022-01-13T10:55:34.000Z | import graphene
import pytest
from django.contrib.auth.models import Permission
from freezegun import freeze_time
from ....account.error_codes import AccountErrorCode
from ....app.models import App, AppToken
from ...core.enums import PermissionEnum
from ...tests.utils import assert_no_permission, get_graphql_content
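# A hedged aside on the IDs these tests build: graphene's Node.to_global_id,
# used throughout below, base64-encodes the string "TypeName:pk". A minimal
# stdlib-only illustration of that encoding scheme:

```python
import base64

# graphene/graphql-relay encode global IDs as base64("TypeName:pk").
global_id = base64.b64encode(b"ServiceAccount:1").decode("ascii")
decoded = base64.b64decode(global_id).decode("ascii")
print(decoded)  # ServiceAccount:1
```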
@pytest.fixture
def permission_manage_service_accounts():
return Permission.objects.get(codename="manage_apps")
SERVICE_ACCOUNT_CREATE_MUTATION = """
mutation ServiceAccountCreate(
$name: String, $is_active: Boolean $permissions: [PermissionEnum]){
serviceAccountCreate(input:
{name: $name, isActive: $is_active, permissions: $permissions})
{
authToken
serviceAccount{
permissions{
code
name
}
id
isActive
name
tokens{
authToken
}
}
accountErrors{
field
message
code
}
}
}
"""
def test_service_account_create_mutation(
permission_manage_service_accounts,
permission_manage_products,
staff_api_client,
staff_user,
):
query = SERVICE_ACCOUNT_CREATE_MUTATION
staff_user.user_permissions.add(
permission_manage_service_accounts, permission_manage_products
)
variables = {
"name": "New integration",
"is_active": True,
"permissions": [PermissionEnum.MANAGE_PRODUCTS.name],
}
response = staff_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
service_account_data = content["data"]["serviceAccountCreate"]["serviceAccount"]
default_token = content["data"]["serviceAccountCreate"]["authToken"]
app = App.objects.get()
assert service_account_data["isActive"] == app.is_active
assert service_account_data["name"] == app.name
assert list(app.permissions.all()) == [permission_manage_products]
assert default_token == app.tokens.get().auth_token
def test_service_account_create_mutation_for_service_account(
permission_manage_service_accounts,
permission_manage_products,
app_api_client,
staff_user,
):
query = SERVICE_ACCOUNT_CREATE_MUTATION
requestor = app_api_client.app
requestor.permissions.add(
permission_manage_service_accounts, permission_manage_products
)
variables = {
"name": "New integration",
"is_active": True,
"permissions": [PermissionEnum.MANAGE_PRODUCTS.name],
}
response = app_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
service_account_data = content["data"]["serviceAccountCreate"]["serviceAccount"]
default_token = content["data"]["serviceAccountCreate"]["authToken"]
app = App.objects.exclude(pk=requestor.pk).get()
assert service_account_data["isActive"] == app.is_active
assert service_account_data["name"] == app.name
assert list(app.permissions.all()) == [permission_manage_products]
assert default_token == app.tokens.get().auth_token
def test_service_account_create_mutation_out_of_scope_permissions(
permission_manage_service_accounts,
permission_manage_products,
staff_api_client,
superuser_api_client,
staff_user,
):
    """Ensure a user can't create a service account with permissions outside their own scope.
    Ensure the superuser passes these restrictions.
"""
query = SERVICE_ACCOUNT_CREATE_MUTATION
staff_user.user_permissions.add(permission_manage_service_accounts)
variables = {
"name": "New integration",
"is_active": True,
"permissions": [PermissionEnum.MANAGE_PRODUCTS.name],
}
# for staff user
response = staff_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
data = content["data"]["serviceAccountCreate"]
errors = data["accountErrors"]
assert not data["serviceAccount"]
assert len(errors) == 1
error = errors[0]
assert error["field"] == "permissions"
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_PERMISSION.name
# for superuser
response = superuser_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
service_account_data = content["data"]["serviceAccountCreate"]["serviceAccount"]
default_token = content["data"]["serviceAccountCreate"]["authToken"]
app = App.objects.get()
assert service_account_data["isActive"] == app.is_active
assert service_account_data["name"] == app.name
assert list(app.permissions.all()) == [permission_manage_products]
assert default_token == app.tokens.get().auth_token
def test_service_account_create_mutation_for_service_account_out_of_scope_permissions(
permission_manage_service_accounts,
permission_manage_products,
app_api_client,
staff_user,
):
query = SERVICE_ACCOUNT_CREATE_MUTATION
variables = {
"name": "New integration",
"is_active": True,
"permissions": [PermissionEnum.MANAGE_PRODUCTS.name],
}
response = app_api_client.post_graphql(
query, variables=variables, permissions=(permission_manage_service_accounts,)
)
content = get_graphql_content(response)
data = content["data"]["serviceAccountCreate"]
errors = data["accountErrors"]
assert not data["serviceAccount"]
assert len(errors) == 1
error = errors[0]
assert error["field"] == "permissions"
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_PERMISSION.name
def test_service_account_create_mutation_no_permissions(
permission_manage_service_accounts,
permission_manage_products,
staff_api_client,
staff_user,
):
query = SERVICE_ACCOUNT_CREATE_MUTATION
variables = {
"name": "New integration",
"is_active": True,
"permissions": [PermissionEnum.MANAGE_PRODUCTS.name],
}
response = staff_api_client.post_graphql(query, variables=variables)
assert_no_permission(response)
SERVICE_ACCOUNT_UPDATE_MUTATION = """
mutation ServiceAccountUpdate($id: ID!, $is_active: Boolean,
$permissions: [PermissionEnum]){
serviceAccountUpdate(id: $id,
input:{isActive: $is_active, permissions:$permissions}){
serviceAccount{
isActive
id
permissions{
code
name
}
tokens{
authToken
}
name
}
accountErrors{
field
message
code
}
}
}
"""
def test_service_account_update_mutation(
app,
permission_manage_service_accounts,
permission_manage_products,
permission_manage_orders,
permission_manage_users,
staff_api_client,
staff_user,
):
query = SERVICE_ACCOUNT_UPDATE_MUTATION
staff_user.user_permissions.add(
permission_manage_products, permission_manage_users, permission_manage_orders,
)
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {
"id": id,
"is_active": False,
"permissions": [
PermissionEnum.MANAGE_PRODUCTS.name,
PermissionEnum.MANAGE_USERS.name,
],
}
response = staff_api_client.post_graphql(
query, variables=variables, permissions=(permission_manage_service_accounts,)
)
content = get_graphql_content(response)
service_account_data = content["data"]["serviceAccountUpdate"]["serviceAccount"]
tokens_data = service_account_data["tokens"]
app.refresh_from_db()
tokens = app.tokens.all()
assert service_account_data["isActive"] == app.is_active
assert app.is_active is False
assert len(tokens_data) == 1
assert tokens_data[0]["authToken"] == tokens.get().auth_token[-4:]
assert set(app.permissions.all()) == {
permission_manage_products,
permission_manage_users,
}
def test_service_account_update_mutation_for_service_account(
permission_manage_service_accounts,
permission_manage_products,
permission_manage_orders,
permission_manage_users,
app_api_client,
):
app = App.objects.create(name="New_sa")
AppToken.objects.create(app=app)
query = SERVICE_ACCOUNT_UPDATE_MUTATION
requestor = app_api_client.app
requestor.permissions.add(
permission_manage_service_accounts,
permission_manage_products,
permission_manage_users,
permission_manage_orders,
)
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {
"id": id,
"is_active": False,
"permissions": [
PermissionEnum.MANAGE_PRODUCTS.name,
PermissionEnum.MANAGE_USERS.name,
],
}
response = app_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
service_account_data = content["data"]["serviceAccountUpdate"]["serviceAccount"]
tokens_data = service_account_data["tokens"]
app.refresh_from_db()
tokens = app.tokens.all()
assert service_account_data["isActive"] == app.is_active
assert app.is_active is False
assert len(tokens_data) == 1
assert tokens_data[0]["authToken"] == tokens.get().auth_token[-4:]
assert set(app.permissions.all()) == {
permission_manage_products,
permission_manage_users,
}
def test_service_account_update_mutation_out_of_scope_permissions(
app,
permission_manage_service_accounts,
permission_manage_products,
permission_manage_users,
staff_api_client,
superuser_api_client,
staff_user,
):
    """Ensure a user cannot grant a service account permissions which they do not have themselves.
    Ensure the superuser passes these restrictions.
"""
query = SERVICE_ACCOUNT_UPDATE_MUTATION
staff_user.user_permissions.add(
permission_manage_service_accounts, permission_manage_products
)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {
"id": id,
"is_active": False,
"permissions": [
PermissionEnum.MANAGE_PRODUCTS.name,
PermissionEnum.MANAGE_USERS.name,
],
}
# for staff user
response = staff_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
data = content["data"]["serviceAccountUpdate"]
errors = data["accountErrors"]
assert not data["serviceAccount"]
assert len(errors) == 1
error = errors[0]
assert error["field"] == "permissions"
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_PERMISSION.name
# for superuser
response = superuser_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
data = content["data"]["serviceAccountUpdate"]
service_account_data = data["serviceAccount"]
tokens_data = service_account_data["tokens"]
app.refresh_from_db()
tokens = app.tokens.all()
assert service_account_data["isActive"] == app.is_active
assert app.is_active is False
assert len(tokens_data) == 1
assert tokens_data[0]["authToken"] == tokens.get().auth_token[-4:]
assert set(app.permissions.all()) == {
permission_manage_products,
permission_manage_users,
}
def test_service_account_update_mutation_for_service_account_out_of_scope_permissions(
permission_manage_service_accounts,
permission_manage_products,
permission_manage_orders,
permission_manage_users,
app_api_client,
):
app = App.objects.create(name="New_sa")
query = SERVICE_ACCOUNT_UPDATE_MUTATION
requestor = app_api_client.app
requestor.permissions.add(
permission_manage_service_accounts,
permission_manage_products,
permission_manage_orders,
)
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {
"id": id,
"is_active": False,
"permissions": [
PermissionEnum.MANAGE_PRODUCTS.name,
PermissionEnum.MANAGE_USERS.name,
],
}
response = app_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
data = content["data"]["serviceAccountUpdate"]
errors = data["accountErrors"]
assert not data["serviceAccount"]
assert len(errors) == 1
error = errors[0]
assert error["field"] == "permissions"
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_PERMISSION.name
def test_service_account_update_mutation_out_of_scope_service_account(
app,
permission_manage_service_accounts,
permission_manage_products,
permission_manage_orders,
permission_manage_users,
superuser_api_client,
staff_api_client,
staff_user,
):
    """Ensure a user cannot manage a service account with a wider permission scope.
    Ensure the superuser passes these restrictions.
"""
query = SERVICE_ACCOUNT_UPDATE_MUTATION
staff_user.user_permissions.add(
permission_manage_service_accounts,
permission_manage_products,
permission_manage_users,
)
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {
"id": id,
"is_active": False,
"permissions": [
PermissionEnum.MANAGE_PRODUCTS.name,
PermissionEnum.MANAGE_USERS.name,
],
}
# for staff user
response = staff_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
data = content["data"]["serviceAccountUpdate"]
errors = data["accountErrors"]
assert not data["serviceAccount"]
assert len(errors) == 1
error = errors[0]
assert error["field"] == "id"
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_SERVICE_ACCOUNT.name
# for superuser
response = superuser_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
data = content["data"]["serviceAccountUpdate"]
service_account_data = data["serviceAccount"]
tokens_data = service_account_data["tokens"]
app.refresh_from_db()
tokens = app.tokens.all()
assert service_account_data["isActive"] == app.is_active
assert app.is_active is False
assert len(tokens_data) == 1
assert tokens_data[0]["authToken"] == tokens.get().auth_token[-4:]
assert set(app.permissions.all()) == {
permission_manage_products,
permission_manage_users,
}
def test_service_account_update_mutation_for_service_account_out_of_scope_service_acc(
permission_manage_service_accounts,
permission_manage_products,
permission_manage_orders,
permission_manage_users,
app_api_client,
):
app = App.objects.create(name="New_sa")
query = SERVICE_ACCOUNT_UPDATE_MUTATION
requestor = app_api_client.app
requestor.permissions.add(
permission_manage_service_accounts,
permission_manage_products,
permission_manage_users,
)
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {
"id": id,
"is_active": False,
"permissions": [
PermissionEnum.MANAGE_PRODUCTS.name,
PermissionEnum.MANAGE_USERS.name,
],
}
response = app_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
data = content["data"]["serviceAccountUpdate"]
errors = data["accountErrors"]
assert not data["serviceAccount"]
assert len(errors) == 1
error = errors[0]
assert error["field"] == "id"
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_SERVICE_ACCOUNT.name
def test_service_account_update_no_permission(app, staff_api_client, staff_user):
query = SERVICE_ACCOUNT_UPDATE_MUTATION
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {
"id": id,
"is_active": False,
"permissions": [PermissionEnum.MANAGE_PRODUCTS.name],
}
response = staff_api_client.post_graphql(query, variables=variables)
assert_no_permission(response)
SERVICE_ACCOUNT_DELETE_MUTATION = """
mutation serviceAccountDelete($id: ID!){
serviceAccountDelete(id: $id){
accountErrors{
field
message
code
}
serviceAccount{
name
}
}
}
"""
def test_service_account_delete(
staff_api_client,
staff_user,
app,
permission_manage_orders,
permission_manage_service_accounts,
):
query = SERVICE_ACCOUNT_DELETE_MUTATION
app.permissions.add(permission_manage_orders)
staff_user.user_permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"id": id}
response = staff_api_client.post_graphql(
query, variables=variables, permissions=(permission_manage_service_accounts,)
)
content = get_graphql_content(response)
data = content["data"]["serviceAccountDelete"]
assert data["serviceAccount"]
assert not data["accountErrors"]
assert not App.objects.filter(id=app.id).exists()
def test_service_account_delete_for_app(
app_api_client, permission_manage_orders, permission_manage_service_accounts,
):
requestor = app_api_client.app
app = App.objects.create(name="New_sa")
query = SERVICE_ACCOUNT_DELETE_MUTATION
app.permissions.add(permission_manage_orders)
requestor.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"id": id}
response = app_api_client.post_graphql(
query, variables=variables, permissions=(permission_manage_service_accounts,)
)
content = get_graphql_content(response)
data = content["data"]["serviceAccountDelete"]
assert data["serviceAccount"]
assert not data["accountErrors"]
assert not App.objects.filter(id=app.id).exists()
def test_service_account_delete_out_of_scope_app(
staff_api_client,
superuser_api_client,
staff_user,
app,
permission_manage_service_accounts,
permission_manage_orders,
):
    """Ensure a user can't delete a service account with a wider scope of permissions.
    Ensure the superuser passes these restrictions.
"""
query = SERVICE_ACCOUNT_DELETE_MUTATION
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"id": id}
# for staff user
response = staff_api_client.post_graphql(
query, variables=variables, permissions=(permission_manage_service_accounts,)
)
content = get_graphql_content(response)
data = content["data"]["serviceAccountDelete"]
errors = data["accountErrors"]
assert not data["serviceAccount"]
assert len(errors) == 1
error = errors[0]
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_SERVICE_ACCOUNT.name
assert error["field"] == "id"
# for superuser
response = superuser_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
data = content["data"]["serviceAccountDelete"]
assert data["serviceAccount"]
assert not data["accountErrors"]
assert not App.objects.filter(id=app.id).exists()
def test_service_account_delete_for_service_account_out_of_scope_service_account(
app_api_client, permission_manage_orders, permission_manage_service_accounts,
):
app = App.objects.create(name="New_sa")
query = SERVICE_ACCOUNT_DELETE_MUTATION
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"id": id}
response = app_api_client.post_graphql(
query, variables=variables, permissions=(permission_manage_service_accounts,)
)
content = get_graphql_content(response)
data = content["data"]["serviceAccountDelete"]
errors = data["accountErrors"]
assert not data["serviceAccount"]
assert len(errors) == 1
error = errors[0]
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_SERVICE_ACCOUNT.name
assert error["field"] == "id"
QUERY_SERVICE_ACCOUNTS_WITH_FILTER = """
query ($filter: ServiceAccountFilterInput ){
serviceAccounts(first: 5, filter: $filter){
edges{
node{
id
isActive
permissions{
name
code
}
tokens{
authToken
}
name
}
}
}
}
"""
@pytest.mark.parametrize(
"service_account_filter, count",
(({"search": "Sample"}, 1), ({"isActive": False}, 1), ({}, 2)),
)
def test_service_accounts_query(
staff_api_client,
permission_manage_service_accounts,
app,
service_account_filter,
count,
):
second_app = App.objects.create(name="Simple service")
second_app.is_active = False
second_app.tokens.create(name="default")
second_app.save()
variables = {"filter": service_account_filter}
response = staff_api_client.post_graphql(
QUERY_SERVICE_ACCOUNTS_WITH_FILTER,
variables,
permissions=[permission_manage_service_accounts],
)
content = get_graphql_content(response)
service_accounts_data = content["data"]["serviceAccounts"]["edges"]
for service_account_data in service_accounts_data:
tokens = service_account_data["node"]["tokens"]
assert len(tokens) == 1
assert len(tokens[0]["authToken"]) == 4
assert len(service_accounts_data) == count
QUERY_SERVICE_ACCOUNTS_WITH_SORT = """
query ($sort_by: ServiceAccountSortingInput!) {
serviceAccounts(first:5, sortBy: $sort_by) {
edges{
node{
name
}
}
}
}
"""
@pytest.mark.parametrize(
"service_accounts_sort, result_order",
[
({"field": "NAME", "direction": "ASC"}, ["facebook", "google"]),
({"field": "NAME", "direction": "DESC"}, ["google", "facebook"]),
({"field": "CREATION_DATE", "direction": "ASC"}, ["google", "facebook"]),
({"field": "CREATION_DATE", "direction": "DESC"}, ["facebook", "google"]),
],
)
def test_query_service_accounts_with_sort(
service_accounts_sort,
result_order,
staff_api_client,
permission_manage_service_accounts,
):
with freeze_time("2018-05-31 12:00:01"):
App.objects.create(name="google", is_active=True)
with freeze_time("2019-05-31 12:00:01"):
App.objects.create(name="facebook", is_active=True)
variables = {"sort_by": service_accounts_sort}
staff_api_client.user.user_permissions.add(permission_manage_service_accounts)
response = staff_api_client.post_graphql(
QUERY_SERVICE_ACCOUNTS_WITH_SORT, variables
)
content = get_graphql_content(response)
service_accounts = content["data"]["serviceAccounts"]["edges"]
for order, account_name in enumerate(result_order):
assert service_accounts[order]["node"]["name"] == account_name
def test_service_accounts_query_no_permission(
staff_api_client, permission_manage_users, permission_manage_staff, app
):
variables = {"filter": {}}
response = staff_api_client.post_graphql(
QUERY_SERVICE_ACCOUNTS_WITH_FILTER, variables, permissions=[]
)
assert_no_permission(response)
response = staff_api_client.post_graphql(
QUERY_SERVICE_ACCOUNTS_WITH_FILTER,
variables,
permissions=[permission_manage_users, permission_manage_staff],
)
assert_no_permission(response)
QUERY_SERVICE_ACCOUNT = """
query ($id: ID! ){
serviceAccount(id: $id){
id
created
isActive
permissions{
code
name
}
tokens{
authToken
}
name
}
}
"""
def test_service_account_query(
staff_api_client, permission_manage_service_accounts, permission_manage_staff, app,
):
app.permissions.add(permission_manage_staff)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"id": id}
response = staff_api_client.post_graphql(
QUERY_SERVICE_ACCOUNT,
variables,
permissions=[permission_manage_service_accounts],
)
content = get_graphql_content(response)
tokens = app.tokens.all()
service_account_data = content["data"]["serviceAccount"]
tokens_data = service_account_data["tokens"]
assert tokens.count() == 1
assert tokens_data[0]["authToken"] == tokens.first().auth_token[-4:]
assert service_account_data["isActive"] == app.is_active
assert service_account_data["permissions"] == [
{"code": "MANAGE_STAFF", "name": "Manage staff."}
]
def test_service_account_query_no_permission(
staff_api_client, permission_manage_staff, permission_manage_users, app
):
app.permissions.add(permission_manage_staff)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"id": id}
response = staff_api_client.post_graphql(
QUERY_SERVICE_ACCOUNT, variables, permissions=[]
)
assert_no_permission(response)
response = staff_api_client.post_graphql(
QUERY_SERVICE_ACCOUNT,
variables,
permissions=[permission_manage_users, permission_manage_staff],
)
assert_no_permission(response)
def test_service_account_with_access_to_resources(
app_api_client, app, permission_manage_orders, order_with_lines,
):
query = """
query {
orders(first: 5) {
edges {
node {
id
}
}
}
}
"""
response = app_api_client.post_graphql(query)
assert_no_permission(response)
response = app_api_client.post_graphql(
query, permissions=[permission_manage_orders]
)
get_graphql_content(response)
SERVICE_ACCOUNT_TOKEN_CREATE_MUTATION = """
mutation serviceAccountTokenCreate($input: ServiceAccountTokenInput!) {
serviceAccountTokenCreate(input: $input){
authToken
serviceAccountToken{
name
authToken
id
}
accountErrors{
field
message
code
}
}
}
"""
def test_service_account_token_create(
permission_manage_service_accounts,
staff_api_client,
staff_user,
permission_manage_orders,
):
app = App.objects.create(name="New_sa")
query = SERVICE_ACCOUNT_TOKEN_CREATE_MUTATION
staff_user.user_permissions.add(permission_manage_orders)
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"name": "Default token", "serviceAccount": id}
response = staff_api_client.post_graphql(
query,
variables={"input": variables},
permissions=(permission_manage_service_accounts,),
)
content = get_graphql_content(response)
token_data = content["data"]["serviceAccountTokenCreate"]["serviceAccountToken"]
auth_token_data = content["data"]["serviceAccountTokenCreate"]["authToken"]
auth_token = app.tokens.get().auth_token
assert auth_token_data == auth_token
assert token_data["authToken"] == auth_token[-4:]
assert token_data["name"] == "Default token"
def test_service_account_token_create_for_app(
permission_manage_service_accounts, app_api_client, permission_manage_orders,
):
app = App.objects.create(name="New_sa")
query = SERVICE_ACCOUNT_TOKEN_CREATE_MUTATION
requestor = app_api_client.app
requestor.permissions.add(permission_manage_orders)
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"name": "Default token", "serviceAccount": id}
response = app_api_client.post_graphql(
query,
variables={"input": variables},
permissions=(permission_manage_service_accounts,),
)
content = get_graphql_content(response)
token_data = content["data"]["serviceAccountTokenCreate"]["serviceAccountToken"]
auth_token_data = content["data"]["serviceAccountTokenCreate"]["authToken"]
auth_token = app.tokens.get().auth_token
assert auth_token_data == auth_token
assert token_data["authToken"] == auth_token[-4:]
assert token_data["name"] == "Default token"
def test_service_account_token_create_out_of_scope_service_account(
permission_manage_service_accounts,
staff_api_client,
superuser_api_client,
staff_user,
permission_manage_orders,
):
    """Ensure a user can't create a token for a service account with a wider
    scope of permissions.
    Ensure the superuser passes these restrictions.
"""
app = App.objects.create(name="New_sa")
query = SERVICE_ACCOUNT_TOKEN_CREATE_MUTATION
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"name": "Default token", "serviceAccount": id}
# for staff user
response = staff_api_client.post_graphql(
query,
variables={"input": variables},
permissions=(permission_manage_service_accounts,),
)
content = get_graphql_content(response)
data = content["data"]["serviceAccountTokenCreate"]
errors = data["accountErrors"]
assert not data["serviceAccountToken"]
assert len(errors) == 1
error = errors[0]
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_SERVICE_ACCOUNT.name
assert error["field"] == "serviceAccount"
# for superuser
response = superuser_api_client.post_graphql(query, variables={"input": variables})
content = get_graphql_content(response)
token_data = content["data"]["serviceAccountTokenCreate"]["serviceAccountToken"]
auth_token_data = content["data"]["serviceAccountTokenCreate"]["authToken"]
auth_token = app.tokens.get().auth_token
assert auth_token_data == auth_token
assert token_data["authToken"] == auth_token[-4:]
assert token_data["name"] == "Default token"

def test_service_account_token_create_as_service_account_out_of_scope_service_account(
permission_manage_service_accounts, app_api_client, app, permission_manage_orders,
):
app = App.objects.create(name="New_sa")
query = SERVICE_ACCOUNT_TOKEN_CREATE_MUTATION
app.permissions.add(permission_manage_orders)
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"name": "Default token", "serviceAccount": id}
response = app_api_client.post_graphql(
query,
variables={"input": variables},
permissions=(permission_manage_service_accounts,),
)
content = get_graphql_content(response)
data = content["data"]["serviceAccountTokenCreate"]
errors = data["accountErrors"]
assert not data["serviceAccountToken"]
assert len(errors) == 1
error = errors[0]
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_SERVICE_ACCOUNT.name
assert error["field"] == "serviceAccount"

def test_service_account_token_create_no_permissions(staff_api_client, staff_user):
app = App.objects.create(name="New_sa")
query = SERVICE_ACCOUNT_TOKEN_CREATE_MUTATION
id = graphene.Node.to_global_id("ServiceAccount", app.id)
variables = {"name": "Default token", "serviceAccount": id}
response = staff_api_client.post_graphql(query, variables={"input": variables})
assert_no_permission(response)
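The creation tests above assert that, after the mutation response, the stored token is exposed only by its last four characters. A minimal standalone sketch of that truncation rule (illustrative only, not part of the Saleor code under test):

```python
def masked_auth_token(auth_token: str) -> str:
    """Return the truncated token representation asserted in the tests above."""
    # Only the tail of the token is ever exposed after creation time.
    return auth_token[-4:]

print(masked_auth_token("abcd1234efgh5678"))  # -> 5678
```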
SERVICE_ACCOUNT_TOKEN_DELETE_MUTATION = """
    mutation serviceAccountTokenDelete($id: ID!) {
        serviceAccountTokenDelete(id: $id) {
            accountErrors {
                field
                message
                code
            }
            serviceAccountToken {
                name
                authToken
            }
        }
    }
"""

def test_service_account_token_delete(
permission_manage_service_accounts,
permission_manage_products,
staff_api_client,
staff_user,
app,
):
query = SERVICE_ACCOUNT_TOKEN_DELETE_MUTATION
token = app.tokens.get()
staff_user.user_permissions.add(permission_manage_products)
app.permissions.add(permission_manage_products)
id = graphene.Node.to_global_id("ServiceAccountToken", token.id)
variables = {"id": id}
response = staff_api_client.post_graphql(
query, variables=variables, permissions=(permission_manage_service_accounts,)
)
get_graphql_content(response)
assert not AppToken.objects.filter(id=token.id).first()

def test_service_account_token_delete_for_app(
permission_manage_service_accounts, app_api_client, permission_manage_products,
):
app = App.objects.create(name="New_sa", is_active=True)
token = AppToken.objects.create(app=app)
    query = SERVICE_ACCOUNT_TOKEN_DELETE_MUTATION
requestor = app_api_client.app
requestor.permissions.add(permission_manage_products)
app.permissions.add(permission_manage_products)
id = graphene.Node.to_global_id("ServiceAccountToken", token.id)
variables = {"id": id}
response = app_api_client.post_graphql(
query, variables=variables, permissions=(permission_manage_service_accounts,)
)
get_graphql_content(response)
assert not AppToken.objects.filter(id=token.id).first()

def test_service_account_token_delete_no_permissions(staff_api_client, staff_user, app):
query = SERVICE_ACCOUNT_TOKEN_DELETE_MUTATION
token = app.tokens.get()
id = graphene.Node.to_global_id("ServiceAccountToken", token.id)
variables = {"id": id}
response = staff_api_client.post_graphql(query, variables=variables)
assert_no_permission(response)
    # Would raise AppToken.DoesNotExist if the token had been deleted.
    token.refresh_from_db()

def test_service_account_token_delete_out_of_scope_service_account(
permission_manage_service_accounts,
staff_api_client,
superuser_api_client,
staff_user,
app,
permission_manage_products,
):
    """Ensure a user can't delete a service account token with a wider scope
    of permissions.

    Ensure a superuser passes these restrictions.
    """
query = SERVICE_ACCOUNT_TOKEN_DELETE_MUTATION
token = app.tokens.get()
app.permissions.add(permission_manage_products)
id = graphene.Node.to_global_id("ServiceAccountToken", token.id)
variables = {"id": id}
# for staff user
response = staff_api_client.post_graphql(
query, variables=variables, permissions=(permission_manage_service_accounts,)
)
content = get_graphql_content(response)
data = content["data"]["serviceAccountTokenDelete"]
errors = data["accountErrors"]
assert not data["serviceAccountToken"]
assert len(errors) == 1
error = errors[0]
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_SERVICE_ACCOUNT.name
assert error["field"] == "id"
assert AppToken.objects.filter(id=token.id).exists()
# for superuser
response = superuser_api_client.post_graphql(query, variables=variables)
content = get_graphql_content(response)
data = content["data"]["serviceAccountTokenDelete"]
errors = data["accountErrors"]
assert data["serviceAccountToken"]
assert not errors
assert not AppToken.objects.filter(id=token.id).exists()

def test_service_account_token_delete_for_service_account_out_of_scope_app(
permission_manage_service_accounts, app_api_client, permission_manage_products,
):
app = App.objects.create(name="New_sa", is_active=True)
token = AppToken.objects.create(app=app)
query = SERVICE_ACCOUNT_TOKEN_DELETE_MUTATION
app.permissions.add(permission_manage_products)
id = graphene.Node.to_global_id("ServiceAccountToken", token.id)
variables = {"id": id}
response = app_api_client.post_graphql(
query, variables=variables, permissions=(permission_manage_service_accounts,)
)
content = get_graphql_content(response)
data = content["data"]["serviceAccountTokenDelete"]
errors = data["accountErrors"]
assert not data["serviceAccountToken"]
assert len(errors) == 1
error = errors[0]
assert error["code"] == AccountErrorCode.OUT_OF_SCOPE_SERVICE_ACCOUNT.name
assert error["field"] == "id"
assert AppToken.objects.filter(id=token.id).exists()
| 32.063867 | 88 | 0.696827 | 3,992 | 36,649 | 6.057114 | 0.041583 | 0.097932 | 0.049462 | 0.066667 | 0.902192 | 0.871919 | 0.850951 | 0.821175 | 0.794624 | 0.785029 | 0 | 0.00271 | 0.204562 | 36,649 | 1,142 | 89 | 32.091944 | 0.826736 | 0.023711 | 0 | 0.75873 | 0 | 0 | 0.181839 | 0.020796 | 0 | 0 | 0 | 0 | 0.130159 | 1 | 0.034921 | false | 0 | 0.008466 | 0.001058 | 0.044444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4c0dc7ebd2fdd94923b93030e1b08aec765754dc | 12,709 | py | Python | prez/services/spaceprez_service.py | surroundaustralia/Prez | 325043e5fa6088a027cbd2100a33494facc3febd | [
"BSD-3-Clause"
] | 2 | 2021-12-21T06:53:18.000Z | 2022-03-23T21:14:49.000Z | prez/services/spaceprez_service.py | surroundaustralia/Prez | 325043e5fa6088a027cbd2100a33494facc3febd | [
"BSD-3-Clause"
] | 20 | 2021-12-03T01:47:04.000Z | 2022-03-31T04:33:08.000Z | prez/services/spaceprez_service.py | surroundaustralia/Prez | 325043e5fa6088a027cbd2100a33494facc3febd | [
"BSD-3-Clause"
] | null | null | null | from typing import Optional
from rdflib.namespace import DCAT, DCTERMS, RDFS, SKOS, XSD
from config import *
from services.sparql_utils import *

async def count_datasets():
q = f"""
PREFIX dcat: <{DCAT}>
SELECT (COUNT(?d) as ?count)
WHERE {{
?d a dcat:Dataset .
}}
"""
r = await sparql_query(q, "SpacePrez")
if r[0]:
return r[1]
else:
raise Exception(f"SPARQL query error code {r[1]['code']}: {r[1]['message']}")
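The `(ok, payload)` unwrap-or-raise pattern above is repeated verbatim in every function in this module. A hypothetical helper (names are illustrative, not part of prez) could centralise it:

```python
def unwrap_sparql_result(result):
    """Return the payload of a (success, payload) tuple, or raise on failure."""
    ok, payload = result
    if ok:
        return payload
    # On failure the payload is assumed to carry 'code' and 'message' keys,
    # matching the error dicts raised throughout this module.
    raise Exception(
        f"SPARQL query error code {payload['code']}: {payload['message']}"
    )
```

Each function body could then end with `return unwrap_sparql_result(r)` instead of the four-line `if`/`else`.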
async def list_datasets(page: int, per_page: int):
q = f"""
PREFIX dcat: <{DCAT}>
PREFIX dcterms: <{DCTERMS}>
PREFIX rdfs: <{RDFS}>
PREFIX skos: <{SKOS}>
PREFIX xsd: <{XSD}>
SELECT DISTINCT ?d ?id ?label
WHERE {{
?d a dcat:Dataset ;
dcterms:identifier ?id ;
dcterms:title ?label .
OPTIONAL {{
?d dcterms:description ?desc .
}}
FILTER((lang(?label) = "" || lang(?label) = "en") && DATATYPE(?id) = xsd:token)
}} LIMIT {per_page} OFFSET {(page - 1) * per_page}
"""
r = await sparql_query(q, "SpacePrez")
if r[0]:
return r[1]
else:
raise Exception(f"SPARQL query error code {r[1]['code']}: {r[1]['message']}")
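The listing functions paginate with `LIMIT {per_page} OFFSET {(page - 1) * per_page}`, i.e. page numbers are 1-indexed. A small sketch of that arithmetic (the function name is illustrative, not part of prez):

```python
def page_window(page: int, per_page: int) -> tuple:
    """Return (limit, offset) for a 1-indexed page number."""
    # Page 1 starts at offset 0; page N skips the previous N-1 pages.
    return per_page, (page - 1) * per_page

print(page_window(3, 20))  # -> (20, 40)
```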
async def get_dataset_construct(
dataset_id: Optional[str] = None, dataset_uri: Optional[str] = None
):
if dataset_id is None and dataset_uri is None:
raise ValueError("Either an ID or a URI must be provided for a SPARQL query")
# when querying by ID via regular URL path
query_by_id = f"""
?d dcterms:identifier ?id ;
a dcat:Dataset .
FILTER (STR(?id) = "{dataset_id}")
"""
# when querying by URI via /object?uri=...
query_by_uri = f"""
BIND (<{dataset_uri}> as ?d)
?d a dcat:Dataset .
"""
q = f"""
PREFIX dcat: <{DCAT}>
PREFIX dcterms: <{DCTERMS}>
PREFIX rdfs: <{RDFS}>
PREFIX skos: <{SKOS}>
CONSTRUCT {{
?d ?p1 ?o1 .
{construct_all_prop_obj_info}
{construct_all_bnode_prop_obj_info}
}}
WHERE {{
{query_by_id if dataset_id is not None else query_by_uri}
?d ?p1 ?o1 .
{get_all_bnode_prop_obj_info}
{get_all_prop_obj_info}
}}
"""
r = await sparql_construct(q, "SpacePrez")
if r[0]:
return r[1]
else:
raise Exception(f"SPARQL query error code {r[1]['code']}: {r[1]['message']}")

async def count_collections(dataset_id: str):
q = f"""
PREFIX dcat: <{DCAT}>
PREFIX dcterms: <{DCTERMS}>
PREFIX geo: <{GEO}>
PREFIX rdfs: <{RDFS}>
PREFIX xsd: <{XSD}>
SELECT (COUNT(?coll) as ?count)
WHERE {{
?d dcterms:identifier ?d_id ;
a dcat:Dataset ;
rdfs:member ?coll .
FILTER (STR(?d_id) = "{dataset_id}" && DATATYPE(?d_id) = xsd:token)
?coll a geo:FeatureCollection .
}}
"""
r = await sparql_query(q, "SpacePrez")
if r[0]:
return r[1]
else:
raise Exception(f"SPARQL query error code {r[1]['code']}: {r[1]['message']}")

async def list_collections(dataset_id: str, page: int, per_page: int):
q = f"""
PREFIX dcat: <{DCAT}>
PREFIX dcterms: <{DCTERMS}>
PREFIX geo: <{GEO}>
PREFIX rdfs: <{RDFS}>
PREFIX skos: <{SKOS}>
PREFIX xsd: <{XSD}>
SELECT DISTINCT *
WHERE {{
?d dcterms:identifier ?d_id ;
a dcat:Dataset ;
dcterms:title ?d_label ;
rdfs:member ?coll .
FILTER(lang(?d_label) = "" || lang(?d_label) = "en")
FILTER (STR(?d_id) = "{dataset_id}" && DATATYPE(?d_id) = xsd:token)
?coll a geo:FeatureCollection ;
dcterms:identifier ?id ;
dcterms:title ?label .
OPTIONAL {{
?coll dcterms:description ?desc .
}}
FILTER(lang(?label) = "" || lang(?label) = "en")
FILTER(DATATYPE(?id) = xsd:token)
}} LIMIT {per_page} OFFSET {(page - 1) * per_page}
"""
r = await sparql_query(q, "SpacePrez")
if r[0]:
return r[1]
else:
raise Exception(f"SPARQL query error code {r[1]['code']}: {r[1]['message']}")

async def get_collection_construct_1(
dataset_id: Optional[str] = None,
collection_id: Optional[str] = None,
collection_uri: Optional[str] = None,
):
if collection_id is None and collection_uri is None:
raise ValueError("Either an ID or a URI must be provided for a SPARQL query")
# when querying by ID via regular URL path
query_by_id = f"""
FILTER (STR(?d_id) = "{dataset_id}")
?coll a geo:FeatureCollection ;
dcterms:identifier ?id .
?d a dcat:Dataset ;
rdfs:member ?fc ;
dcterms:identifier ?d_id .
FILTER (STR(?id) = "{collection_id}")
"""
# when querying by URI via /object?uri=...
query_by_uri = f"""
BIND (<{collection_uri}> as ?coll)
?coll a geo:FeatureCollection .
"""
q = f"""
PREFIX dcat: <{DCAT}>
PREFIX dcterms: <{DCTERMS}>
PREFIX geo: <{GEO}>
PREFIX rdfs: <{RDFS}>
PREFIX skos: <{SKOS}>
PREFIX xsd: <{XSD}>
CONSTRUCT {{
?coll ?p1 ?o1 .
{construct_all_prop_obj_info}
{construct_all_bnode_prop_obj_info}
?d a dcat:Dataset ;
dcterms:identifier ?d_id ;
dcterms:title ?d_label ;
rdfs:member ?coll .
}}
WHERE {{
{query_by_id if collection_id is not None else query_by_uri}
?coll ?p1 ?o1 .
FILTER(!STRENDS(STR(?p1), "member"))
?d a dcat:Dataset ;
rdfs:member ?fc ;
dcterms:identifier ?d_id ;
dcterms:title ?d_label .
FILTER(DATATYPE(?d_id) = xsd:token)
{get_all_bnode_prop_obj_info}
{get_all_prop_obj_info}
}}
"""
r = await sparql_construct(q, "SpacePrez")
if r[0]:
return r[1]
else:
raise Exception(f"SPARQL query error code {r[1]['code']}: {r[1]['message']}")

async def get_collection_construct_2(
dataset_id: Optional[str] = None,
collection_id: Optional[str] = None,
collection_uri: Optional[str] = None,
):
if collection_id is None and collection_uri is None:
raise ValueError("Either an ID or a URI must be provided for a SPARQL query")
# when querying by ID via regular URL path
query_by_id = f"""
FILTER (STR(?d_id) = "{dataset_id}")
?coll a geo:FeatureCollection ;
dcterms:identifier ?id .
?d a dcat:Dataset ;
dcterms:identifier ?d_id ;
rdfs:member ?fc .
FILTER (STR(?id) = "{collection_id}")
"""
# when querying by URI via /object?uri=...
query_by_uri = f"""
BIND (<{collection_uri}> as ?coll)
?coll a geo:FeatureCollection .
"""
q = f"""
PREFIX dcat: <{DCAT}>
PREFIX dcterms: <{DCTERMS}>
PREFIX geo: <{GEO}>
PREFIX rdfs: <{RDFS}>
PREFIX skos: <{SKOS}>
CONSTRUCT {{
?coll rdfs:member ?mem .
}}
WHERE {{
{query_by_id if collection_id is not None else query_by_uri}
?coll rdfs:member ?mem .
}} LIMIT 20
"""
r = await sparql_construct(q, "SpacePrez")
if r[0]:
return r[1]
else:
raise Exception(f"SPARQL query error code {r[1]['code']}: {r[1]['message']}")

async def count_features(dataset_id: str, collection_id: str):
q = f"""
PREFIX dcat: <{DCAT}>
PREFIX dcterms: <{DCTERMS}>
PREFIX geo: <{GEO}>
PREFIX rdfs: <{RDFS}>
PREFIX xsd: <{XSD}>
SELECT (COUNT(?f) as ?count)
WHERE {{
?d dcterms:identifier ?d_id ;
a dcat:Dataset ;
rdfs:member ?coll .
FILTER (STR(?d_id) = "{dataset_id}" && DATATYPE(?d_id) = xsd:token)
?coll dcterms:identifier ?coll_id ;
a geo:FeatureCollection ;
rdfs:member ?f .
FILTER (STR(?coll_id) = "{collection_id}" && DATATYPE(?coll_id) = xsd:token)
?f a geo:Feature .
}}
"""
r = await sparql_query(q, "SpacePrez")
if r[0]:
return r[1]
else:
raise Exception(f"SPARQL query error code {r[1]['code']}: {r[1]['message']}")

async def list_features(dataset_id: str, collection_id: str, page: int, per_page: int):
q = f"""
PREFIX dcat: <{DCAT}>
PREFIX dcterms: <{DCTERMS}>
PREFIX geo: <{GEO}>
PREFIX rdfs: <{RDFS}>
PREFIX skos: <{SKOS}>
PREFIX xsd: <{XSD}>
SELECT DISTINCT *
WHERE {{
?d dcterms:identifier ?d_id ;
a dcat:Dataset ;
dcterms:title ?d_label ;
rdfs:member ?coll .
FILTER(lang(?d_label) = "" || lang(?d_label) = "en")
FILTER (STR(?d_id) = "{dataset_id}" && DATATYPE(?d_id) = xsd:token)
?coll a geo:FeatureCollection ;
dcterms:identifier ?coll_id ;
dcterms:title ?coll_label ;
rdfs:member ?f .
FILTER(lang(?coll_label) = "" || lang(?coll_label) = "en")
FILTER (STR(?coll_id) = "{collection_id}" && DATATYPE(?coll_id) = xsd:token)
?f a geo:Feature ;
dcterms:identifier ?id .
FILTER(DATATYPE(?id) = xsd:token)
OPTIONAL {{
?f dcterms:description ?desc .
}}
OPTIONAL {{
?f dcterms:title ?label .
FILTER(lang(?label) = "" || lang(?label) = "en")
}}
}} LIMIT {per_page} OFFSET {(page - 1) * per_page}
"""
r = await sparql_query(q, "SpacePrez")
if r[0]:
return r[1]
else:
raise Exception(f"SPARQL query error code {r[1]['code']}: {r[1]['message']}")

async def get_feature_construct(
dataset_id: Optional[str] = None,
collection_id: Optional[str] = None,
feature_id: Optional[str] = None,
feature_uri: Optional[str] = None,
):
if feature_id is None and feature_uri is None:
raise ValueError("Either an ID or a URI must be provided for a SPARQL query")
# when querying by ID via regular URL path
query_by_id = f"""
FILTER (STR(?d_id) = "{dataset_id}")
?d rdfs:member ?coll .
?coll a geo:FeatureCollection ;
dcterms:identifier ?coll_id ;
rdfs:member ?f .
FILTER (STR(?coll_id) = "{collection_id}")
?f a geo:Feature ;
dcterms:identifier ?id .
FILTER (STR(?id) = "{feature_id}")
"""
# when querying by URI via /object?uri=...
query_by_uri = f"""
BIND (<{feature_uri}> as ?f)
?f a geo:Feature .
"""
q = f"""
PREFIX dcat: <{DCAT}>
PREFIX dcterms: <{DCTERMS}>
PREFIX geo: <{GEO}>
PREFIX rdfs: <{RDFS}>
PREFIX skos: <{SKOS}>
PREFIX xsd: <{XSD}>
CONSTRUCT {{
?f ?p1 ?o1 ;
dcterms:title ?title .
{construct_all_prop_obj_info}
{construct_all_bnode_prop_obj_info}
dcterms:title rdfs:label "Title" .
?coll a geo:FeatureCollection ;
dcterms:identifier ?coll_id ;
dcterms:title ?coll_label ;
rdfs:member ?f .
?d a dcat:Dataset ;
dcterms:identifier ?d_id ;
dcterms:title ?d_label .
}}
WHERE {{
{query_by_id if feature_id is not None else query_by_uri}
?coll rdfs:member ?f .
?f ?p1 ?o1 .
OPTIONAL {{
?f dcterms:title ?label .
}}
BIND(COALESCE(?label, CONCAT("Feature ", ?id)) AS ?title)
?coll a geo:FeatureCollection ;
dcterms:identifier ?coll_id ;
dcterms:title ?coll_label .
?d a dcat:Dataset ;
dcterms:identifier ?d_id ;
dcterms:title ?d_label .
{get_all_bnode_prop_obj_info}
{get_all_prop_obj_info}
}}
"""
r = await sparql_construct(q, "SpacePrez")
if r[0]:
return r[1]
else:
raise Exception(f"SPARQL query error code {r[1]['code']}: {r[1]['message']}")
| 31.614428 | 91 | 0.518688 | 1,507 | 12,709 | 4.230259 | 0.071002 | 0.009412 | 0.018824 | 0.018824 | 0.890196 | 0.852392 | 0.839686 | 0.814118 | 0.78698 | 0.764235 | 0 | 0.00726 | 0.349673 | 12,709 | 401 | 92 | 31.693267 | 0.764065 | 0.02573 | 0 | 0.852273 | 0 | 0.002841 | 0.760769 | 0.057625 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.011364 | 0 | 0.039773 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
4c2037f9e922a5e5e40c96d45161c779bad8d10c | 8,538 | py | Python | skidl/libs/NXP_sklib.py | arjenroodselaar/skidl | 0bf801bd3b74e6ef94bd9aa1b68eef756b568276 | [
"MIT"
] | 700 | 2016-08-16T21:12:50.000Z | 2021-10-10T02:15:18.000Z | skidl/libs/NXP_sklib.py | 0dvictor/skidl | 458709a10b28a864d25ae2c2b44c6103d4ddb291 | [
"MIT"
] | 118 | 2016-08-16T20:51:05.000Z | 2021-10-10T08:07:18.000Z | skidl/libs/NXP_sklib.py | 0dvictor/skidl | 458709a10b28a864d25ae2c2b44c6103d4ddb291 | [
"MIT"
] | 94 | 2016-08-25T14:02:28.000Z | 2021-09-12T05:17:08.000Z | from skidl import SKIDL, TEMPLATE, Part, Pin, SchLib
SKIDL_lib_version = '0.0.1'
NXP = SchLib(tool=SKIDL).add_parts(*[
Part(name='PCA9536D',dest=TEMPLATE,tool=SKIDL,keywords='i2c io port',description='4-bit I2C-bus and SMBus IO port, SOIC8 package',ref_prefix='U',num_units=1,fplist=['SOIC*3.9x4.9mm*Pitch1.27mm*'],do_erc=True,pins=[
Pin(num='1',name='IO0',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='IO1',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='IO2',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='IO3',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='VDD',func=Pin.PWRIN,do_erc=True)]),
Part(name='PCA9536DP',dest=TEMPLATE,tool=SKIDL,keywords='i2c io port',description='4-bit I2C-bus and SMBus IO port, TSSOP8 package',ref_prefix='U',num_units=1,fplist=['TSSOP*3x3mm*Pitch0.65mm*'],do_erc=True,pins=[
Pin(num='1',name='IO0',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='IO1',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='IO2',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='IO3',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='VDD',func=Pin.PWRIN,do_erc=True)]),
Part(name='PCA9544AD',dest=TEMPLATE,tool=SKIDL,keywords='i2c multiplexer',description='4-channel I2C-bus multiplexer with interrupt logic, SOIC20 package',ref_prefix='U',num_units=1,fplist=['SOIC*7.5x12.8mm*Pitch1.27mm*'],do_erc=True,pins=[
Pin(num='1',name='A0',do_erc=True),
Pin(num='2',name='A1',do_erc=True),
Pin(num='3',name='A2',do_erc=True),
Pin(num='4',name='~INT0',do_erc=True),
Pin(num='5',name='SD0',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='SC0',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='~INT1',do_erc=True),
Pin(num='8',name='SD1',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='SC1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='20',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='~INT2',do_erc=True),
Pin(num='12',name='SD2',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='SC2',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='~INT3',do_erc=True),
Pin(num='15',name='SD3',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='SC3',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='~INT',func=Pin.OPENCOLL,do_erc=True),
Pin(num='18',name='SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='SDA',func=Pin.BIDIR,do_erc=True)]),
Part(name='PCA9544APW',dest=TEMPLATE,tool=SKIDL,keywords='i2c multiplexer',description='4-channel I2C-bus multiplexer with interrupt logic, TSSOP20 package',ref_prefix='U',num_units=1,fplist=['TSSOP*4.4x6.5mm*Pitch0.65mm*'],do_erc=True,pins=[
Pin(num='1',name='A0',do_erc=True),
Pin(num='2',name='A1',do_erc=True),
Pin(num='3',name='A2',do_erc=True),
Pin(num='4',name='~INT0',do_erc=True),
Pin(num='5',name='SD0',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='SC0',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='~INT1',do_erc=True),
Pin(num='8',name='SD1',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='SC1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='20',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='~INT2',do_erc=True),
Pin(num='12',name='SD2',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='SC2',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='~INT3',do_erc=True),
Pin(num='15',name='SD3',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='SC3',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='~INT',func=Pin.OPENCOLL,do_erc=True),
Pin(num='18',name='SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='SDA',func=Pin.BIDIR,do_erc=True)]),
Part(name='PCA9685BS',dest=TEMPLATE,tool=SKIDL,keywords='PWM LED driver I2C QFN',description='16-channel 12-bit PWM Fm+ I2C-bus LED controller RGBA QFN',ref_prefix='U',num_units=1,fplist=['QFN*6x6mm*Pitch0.65mm*'],do_erc=True,pins=[
Pin(num='1',name='A0',do_erc=True),
Pin(num='2',name='A1',do_erc=True),
Pin(num='3',name='A2',do_erc=True),
Pin(num='4',name='A3',do_erc=True),
Pin(num='5',name='A4',do_erc=True),
Pin(num='6',name='OUT0',func=Pin.OPENCOLL,do_erc=True),
Pin(num='7',name='OUT1',func=Pin.OPENCOLL,do_erc=True),
Pin(num='8',name='OUT2',func=Pin.OPENCOLL,do_erc=True),
Pin(num='9',name='OUT3',func=Pin.OPENCOLL,do_erc=True),
Pin(num='10',name='OUT4',func=Pin.OPENCOLL,do_erc=True),
Pin(num='20',name='OUT13',func=Pin.OPENCOLL,do_erc=True),
Pin(num='11',name='OUT5',func=Pin.OPENCOLL,do_erc=True),
Pin(num='21',name='OUT14',func=Pin.OPENCOLL,do_erc=True),
Pin(num='12',name='OUT6',func=Pin.OPENCOLL,do_erc=True),
Pin(num='22',name='OUT15',func=Pin.OPENCOLL,do_erc=True),
Pin(num='13',name='OUT7',func=Pin.OPENCOLL,do_erc=True),
Pin(num='23',name='~OE~',do_erc=True),
Pin(num='14',name='GND',func=Pin.PWRIN,do_erc=True),
Pin(num='24',name='A5',do_erc=True),
Pin(num='15',name='OUT8',func=Pin.OPENCOLL,do_erc=True),
Pin(num='25',name='EXTCLK',do_erc=True),
Pin(num='16',name='OUT9',func=Pin.OPENCOLL,do_erc=True),
Pin(num='26',name='SCL',do_erc=True),
Pin(num='17',name='OUT10',func=Pin.OPENCOLL,do_erc=True),
Pin(num='27',name='SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='OUT11',func=Pin.OPENCOLL,do_erc=True),
Pin(num='28',name='VCC',func=Pin.PWRIN,do_erc=True),
Pin(num='19',name='OUT12',func=Pin.OPENCOLL,do_erc=True),
Pin(num='29',name='GND',func=Pin.PWRIN,do_erc=True)]),
Part(name='PCA9685PW',dest=TEMPLATE,tool=SKIDL,keywords='PWM LED driver I2C TSSOP',description='16-channel 12-bit PWM Fm+ I2C-bus LED controller RGBA TSSOP',ref_prefix='U',num_units=1,fplist=['TSSOP*4.4x9.7mm*Pitch0.65mm*'],do_erc=True,pins=[
Pin(num='1',name='A0',do_erc=True),
Pin(num='2',name='A1',do_erc=True),
Pin(num='3',name='A2',do_erc=True),
Pin(num='4',name='A3',do_erc=True),
Pin(num='5',name='A4',do_erc=True),
Pin(num='6',name='OUT0',func=Pin.OPENCOLL,do_erc=True),
Pin(num='7',name='OUT1',func=Pin.OPENCOLL,do_erc=True),
Pin(num='8',name='OUT2',func=Pin.OPENCOLL,do_erc=True),
Pin(num='9',name='OUT3',func=Pin.OPENCOLL,do_erc=True),
Pin(num='10',name='OUT4',func=Pin.OPENCOLL,do_erc=True),
Pin(num='20',name='OUT13',func=Pin.OPENCOLL,do_erc=True),
Pin(num='11',name='OUT5',func=Pin.OPENCOLL,do_erc=True),
Pin(num='21',name='OUT14',func=Pin.OPENCOLL,do_erc=True),
Pin(num='12',name='OUT6',func=Pin.OPENCOLL,do_erc=True),
Pin(num='22',name='OUT15',func=Pin.OPENCOLL,do_erc=True),
Pin(num='13',name='OUT7',func=Pin.OPENCOLL,do_erc=True),
Pin(num='23',name='~OE~',do_erc=True),
Pin(num='14',name='GND',func=Pin.PWRIN,do_erc=True),
Pin(num='24',name='A5',do_erc=True),
Pin(num='15',name='OUT8',func=Pin.OPENCOLL,do_erc=True),
Pin(num='25',name='EXTCLK',do_erc=True),
Pin(num='16',name='OUT9',func=Pin.OPENCOLL,do_erc=True),
Pin(num='26',name='SCL',do_erc=True),
Pin(num='17',name='OUT10',func=Pin.OPENCOLL,do_erc=True),
Pin(num='27',name='SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='OUT11',func=Pin.OPENCOLL,do_erc=True),
Pin(num='28',name='VCC',func=Pin.PWRIN,do_erc=True),
Pin(num='19',name='OUT12',func=Pin.OPENCOLL,do_erc=True)])])
| 68.304 | 250 | 0.591239 | 1,403 | 8,538 | 3.502495 | 0.109765 | 0.121083 | 0.217949 | 0.261294 | 0.953398 | 0.953398 | 0.953398 | 0.948311 | 0.939764 | 0.885633 | 0 | 0.053306 | 0.187046 | 8,538 | 124 | 251 | 68.854839 | 0.654661 | 0 | 0 | 0.901639 | 0 | 0 | 0.14371 | 0.018388 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.008197 | 0 | 0.008197 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
4c2b2187b8ae06ba9de06dc2e7e81e4a99a58507 | 2,619 | py | Python | app/selenium_ui/jira_ui.py | contibaadmin/dc-app-performance-toolkit | 0739730079960fb037ec64b264585f1fa0221b39 | [
"Apache-2.0"
] | null | null | null | app/selenium_ui/jira_ui.py | contibaadmin/dc-app-performance-toolkit | 0739730079960fb037ec64b264585f1fa0221b39 | [
"Apache-2.0"
] | null | null | null | app/selenium_ui/jira_ui.py | contibaadmin/dc-app-performance-toolkit | 0739730079960fb037ec64b264585f1fa0221b39 | [
"Apache-2.0"
] | null | null | null | from selenium_ui.jira import modules
from extension.jira import extension_ui # noqa F401
# this action should be the first one
def test_0_selenium_a_login(jira_webdriver, jira_datasets, jira_screen_shots):
modules.login(jira_webdriver, jira_datasets)
def test_1_selenium_browse_projects_list(jira_webdriver, jira_datasets, jira_screen_shots):
modules.browse_projects_list(jira_webdriver, jira_datasets)
def test_1_selenium_browse_boards_list(jira_webdriver, jira_datasets, jira_screen_shots):
modules.browse_boards_list(jira_webdriver, jira_datasets)
def test_1_selenium_create_issue(jira_webdriver, jira_datasets, jira_screen_shots):
modules.create_issue(jira_webdriver, jira_datasets)
def test_1_selenium_edit_issue(jira_webdriver, jira_datasets, jira_screen_shots):
modules.edit_issue(jira_webdriver, jira_datasets)
def test_1_selenium_save_comment(jira_webdriver, jira_datasets, jira_screen_shots):
modules.save_comment(jira_webdriver, jira_datasets)
def test_1_selenium_search_jql(jira_webdriver, jira_datasets, jira_screen_shots):
modules.search_jql(jira_webdriver, jira_datasets)
def test_1_selenium_view_backlog_for_scrum_board(jira_webdriver, jira_datasets, jira_screen_shots):
modules.view_backlog_for_scrum_board(jira_webdriver, jira_datasets)
def test_1_selenium_view_scrum_board(jira_webdriver, jira_datasets, jira_screen_shots):
modules.view_scrum_board(jira_webdriver, jira_datasets)
def test_1_selenium_view_kanban_board(jira_webdriver, jira_datasets, jira_screen_shots):
modules.view_kanban_board(jira_webdriver, jira_datasets)
def test_1_selenium_view_dashboard(jira_webdriver, jira_datasets, jira_screen_shots):
modules.view_dashboard(jira_webdriver, jira_datasets)
def test_1_selenium_view_issue(jira_webdriver, jira_datasets, jira_screen_shots):
modules.view_issue(jira_webdriver, jira_datasets)
def test_1_selenium_view_project_summary(jira_webdriver, jira_datasets, jira_screen_shots):
modules.view_project_summary(jira_webdriver, jira_datasets)
"""
Add custom actions anywhere between login and log out action. Move this to a different line as needed.
Write your custom selenium scripts in `app/extension/jira/extension_ui.py`.
Refer to `app/selenium_ui/jira/modules.py` for examples.
"""
def test_1_selenium_custom_action(jira_webdriver, jira_datasets, jira_screen_shots):
extension_ui.app_specific_action(jira_webdriver, jira_datasets)
# this action should be the last one
def test_2_selenium_z_log_out(jira_webdriver, jira_datasets, jira_screen_shots):
modules.log_out(jira_webdriver, jira_datasets)
| 37.414286 | 102 | 0.846124 | 386 | 2,619 | 5.256477 | 0.186529 | 0.192213 | 0.251355 | 0.36964 | 0.82553 | 0.781666 | 0.724988 | 0.659438 | 0.546575 | 0.22277 | 0 | 0.007569 | 0.09202 | 2,619 | 69 | 103 | 37.956522 | 0.845669 | 0.030546 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.46875 | false | 0 | 0.0625 | 0 | 0.53125 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
4c50c80dff95d7753803c0765ac101c623043852 | 4,251 | py | Python | pdb2pqr-1.9.0/contrib/ZSI-2.1-a1/test/tests_bad.py | Acpharis/protein_prep | 8cc2f0caedefd5a3fdaa764ed013c2660a4df1b8 | [
"BSD-3-Clause"
] | null | null | null | pdb2pqr-1.9.0/contrib/ZSI-2.1-a1/test/tests_bad.py | Acpharis/protein_prep | 8cc2f0caedefd5a3fdaa764ed013c2660a4df1b8 | [
"BSD-3-Clause"
] | null | null | null | pdb2pqr-1.9.0/contrib/ZSI-2.1-a1/test/tests_bad.py | Acpharis/protein_prep | 8cc2f0caedefd5a3fdaa764ed013c2660a4df1b8 | [
"BSD-3-Clause"
] | null | null | null | test01 = '''<SOAP-ENV:Envelope foo='bar'
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
<SOAP-ENV:Body>
<m:GetLastTradePrice xmlns:m="Some-URI">
<symbol>DIS</symbol>
</m:GetLastTradePrice>
</SOAP-ENV:Body>
</SOAP-ENV:Envelope>'''
test02 = '''<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
<SOAP-ENV:Header>
<t:Transaction xmlns:t="some-URI" SOAP-ENV:mustUnderstand="1">
5
</t:Transaction>
</SOAP-ENV:Header>
<SOAP-ENV:Body/>
</SOAP-ENV:Envelope>'''
test03 = '''<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
<SOAP-ENV:Body>
<SOAP-ENV:Fault>
<faultcode>SOAP-ENV:MustUnderstand</faultcode>
<faultstring>SOAP Must Understand Error</faultstring>
<?MYPI spenser?>
</SOAP-ENV:Fault>
</SOAP-ENV:Body>
</SOAP-ENV:Envelope>'''
test04 = '''<SOAP-ENV:Envelope fooattr='bar'
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
<SOAP-ENV:Body>
<SOAP-ENV:Fault>
<faultcode>SOAP-ENV:Server</faultcode>
<faultstring>Server Error</faultstring>
<detail>
<e:myfaultdetails xmlns:e="Some-URI">
<message>
My application didn't work
</message>
<errorcode>
1001
</errorcode>
</e:myfaultdetails>
</detail>
</SOAP-ENV:Fault>
</SOAP-ENV:Body>
</SOAP-ENV:Envelope>'''
test05 = '''<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
<SOAP-ENV:Body></SOAP-ENV:Body>
<SOAP-ENV:Body></SOAP-ENV:Body>
</SOAP-ENV:Envelope>'''
test06 = '''<SOAP-ENV:ChemicalX
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
<SOAP-ENV:Body></SOAP-ENV:Body>
<SOAP-ENV:Body></SOAP-ENV:Body>
</SOAP-ENV:ChemicalX>'''
test07 = '''<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
<SOAP-ENV:Body></SOAP-ENV:Body>
<SOAP-ENV:Header></SOAP-ENV:Header>
</SOAP-ENV:Envelope>'''
test08 = '''<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
<SOAP-ENV:zBody></SOAP-ENV:zBody>
</SOAP-ENV:Envelope>'''
test09 = '''<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
<SOAP-ENV:Header></SOAP-ENV:Header>
<SOAP-ENV:Header></SOAP-ENV:Header>
<SOAP-ENV:Body></SOAP-ENV:Body>
</SOAP-ENV:Envelope>'''
test10 = '''<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
<SOAP-ENV:Header></SOAP-ENV:Header>
<SOAP-ENV:Body></SOAP-ENV:Body>
<SOAP-ENV:Header></SOAP-ENV:Header>
</SOAP-ENV:Envelope>'''
test11 = '''<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
<SOAP-ENV:Header></SOAP-ENV:Header>
<SOAP-ENV:Body></SOAP-ENV:Body>
<m:data xmlns:m="data-URI">
<symbol>DEF</symbol>
</m:data>
</SOAP-ENV:Envelope>'''
test12 = '''<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
<SOAP-ENV:Header></SOAP-ENV:Header>
<SOAP-ENV:Body></SOAP-ENV:Body>
<m:data xmlns:m="data-URI">
<?PIE?>
<symbol>DEF</symbol>
</m:data>
</SOAP-ENV:Envelope>'''
test13 = '''<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
<?xoo?>
<SOAP-ENV:Header></SOAP-ENV:Header>
<SOAP-ENV:Body></SOAP-ENV:Body>
<m:data xmlns:m="data-URI">
<symbol>DEF</symbol>
</m:data>
</SOAP-ENV:Envelope>'''
| 33.472441 | 69 | 0.639849 | 556 | 4,251 | 4.892086 | 0.115108 | 0.265074 | 0.109191 | 0.177574 | 0.80625 | 0.797426 | 0.784926 | 0.784926 | 0.760662 | 0.726838 | 0 | 0.008859 | 0.150318 | 4,251 | 126 | 70 | 33.738095 | 0.744186 | 0 | 0 | 0.637168 | 0 | 0 | 0.948235 | 0.183059 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
4c5ef9ada11c481888f1e1018c79eb3aaf1a9de3 | 4,011 | py | Python | docs/example/example_database_setup.py | kitjosh1050/DataCompare | 2a410c69ef5a97b000a4eef254a41f44f2c53c4b | [
"MIT"
] | 3 | 2016-04-20T10:13:24.000Z | 2019-07-31T22:36:08.000Z | docs/example/example_database_setup.py | kitjosh1050/DataCompare | 2a410c69ef5a97b000a4eef254a41f44f2c53c4b | [
"MIT"
] | 1 | 2016-03-02T21:59:12.000Z | 2016-03-03T14:32:49.000Z | docs/example/example_database_setup.py | kitjosh1050/DataCompare | 2a410c69ef5a97b000a4eef254a41f44f2c53c4b | [
"MIT"
] | 1 | 2016-04-09T17:09:07.000Z | 2016-04-09T17:09:07.000Z | import pyodbc
connection_string = "Driver=SQLite3 ODBC Driver;Database=sqlite.db"
sql_connection = pyodbc.connect(connection_string)
try:
with sql_connection.cursor() as sql_cursor:
sql_cursor.execute(
"CREATE TABLE sales_new(salesdate DATETIME, product VARCHAR(255), sales_quantity INTEGER, sales_amount FLOAT)")
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-01', 'Bicycle', 10,
round(10 * 250.41, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-02', 'Bicycle', 5,
round(5 * 250.41, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-03', 'Bicycle', 2,
round(2 * 250.41, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-05', 'Bicycle', 7,
round(7 * 250.41, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-01', 'Jersey', 8, round(8 * 52.4, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-02', 'Jersey', 4, round(4 * 52.4, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-03', 'Jersey', 3, round(3 * 52.4, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-05', 'Jersey', 1, round(1 * 52.4, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-01', 'Helmet', 10, round(10 * 27, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-02', 'Helmet', None, None)
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-03', 'Helmet', 5, round(5 * 27, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-04-05', 'Helmet', 4, round(4 * 27, 2))
sql_cursor.execute("INSERT INTO sales_new VALUES (?,?,?,?)", '2015-05-05', 'Helmet', 4, 1)
sql_cursor.execute(
"CREATE TABLE sales_old(salesdate DATETIME, product VARCHAR(255), sales_quantity INTEGER, sales_amount FLOAT)")
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-01', 'Bicycle', 10,
round(10 * 240.80, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-02', 'Bicycle', 5,
round(5 * 250.41, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-03', 'Bicycle', 2,
round(2 * 250.41, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-04', 'Bicycle', 4,
round(4 * 250.41, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-05', 'Bicycle', 7,
round(7 * 250.41, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-01', 'Jersey', 8, round(8 * 52.4, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-02', 'Jersey', 4, round(4 * 52.4, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-03', 'Jersey', 3, round(3 * 52.4, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-04', 'Jersey', 2, round(2 * 52.4, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-05', 'Jersey', 1, round(1 * 52.4, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-01', 'Helmet', 10, round(10 * 27, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-02', 'Helmet', 1, round(1 * 27, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-03', 'Helmet', 5, round(5 * 27, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-04', 'Helmet', 8, round(8 * 27, 2))
sql_cursor.execute("INSERT INTO sales_old VALUES (?,?,?,?)", '2015-04-05', 'Helmet', 4, round(4 * 27, 2))
    sql_connection.commit()  # pyodbc connections default to autocommit=False; persist the inserts
finally:
    sql_connection.close()
| 77.134615 | 123 | 0.569933 | 553 | 4,011 | 4.007233 | 0.110307 | 0.125903 | 0.216607 | 0.277978 | 0.887635 | 0.88222 | 0.853339 | 0.853339 | 0.838899 | 0.838899 | 0 | 0.130282 | 0.221142 | 4,011 | 51 | 124 | 78.647059 | 0.579065 | 0 | 0 | 0.166667 | 0 | 0 | 0.444278 | 0.006233 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.020833 | 0 | 0.020833 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
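The setup script above drives SQLite through an ODBC bridge with one `execute` call per row. As an illustration only (not the repo's code), the same parameterized-insert pattern can be sketched with the standard-library `sqlite3` module, using `executemany` and an in-memory database with a couple of representative rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
try:
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE sales_new(salesdate TEXT, product TEXT, "
        "sales_quantity INTEGER, sales_amount REAL)"
    )
    rows = [
        ("2015-04-01", "Bicycle", 10, round(10 * 250.41, 2)),
        ("2015-04-02", "Helmet", None, None),  # missing data maps to NULL
    ]
    # One prepared statement, many parameter tuples
    cur.executemany("INSERT INTO sales_new VALUES (?,?,?,?)", rows)
    conn.commit()  # sqlite3 also defers writes until commit
    total = cur.execute("SELECT COUNT(*) FROM sales_new").fetchone()[0]
finally:
    conn.close()
```

Using `executemany` with a list of tuples keeps the data separate from the SQL and avoids repeating the statement string for every row.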
d5cc127cb9cab7014b9d5697242b1c512313e8ce | 224 | bzl | Python | debian/patchelf.bzl | Ewpratten/frc_971_mirror | 3a8a0c4359f284d29547962c2b4c43d290d8065c | [
"BSD-2-Clause"
] | 39 | 2021-06-18T03:22:30.000Z | 2022-03-21T15:23:43.000Z | debian/patchelf.bzl | Ewpratten/frc_971_mirror | 3a8a0c4359f284d29547962c2b4c43d290d8065c | [
"BSD-2-Clause"
] | 10 | 2021-06-18T03:22:19.000Z | 2022-03-18T22:14:15.000Z | debian/patchelf.bzl | Ewpratten/frc_971_mirror | 3a8a0c4359f284d29547962c2b4c43d290d8065c | [
"BSD-2-Clause"
] | 4 | 2021-08-19T19:20:04.000Z | 2022-03-08T07:33:18.000Z | files = {
"libstdc++6_4.9.2-10+deb8u1_amd64.deb": "a8f4ef6773b90bb39a8a8a0a5e3e20ca8501de6896204f665eb114d5b79f164f",
"patchelf_0.8-2_amd64.deb": "5d506507df7c02766ae6c3ca0d15b4234f4cb79a80799190ded9d3ca0ac28c0c",
}
| 44.8 | 111 | 0.816964 | 18 | 224 | 9.944444 | 0.833333 | 0.089385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.451923 | 0.071429 | 224 | 4 | 112 | 56 | 0.408654 | 0 | 0 | 0 | 0 | 0 | 0.839286 | 0.839286 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
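The `files` dict above pairs each Debian package filename with its expected SHA-256 digest; a build rule consuming it would typically verify each download against that digest. A minimal sketch of such a check in Python (the temp file content here is synthetic, not a real .deb):

```python
import hashlib
import os
import tempfile

def sha256_of(path):
    """Stream a file in chunks and return its hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demonstrate with a throwaway file standing in for a downloaded package.
with tempfile.NamedTemporaryFile(delete=False, suffix=".deb") as tmp:
    tmp.write(b"not a real package")
    tmp_path = tmp.name
digest = sha256_of(tmp_path)
os.unlink(tmp_path)
```

In a real rule the computed digest would be compared to the entry in `files`, and a mismatch would abort the fetch.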
d5daf30db869922cf1b49f431ad0ee176ca1add6 | 3,230 | py | Python | vobject/__init__.py | karalan/google-tasks-porter | 58754f4ee5d478a780bc316bbeea5a5a82e1f6f7 | [
"Apache-2.0"
] | 3 | 2015-12-25T14:45:36.000Z | 2016-11-28T09:58:03.000Z | vobject/__init__.py | karalan/google-tasks-porter | 58754f4ee5d478a780bc316bbeea5a5a82e1f6f7 | [
"Apache-2.0"
] | 4 | 2021-03-19T15:38:56.000Z | 2021-09-08T02:47:16.000Z | vendor-local/lib/python/vobject/__init__.py | Acidburn0zzz/airmozilla | 7b03af6d6efe9af00a6070f5327e10fb755c3766 | [
"BSD-3-Clause"
] | 1 | 2019-11-02T23:29:13.000Z | 2019-11-02T23:29:13.000Z | """
VObject Overview
================
vobject parses vCard or vCalendar files, returning a tree of Python objects.
It also provides an API to create vCard or vCalendar data structures which
can then be serialized.
Parsing existing streams
------------------------
Streams containing one or many L{Component<base.Component>}s can be
parsed using L{readComponents<base.readComponents>}. As each Component
is parsed, vobject will attempt to give it a L{Behavior<behavior.Behavior>}.
If an appropriate Behavior is found, any base64, quoted-printable, or
backslash escaped data will automatically be decoded. Dates and datetimes
will be transformed to datetime.date or datetime.datetime instances.
Components containing recurrence information will have a special rruleset
attribute (a dateutil.rrule.rruleset instance).
Validation
----------
L{Behavior<behavior.Behavior>} classes implement validation for
L{Component<base.Component>}s. To validate, an object must have all
required children. There (TODO: will be) a toggle to raise an exception or
just log unrecognized, non-experimental children and parameters.
Creating objects programmatically
---------------------------------
A L{Component<base.Component>} can be created from scratch. No encoding
is necessary, serialization will encode data automatically. Factory
functions (TODO: will be) available to create standard objects.
Serializing objects
-------------------
Serialization:
- Looks for missing required children that can be automatically generated,
like a UID or a PRODID, and adds them
- Encodes all values that can be automatically encoded
- Checks to make sure the object is valid (unless this behavior is
explicitly disabled)
- Appends the serialized object to a buffer, or fills a new
buffer and returns it
Examples
--------
>>> import datetime
>>> import dateutil.rrule as rrule
>>> x = iCalendar()
>>> x.add('vevent')
<VEVENT| []>
>>> x
<VCALENDAR| [<VEVENT| []>]>
>>> v = x.vevent
>>> utc = icalendar.utc
>>> v.add('dtstart').value = datetime.datetime(2004, 12, 15, 14, tzinfo = utc)
>>> v
<VEVENT| [<DTSTART{}2004-12-15 14:00:00+00:00>]>
>>> x
<VCALENDAR| [<VEVENT| [<DTSTART{}2004-12-15 14:00:00+00:00>]>]>
>>> newrule = rrule.rruleset()
>>> newrule.rrule(rrule.rrule(rrule.WEEKLY, count=2, dtstart=v.dtstart.value))
>>> v.rruleset = newrule
>>> list(v.rruleset)
[datetime.datetime(2004, 12, 15, 14, 0, tzinfo=tzutc()), datetime.datetime(2004, 12, 22, 14, 0, tzinfo=tzutc())]
>>> v.add('uid').value = "randomuid@MYHOSTNAME"
>>> print x.serialize()
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//PYVOBJECT//NONSGML Version 1//EN
BEGIN:VEVENT
UID:randomuid@MYHOSTNAME
DTSTART:20041215T140000Z
RRULE:FREQ=WEEKLY;COUNT=2
END:VEVENT
END:VCALENDAR
"""
import base, icalendar, vcard
from base import readComponents, readOne, newFromBehavior
def iCalendar():
return newFromBehavior('vcalendar', '2.0')
def vCard():
return newFromBehavior('vcard', '3.0') | 37.55814 | 116 | 0.660991 | 404 | 3,230 | 5.284653 | 0.443069 | 0.011241 | 0.014988 | 0.018735 | 0.075878 | 0.053396 | 0.02904 | 0.02904 | 0.02904 | 0.02904 | 0 | 0.036357 | 0.20805 | 3,230 | 86 | 117 | 37.55814 | 0.79828 | 0.93065 | 0 | 0 | 0 | 0 | 0.09434 | 0 | 0 | 0 | 0 | 0.023256 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
d5ee188295fb3d51444e113d8d10d953415f8f23 | 256 | py | Python | dcosdev/oper/__init__.py | deluxor/dcosdev | 83c11920ecc2145fdf52b12743d6636c5a652d0f | [
"Apache-2.0"
] | null | null | null | dcosdev/oper/__init__.py | deluxor/dcosdev | 83c11920ecc2145fdf52b12743d6636c5a652d0f | [
"Apache-2.0"
] | null | null | null | dcosdev/oper/__init__.py | deluxor/dcosdev | 83c11920ecc2145fdf52b12743d6636c5a652d0f | [
"Apache-2.0"
] | 3 | 2020-01-08T17:03:52.000Z | 2020-07-31T11:05:01.000Z | import sys
sys.dont_write_bytecode = True
from . import svc, package, mjm, config, resource, main_java, build_gradle, settings_gradle, tests
__all__ = ['svc', 'package', 'mjm', 'config', 'resource', 'main_java', 'build_gradle', 'settings_gradle', 'tests']
| 36.571429 | 114 | 0.734375 | 34 | 256 | 5.176471 | 0.558824 | 0.113636 | 0.147727 | 0.215909 | 0.738636 | 0.738636 | 0.738636 | 0.738636 | 0.738636 | 0.738636 | 0 | 0 | 0.109375 | 256 | 6 | 115 | 42.666667 | 0.77193 | 0 | 0 | 0 | 0 | 0 | 0.265625 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
d5f1ae69ae2295f3ad0a3a6e333961f8c049a674 | 236,091 | py | Python | python_modules/dagster/dagster_tests/core_tests/snap_tests/snapshots/snap_test_pipeline_snap.py | kstennettlull/dagster | dd6f57e170ff03bf145f1dd1417e0b2c3156b1d6 | [
"Apache-2.0"
] | null | null | null | python_modules/dagster/dagster_tests/core_tests/snap_tests/snapshots/snap_test_pipeline_snap.py | kstennettlull/dagster | dd6f57e170ff03bf145f1dd1417e0b2c3156b1d6 | [
"Apache-2.0"
] | null | null | null | python_modules/dagster/dagster_tests/core_tests/snap_tests/snapshots/snap_test_pipeline_snap.py | kstennettlull/dagster | dd6f57e170ff03bf145f1dd1417e0b2c3156b1d6 | [
"Apache-2.0"
] | 1 | 2019-09-11T03:02:27.000Z | 2019-09-11T03:02:27.000Z | # -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['test_basic_dep_fan_out 1'] = '''{
"__class__": "PipelineSnapshot",
"config_schema_snapshot": {
"__class__": "ConfigSchemaSnapshot",
"all_config_snaps_by_key": {
"Any": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": "Any",
"key": "Any",
"kind": {
"__enum__": "ConfigTypeKind.ANY"
},
"scalar_kind": null,
"type_param_keys": null
},
"Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
]
},
"Array.String": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.String",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.String",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"String"
]
},
"Bool": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Bool",
"key": "Bool",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.BOOL"
},
"type_param_keys": null
},
"Float": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Float",
"key": "Float",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.FLOAT"
},
"type_param_keys": null
},
"Int": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Int",
"key": "Int",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.INT"
},
"type_param_keys": null
},
"ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Bool",
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59"
]
},
"ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Float",
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3"
]
},
"ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Int",
"Selector.a9799b971d12ace70a2d8803c883c863417d0725"
]
},
"ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"String",
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269"
]
},
"Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "forkserver",
"type_key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "spawn",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "disabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "enabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Int"
}
],
"given_name": null,
"key": "Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Bool"
}
],
"given_name": null,
"key": "Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Float"
}
],
"given_name": null,
"key": "Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "String"
}
],
"given_name": null,
"key": "Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e52fa3afbe531d9522fae1206f3ae9d248775742": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
}
],
"given_name": null,
"key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Any"
}
],
"given_name": null,
"key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "in_process",
"type_key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "multiprocess",
"type_key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7"
}
],
"given_name": null,
"key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.0bb49540f1708dcf5378009c9571eba999502e19": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "io_manager",
"type_key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7"
}
],
"given_name": null,
"key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4"
}
],
"given_name": null,
"key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.241ac489ffa5f718db6444bae7849fb86a62e441": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"INFO\\"",
"description": null,
"is_required": false,
"name": "log_level",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"dagster\\"",
"description": null,
"is_required": false,
"name": "name",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.3baab16166bacfaf4705811e64d356112fd733cb": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
}
],
"given_name": null,
"key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "result",
"type_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742"
}
],
"given_name": null,
"key": "Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.4277013c8c05368bc2b9a69c9b3d0ba9a592f831": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"in_process\\": {}}",
"description": null,
"is_required": false,
"name": "execution",
"type_key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "loggers",
"type_key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"io_manager\\": {}}",
"description": null,
"is_required": false,
"name": "resources",
"type_key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"passone\\": {}, \\"passtwo\\": {}, \\"return_one\\": {}}",
"description": null,
"is_required": false,
"name": "solids",
"type_key": "Shape.efd6e48220d7eb65a0b9e8814dd15fa00be63496"
}
],
"given_name": null,
"key": "Shape.4277013c8c05368bc2b9a69c9b3d0ba9a592f831",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Explicit modules to preload in the forkserver.",
"is_required": false,
"name": "preload_modules",
"type_key": "Array.String"
}
],
"given_name": null,
"key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "path",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.69ff9be621991cc7961ea5e667d43edaac9d2339": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "outputs",
"type_key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
}
],
"given_name": null,
"key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.743e47901855cb245064dd633e217bfcb49a11a7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
}
],
"given_name": null,
"key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "marker_to_close",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
}
],
"given_name": null,
"key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c"
}
],
"given_name": null,
"key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [],
"given_name": null,
"key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "0",
"description": null,
"is_required": false,
"name": "max_concurrent",
"type_key": "Int"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Select how subprocesses are created. Defaults to spawn.\\nWhen forkserver is selected, set_forkserver_preload will be called with either:\\n* the preload_modules list if provided by config\\n* the module containing the Job if it was loaded from a module\\n* dagster\\nhttps://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods",
"is_required": false,
"name": "start_method",
"type_key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47"
}
],
"given_name": null,
"key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "console",
"type_key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb"
}
],
"given_name": null,
"key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.efd6e48220d7eb65a0b9e8814dd15fa00be63496": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "passone",
"type_key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "passtwo",
"type_key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "return_one",
"type_key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339"
}
],
"given_name": null,
"key": "Shape.efd6e48220d7eb65a0b9e8814dd15fa00be63496",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"String": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "String",
"key": "String",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.STRING"
},
"type_param_keys": null
}
}
},
"dagster_type_namespace_snapshot": {
"__class__": "DagsterTypeNamespaceSnapshot",
"all_dagster_type_snaps_by_key": {
"Any": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Any",
"is_builtin": true,
"key": "Any",
"kind": {
"__enum__": "DagsterTypeKind.ANY"
},
"loader_schema_key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Any",
"type_param_keys": []
},
"Bool": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Bool",
"is_builtin": true,
"key": "Bool",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Bool",
"type_param_keys": []
},
"Float": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Float",
"is_builtin": true,
"key": "Float",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Float",
"type_param_keys": []
},
"Int": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Int",
"is_builtin": true,
"key": "Int",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Int",
"type_param_keys": []
},
"Nothing": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Nothing",
"is_builtin": true,
"key": "Nothing",
"kind": {
"__enum__": "DagsterTypeKind.NOTHING"
},
"loader_schema_key": null,
"materializer_schema_key": null,
"name": "Nothing",
"type_param_keys": []
},
"String": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "String",
"is_builtin": true,
"key": "String",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "String",
"type_param_keys": []
}
}
},
"dep_structure_snapshot": {
"__class__": "DependencyStructureSnapshot",
"solid_invocation_snaps": [
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [
{
"__class__": "InputDependencySnap",
"input_name": "value",
"is_dynamic_collect": false,
"upstream_output_snaps": [
{
"__class__": "OutputHandleSnap",
"output_name": "result",
"solid_name": "return_one"
}
]
}
],
"is_dynamic_mapped": false,
"solid_def_name": "passthrough",
"solid_name": "passone",
"tags": {}
},
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [
{
"__class__": "InputDependencySnap",
"input_name": "value",
"is_dynamic_collect": false,
"upstream_output_snaps": [
{
"__class__": "OutputHandleSnap",
"output_name": "result",
"solid_name": "return_one"
}
]
}
],
"is_dynamic_mapped": false,
"solid_def_name": "passthrough",
"solid_name": "passtwo",
"tags": {}
},
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [],
"is_dynamic_mapped": false,
"solid_def_name": "return_one",
"solid_name": "return_one",
"tags": {}
}
]
},
"description": null,
"graph_def_name": "single_dep_pipeline",
"lineage_snapshot": null,
"mode_def_snaps": [
{
"__class__": "ModeDefSnap",
"description": null,
"logger_def_snaps": [
{
"__class__": "LoggerDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
},
"description": "The default colored console logger.",
"name": "console"
}
],
"name": "default",
"resource_def_snaps": [
{
"__class__": "ResourceDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"name": "io_manager"
}
],
"root_config_key": "Shape.4277013c8c05368bc2b9a69c9b3d0ba9a592f831"
}
],
"name": "single_dep_pipeline",
"solid_definitions_snapshot": {
"__class__": "SolidDefinitionsSnapshot",
"composite_solid_def_snaps": [],
"solid_def_snaps": [
{
"__class__": "SolidDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"input_def_snaps": [
{
"__class__": "InputDefSnap",
"dagster_type_key": "Int",
"description": null,
"name": "value"
}
],
"name": "passthrough",
"output_def_snaps": [
{
"__class__": "OutputDefSnap",
"dagster_type_key": "Any",
"description": null,
"is_dynamic": false,
"is_required": true,
"name": "result"
}
],
"required_resource_keys": [],
"tags": {}
},
{
"__class__": "SolidDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"input_def_snaps": [],
"name": "return_one",
"output_def_snaps": [
{
"__class__": "OutputDefSnap",
"dagster_type_key": "Any",
"description": null,
"is_dynamic": false,
"is_required": true,
"name": "result"
}
],
"required_resource_keys": [],
"tags": {}
}
]
},
"tags": {}
}'''
snapshots['test_basic_dep_fan_out 2'] = '54968239118e5e3237a729ca821a0454a553e1e8'
snapshots['test_basic_fan_in 1'] = '''{
"__class__": "PipelineSnapshot",
"config_schema_snapshot": {
"__class__": "ConfigSchemaSnapshot",
"all_config_snaps_by_key": {
"Any": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": "Any",
"key": "Any",
"kind": {
"__enum__": "ConfigTypeKind.ANY"
},
"scalar_kind": null,
"type_param_keys": null
},
"Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
]
},
"Array.String": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.String",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.String",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"String"
]
},
"Bool": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Bool",
"key": "Bool",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.BOOL"
},
"type_param_keys": null
},
"Float": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Float",
"key": "Float",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.FLOAT"
},
"type_param_keys": null
},
"Int": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Int",
"key": "Int",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.INT"
},
"type_param_keys": null
},
"ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Bool",
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59"
]
},
"ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Float",
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3"
]
},
"ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Int",
"Selector.a9799b971d12ace70a2d8803c883c863417d0725"
]
},
"ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"String",
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269"
]
},
"Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "forkserver",
"type_key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "spawn",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "disabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "enabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Int"
}
],
"given_name": null,
"key": "Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Bool"
}
],
"given_name": null,
"key": "Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Float"
}
],
"given_name": null,
"key": "Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "String"
}
],
"given_name": null,
"key": "Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e52fa3afbe531d9522fae1206f3ae9d248775742": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
}
],
"given_name": null,
"key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Any"
}
],
"given_name": null,
"key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "in_process",
"type_key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "multiprocess",
"type_key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7"
}
],
"given_name": null,
"key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.0bb49540f1708dcf5378009c9571eba999502e19": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "io_manager",
"type_key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7"
}
],
"given_name": null,
"key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.17b6a168d89648299f5fa63c548ecef2405875ca": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
}
],
"given_name": null,
"key": "Shape.17b6a168d89648299f5fa63c548ecef2405875ca",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4"
}
],
"given_name": null,
"key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.241ac489ffa5f718db6444bae7849fb86a62e441": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"INFO\\"",
"description": null,
"is_required": false,
"name": "log_level",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"dagster\\"",
"description": null,
"is_required": false,
"name": "name",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.3baab16166bacfaf4705811e64d356112fd733cb": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
}
],
"given_name": null,
"key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "result",
"type_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742"
}
],
"given_name": null,
"key": "Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Explicit modules to preload in the forkserver.",
"is_required": false,
"name": "preload_modules",
"type_key": "Array.String"
}
],
"given_name": null,
"key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.49870c4c11bde22648d88253b62f2823fb7873be": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"in_process\\": {}}",
"description": null,
"is_required": false,
"name": "execution",
"type_key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "loggers",
"type_key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"io_manager\\": {}}",
"description": null,
"is_required": false,
"name": "resources",
"type_key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"nothing_one\\": {}, \\"nothing_two\\": {}, \\"take_nothings\\": {}}",
"description": null,
"is_required": false,
"name": "solids",
"type_key": "Shape.7666198738d531f40c136b24e46d12ee0ca3dc25"
}
],
"given_name": null,
"key": "Shape.49870c4c11bde22648d88253b62f2823fb7873be",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "path",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.69ff9be621991cc7961ea5e667d43edaac9d2339": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "outputs",
"type_key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
}
],
"given_name": null,
"key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.743e47901855cb245064dd633e217bfcb49a11a7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
}
],
"given_name": null,
"key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.7666198738d531f40c136b24e46d12ee0ca3dc25": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "nothing_one",
"type_key": "Shape.17b6a168d89648299f5fa63c548ecef2405875ca"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "nothing_two",
"type_key": "Shape.17b6a168d89648299f5fa63c548ecef2405875ca"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "take_nothings",
"type_key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339"
}
],
"given_name": null,
"key": "Shape.7666198738d531f40c136b24e46d12ee0ca3dc25",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "marker_to_close",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
}
],
"given_name": null,
"key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c"
}
],
"given_name": null,
"key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [],
"given_name": null,
"key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "0",
"description": null,
"is_required": false,
"name": "max_concurrent",
"type_key": "Int"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Select how subprocesses are created. Defaults to spawn.\\nWhen forkserver is selected, set_forkserver_preload will be called with either:\\n* the preload_modules list if provided by config\\n* the module containing the Job if it was loaded from a module\\n* dagster\\nhttps://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods",
"is_required": false,
"name": "start_method",
"type_key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47"
}
],
"given_name": null,
"key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "console",
"type_key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb"
}
],
"given_name": null,
"key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"String": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "String",
"key": "String",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.STRING"
},
"type_param_keys": null
}
}
},
"dagster_type_namespace_snapshot": {
"__class__": "DagsterTypeNamespaceSnapshot",
"all_dagster_type_snaps_by_key": {
"Any": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Any",
"is_builtin": true,
"key": "Any",
"kind": {
"__enum__": "DagsterTypeKind.ANY"
},
"loader_schema_key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Any",
"type_param_keys": []
},
"Bool": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Bool",
"is_builtin": true,
"key": "Bool",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Bool",
"type_param_keys": []
},
"Float": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Float",
"is_builtin": true,
"key": "Float",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Float",
"type_param_keys": []
},
"Int": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Int",
"is_builtin": true,
"key": "Int",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Int",
"type_param_keys": []
},
"Nothing": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Nothing",
"is_builtin": true,
"key": "Nothing",
"kind": {
"__enum__": "DagsterTypeKind.NOTHING"
},
"loader_schema_key": null,
"materializer_schema_key": null,
"name": "Nothing",
"type_param_keys": []
},
"String": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "String",
"is_builtin": true,
"key": "String",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "String",
"type_param_keys": []
}
}
},
"dep_structure_snapshot": {
"__class__": "DependencyStructureSnapshot",
"solid_invocation_snaps": [
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [],
"is_dynamic_mapped": false,
"solid_def_name": "return_nothing",
"solid_name": "nothing_one",
"tags": {}
},
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [],
"is_dynamic_mapped": false,
"solid_def_name": "return_nothing",
"solid_name": "nothing_two",
"tags": {}
},
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [
{
"__class__": "InputDependencySnap",
"input_name": "nothing",
"is_dynamic_collect": false,
"upstream_output_snaps": [
{
"__class__": "OutputHandleSnap",
"output_name": "result",
"solid_name": "nothing_one"
},
{
"__class__": "OutputHandleSnap",
"output_name": "result",
"solid_name": "nothing_two"
}
]
}
],
"is_dynamic_mapped": false,
"solid_def_name": "take_nothings",
"solid_name": "take_nothings",
"tags": {}
}
]
},
"description": null,
"graph_def_name": "fan_in_test",
"lineage_snapshot": null,
"mode_def_snaps": [
{
"__class__": "ModeDefSnap",
"description": null,
"logger_def_snaps": [
{
"__class__": "LoggerDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
},
"description": "The default colored console logger.",
"name": "console"
}
],
"name": "default",
"resource_def_snaps": [
{
"__class__": "ResourceDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"name": "io_manager"
}
],
"root_config_key": "Shape.49870c4c11bde22648d88253b62f2823fb7873be"
}
],
"name": "fan_in_test",
"solid_definitions_snapshot": {
"__class__": "SolidDefinitionsSnapshot",
"composite_solid_def_snaps": [],
"solid_def_snaps": [
{
"__class__": "SolidDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"input_def_snaps": [],
"name": "return_nothing",
"output_def_snaps": [
{
"__class__": "OutputDefSnap",
"dagster_type_key": "Nothing",
"description": null,
"is_dynamic": false,
"is_required": true,
"name": "result"
}
],
"required_resource_keys": [],
"tags": {}
},
{
"__class__": "SolidDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"input_def_snaps": [
{
"__class__": "InputDefSnap",
"dagster_type_key": "Nothing",
"description": null,
"name": "nothing"
}
],
"name": "take_nothings",
"output_def_snaps": [
{
"__class__": "OutputDefSnap",
"dagster_type_key": "Any",
"description": null,
"is_dynamic": false,
"is_required": true,
"name": "result"
}
],
"required_resource_keys": [],
"tags": {}
}
]
},
"tags": {}
}'''
snapshots['test_basic_fan_in 2'] = '8dae931737fe874d74aeec2d7fd12011c68860a2'
snapshots['test_deserialize_solid_def_snaps_multi_type_config 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "bar",
"type_key": "Selector.c12ab659793f168246640a294e913ac9d90a242a"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "foo",
"type_key": "Array.Float"
}
],
"given_name": null,
"key": "Permissive.bda2965be6725b48329d76783336ed442951fd54",
"kind": {
"__enum__": "ConfigTypeKind.PERMISSIVE_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
}'''
snapshots['test_empty_pipeline_snap_props 1'] = '''{
"__class__": "PipelineSnapshot",
"config_schema_snapshot": {
"__class__": "ConfigSchemaSnapshot",
"all_config_snaps_by_key": {
"Any": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": "Any",
"key": "Any",
"kind": {
"__enum__": "ConfigTypeKind.ANY"
},
"scalar_kind": null,
"type_param_keys": null
},
"Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
]
},
"Array.String": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.String",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.String",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"String"
]
},
"Bool": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Bool",
"key": "Bool",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.BOOL"
},
"type_param_keys": null
},
"Float": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Float",
"key": "Float",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.FLOAT"
},
"type_param_keys": null
},
"Int": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Int",
"key": "Int",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.INT"
},
"type_param_keys": null
},
"ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Bool",
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59"
]
},
"ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Float",
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3"
]
},
"ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Int",
"Selector.a9799b971d12ace70a2d8803c883c863417d0725"
]
},
"ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"String",
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269"
]
},
"Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "forkserver",
"type_key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "spawn",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "disabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "enabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Int"
}
],
"given_name": null,
"key": "Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Bool"
}
],
"given_name": null,
"key": "Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Float"
}
],
"given_name": null,
"key": "Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "String"
}
],
"given_name": null,
"key": "Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e52fa3afbe531d9522fae1206f3ae9d248775742": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
}
],
"given_name": null,
"key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Any"
}
],
"given_name": null,
"key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "in_process",
"type_key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "multiprocess",
"type_key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7"
}
],
"given_name": null,
"key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.0bb49540f1708dcf5378009c9571eba999502e19": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "io_manager",
"type_key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7"
}
],
"given_name": null,
"key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4"
}
],
"given_name": null,
"key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.241ac489ffa5f718db6444bae7849fb86a62e441": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"INFO\\"",
"description": null,
"is_required": false,
"name": "log_level",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"dagster\\"",
"description": null,
"is_required": false,
"name": "name",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.32aa7ec6e7407e8a502d0a6094909a9365103a8e": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"in_process\\": {}}",
"description": null,
"is_required": false,
"name": "execution",
"type_key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "loggers",
"type_key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"io_manager\\": {}}",
"description": null,
"is_required": false,
"name": "resources",
"type_key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"noop_solid\\": {}}",
"description": null,
"is_required": false,
"name": "solids",
"type_key": "Shape.ba913521099bed4314e25592059869c8f3a3c96e"
}
],
"given_name": null,
"key": "Shape.32aa7ec6e7407e8a502d0a6094909a9365103a8e",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.3baab16166bacfaf4705811e64d356112fd733cb": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
}
],
"given_name": null,
"key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "result",
"type_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742"
}
],
"given_name": null,
"key": "Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Explicit modules to preload in the forkserver.",
"is_required": false,
"name": "preload_modules",
"type_key": "Array.String"
}
],
"given_name": null,
"key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "path",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.69ff9be621991cc7961ea5e667d43edaac9d2339": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "outputs",
"type_key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
}
],
"given_name": null,
"key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.743e47901855cb245064dd633e217bfcb49a11a7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
}
],
"given_name": null,
"key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "marker_to_close",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
}
],
"given_name": null,
"key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ba913521099bed4314e25592059869c8f3a3c96e": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "noop_solid",
"type_key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339"
}
],
"given_name": null,
"key": "Shape.ba913521099bed4314e25592059869c8f3a3c96e",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c"
}
],
"given_name": null,
"key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [],
"given_name": null,
"key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "0",
"description": null,
"is_required": false,
"name": "max_concurrent",
"type_key": "Int"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Select how subprocesses are created. Defaults to spawn.\\nWhen forkserver is selected, set_forkserver_preload will be called with either:\\n* the preload_modules list if provided by config\\n* the module containing the Job if it was loaded from a module\\n* dagster\\nhttps://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods",
"is_required": false,
"name": "start_method",
"type_key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47"
}
],
"given_name": null,
"key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "console",
"type_key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb"
}
],
"given_name": null,
"key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"String": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "String",
"key": "String",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.STRING"
},
"type_param_keys": null
}
}
},
"dagster_type_namespace_snapshot": {
"__class__": "DagsterTypeNamespaceSnapshot",
"all_dagster_type_snaps_by_key": {
"Any": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Any",
"is_builtin": true,
"key": "Any",
"kind": {
"__enum__": "DagsterTypeKind.ANY"
},
"loader_schema_key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Any",
"type_param_keys": []
},
"Bool": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Bool",
"is_builtin": true,
"key": "Bool",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Bool",
"type_param_keys": []
},
"Float": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Float",
"is_builtin": true,
"key": "Float",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Float",
"type_param_keys": []
},
"Int": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Int",
"is_builtin": true,
"key": "Int",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Int",
"type_param_keys": []
},
"Nothing": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Nothing",
"is_builtin": true,
"key": "Nothing",
"kind": {
"__enum__": "DagsterTypeKind.NOTHING"
},
"loader_schema_key": null,
"materializer_schema_key": null,
"name": "Nothing",
"type_param_keys": []
},
"String": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "String",
"is_builtin": true,
"key": "String",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "String",
"type_param_keys": []
}
}
},
"dep_structure_snapshot": {
"__class__": "DependencyStructureSnapshot",
"solid_invocation_snaps": [
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [],
"is_dynamic_mapped": false,
"solid_def_name": "noop_solid",
"solid_name": "noop_solid",
"tags": {}
}
]
},
"description": null,
"graph_def_name": "noop_pipeline",
"lineage_snapshot": null,
"mode_def_snaps": [
{
"__class__": "ModeDefSnap",
"description": null,
"logger_def_snaps": [
{
"__class__": "LoggerDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
},
"description": "The default colored console logger.",
"name": "console"
}
],
"name": "default",
"resource_def_snaps": [
{
"__class__": "ResourceDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"name": "io_manager"
}
],
"root_config_key": "Shape.32aa7ec6e7407e8a502d0a6094909a9365103a8e"
}
],
"name": "noop_pipeline",
"solid_definitions_snapshot": {
"__class__": "SolidDefinitionsSnapshot",
"composite_solid_def_snaps": [],
"solid_def_snaps": [
{
"__class__": "SolidDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"input_def_snaps": [],
"name": "noop_solid",
"output_def_snaps": [
{
"__class__": "OutputDefSnap",
"dagster_type_key": "Any",
"description": null,
"is_dynamic": false,
"is_required": true,
"name": "result"
}
],
"required_resource_keys": [],
"tags": {}
}
]
},
"tags": {}
}'''
snapshots['test_empty_pipeline_snap_props 2'] = '7ffd65ba8633d4c172a7b15dfee5927bed301724'
snapshots['test_empty_pipeline_snap_snapshot 1'] = '''{
"__class__": "PipelineSnapshot",
"config_schema_snapshot": {
"__class__": "ConfigSchemaSnapshot",
"all_config_snaps_by_key": {
"Any": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": "Any",
"key": "Any",
"kind": {
"__enum__": "ConfigTypeKind.ANY"
},
"scalar_kind": null,
"type_param_keys": null
},
"Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
]
},
"Array.String": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.String",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.String",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"String"
]
},
"Bool": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Bool",
"key": "Bool",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.BOOL"
},
"type_param_keys": null
},
"Float": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Float",
"key": "Float",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.FLOAT"
},
"type_param_keys": null
},
"Int": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Int",
"key": "Int",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.INT"
},
"type_param_keys": null
},
"ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Bool",
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59"
]
},
"ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Float",
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3"
]
},
"ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Int",
"Selector.a9799b971d12ace70a2d8803c883c863417d0725"
]
},
"ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"String",
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269"
]
},
"Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "forkserver",
"type_key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "spawn",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "disabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "enabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Int"
}
],
"given_name": null,
"key": "Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Bool"
}
],
"given_name": null,
"key": "Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Float"
}
],
"given_name": null,
"key": "Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "String"
}
],
"given_name": null,
"key": "Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e52fa3afbe531d9522fae1206f3ae9d248775742": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
}
],
"given_name": null,
"key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Any"
}
],
"given_name": null,
"key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "in_process",
"type_key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "multiprocess",
"type_key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7"
}
],
"given_name": null,
"key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.0bb49540f1708dcf5378009c9571eba999502e19": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "io_manager",
"type_key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7"
}
],
"given_name": null,
"key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4"
}
],
"given_name": null,
"key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.241ac489ffa5f718db6444bae7849fb86a62e441": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"INFO\\"",
"description": null,
"is_required": false,
"name": "log_level",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"dagster\\"",
"description": null,
"is_required": false,
"name": "name",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.32aa7ec6e7407e8a502d0a6094909a9365103a8e": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"in_process\\": {}}",
"description": null,
"is_required": false,
"name": "execution",
"type_key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "loggers",
"type_key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"io_manager\\": {}}",
"description": null,
"is_required": false,
"name": "resources",
"type_key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"noop_solid\\": {}}",
"description": null,
"is_required": false,
"name": "solids",
"type_key": "Shape.ba913521099bed4314e25592059869c8f3a3c96e"
}
],
"given_name": null,
"key": "Shape.32aa7ec6e7407e8a502d0a6094909a9365103a8e",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.3baab16166bacfaf4705811e64d356112fd733cb": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
}
],
"given_name": null,
"key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "result",
"type_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742"
}
],
"given_name": null,
"key": "Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Explicit modules to preload in the forkserver.",
"is_required": false,
"name": "preload_modules",
"type_key": "Array.String"
}
],
"given_name": null,
"key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "path",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.69ff9be621991cc7961ea5e667d43edaac9d2339": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "outputs",
"type_key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
}
],
"given_name": null,
"key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.743e47901855cb245064dd633e217bfcb49a11a7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
}
],
"given_name": null,
"key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "marker_to_close",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
}
],
"given_name": null,
"key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ba913521099bed4314e25592059869c8f3a3c96e": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "noop_solid",
"type_key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339"
}
],
"given_name": null,
"key": "Shape.ba913521099bed4314e25592059869c8f3a3c96e",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c"
}
],
"given_name": null,
"key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [],
"given_name": null,
"key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "0",
"description": null,
"is_required": false,
"name": "max_concurrent",
"type_key": "Int"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Select how subprocesses are created. Defaults to spawn.\\nWhen forkserver is selected, set_forkserver_preload will be called with either:\\n* the preload_modules list if provided by config\\n* the module containing the Job if it was loaded from a module\\n* dagster\\nhttps://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods",
"is_required": false,
"name": "start_method",
"type_key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47"
}
],
"given_name": null,
"key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "console",
"type_key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb"
}
],
"given_name": null,
"key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"String": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "String",
"key": "String",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.STRING"
},
"type_param_keys": null
}
}
},
"dagster_type_namespace_snapshot": {
"__class__": "DagsterTypeNamespaceSnapshot",
"all_dagster_type_snaps_by_key": {
"Any": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Any",
"is_builtin": true,
"key": "Any",
"kind": {
"__enum__": "DagsterTypeKind.ANY"
},
"loader_schema_key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Any",
"type_param_keys": []
},
"Bool": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Bool",
"is_builtin": true,
"key": "Bool",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Bool",
"type_param_keys": []
},
"Float": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Float",
"is_builtin": true,
"key": "Float",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Float",
"type_param_keys": []
},
"Int": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Int",
"is_builtin": true,
"key": "Int",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Int",
"type_param_keys": []
},
"Nothing": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Nothing",
"is_builtin": true,
"key": "Nothing",
"kind": {
"__enum__": "DagsterTypeKind.NOTHING"
},
"loader_schema_key": null,
"materializer_schema_key": null,
"name": "Nothing",
"type_param_keys": []
},
"String": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "String",
"is_builtin": true,
"key": "String",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "String",
"type_param_keys": []
}
}
},
"dep_structure_snapshot": {
"__class__": "DependencyStructureSnapshot",
"solid_invocation_snaps": [
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [],
"is_dynamic_mapped": false,
"solid_def_name": "noop_solid",
"solid_name": "noop_solid",
"tags": {}
}
]
},
"description": null,
"graph_def_name": "noop_pipeline",
"lineage_snapshot": null,
"mode_def_snaps": [
{
"__class__": "ModeDefSnap",
"description": null,
"logger_def_snaps": [
{
"__class__": "LoggerDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
},
"description": "The default colored console logger.",
"name": "console"
}
],
"name": "default",
"resource_def_snaps": [
{
"__class__": "ResourceDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"name": "io_manager"
}
],
"root_config_key": "Shape.32aa7ec6e7407e8a502d0a6094909a9365103a8e"
}
],
"name": "noop_pipeline",
"solid_definitions_snapshot": {
"__class__": "SolidDefinitionsSnapshot",
"composite_solid_def_snaps": [],
"solid_def_snaps": [
{
"__class__": "SolidDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"input_def_snaps": [],
"name": "noop_solid",
"output_def_snaps": [
{
"__class__": "OutputDefSnap",
"dagster_type_key": "Any",
"description": null,
"is_dynamic": false,
"is_required": true,
"name": "result"
}
],
"required_resource_keys": [],
"tags": {}
}
]
},
"tags": {}
}'''
snapshots['test_multi_type_config_array_dict_fields[Permissive] 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": "List of Array.Permissive.1f37a068c7c51aba23e9c41475c78eebc4e58471",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.Permissive.1f37a068c7c51aba23e9c41475c78eebc4e58471",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"Permissive.1f37a068c7c51aba23e9c41475c78eebc4e58471"
]
}'''
snapshots['test_multi_type_config_array_dict_fields[Selector] 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": "List of Array.Selector.1f37a068c7c51aba23e9c41475c78eebc4e58471",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.Selector.1f37a068c7c51aba23e9c41475c78eebc4e58471",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"Selector.1f37a068c7c51aba23e9c41475c78eebc4e58471"
]
}'''
snapshots['test_multi_type_config_array_dict_fields[Shape] 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": "List of Array.Shape.1f37a068c7c51aba23e9c41475c78eebc4e58471",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.Shape.1f37a068c7c51aba23e9c41475c78eebc4e58471",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"Shape.1f37a068c7c51aba23e9c41475c78eebc4e58471"
]
}'''
snapshots['test_multi_type_config_array_map 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": "List of Array.Map.String.Int",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.Map.String.Int",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"Map.String.Int"
]
}'''
snapshots['test_multi_type_config_nested_dicts[nested_dict_types0] 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "foo",
"type_key": "Permissive.c1ae6abf6c3c9e951eeefe4fde820cafc053ee40"
}
],
"given_name": null,
"key": "Selector.cb18f2a8fc9fa17668d8f4fd6b44c86c30c56774",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
}'''
snapshots['test_multi_type_config_nested_dicts[nested_dict_types1] 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "foo",
"type_key": "Shape.9bbda63934c371bf9be9a1cbb6fff9f5ee0be828"
}
],
"given_name": null,
"key": "Selector.b188a7737a2fecf0fca8cf94d331be517176dddf",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
}'''
snapshots['test_multi_type_config_nested_dicts[nested_dict_types2] 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "foo",
"type_key": "Selector.c1ae6abf6c3c9e951eeefe4fde820cafc053ee40"
}
],
"given_name": null,
"key": "Permissive.84180c8bd71a154af9d2965c8955925c228dc2bf",
"kind": {
"__enum__": "ConfigTypeKind.PERMISSIVE_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
}'''
snapshots['test_multi_type_config_nested_dicts[nested_dict_types3] 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "foo",
"type_key": "Shape.3d03240a3cdb5557305a2118fb3a059896368dd1"
}
],
"given_name": null,
"key": "Permissive.31f842392439e3c949b44f9e0e36bd1ed050a6b5",
"kind": {
"__enum__": "ConfigTypeKind.PERMISSIVE_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
}'''
snapshots['test_multi_type_config_nested_dicts[nested_dict_types4] 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "foo",
"type_key": "Selector.9bbda63934c371bf9be9a1cbb6fff9f5ee0be828"
}
],
"given_name": null,
"key": "Shape.88efc4d6ed14b1d35062d1e50a0227f606049e87",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
}'''
snapshots['test_multi_type_config_nested_dicts[nested_dict_types5] 1'] = '''{
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "foo",
"type_key": "Permissive.3d03240a3cdb5557305a2118fb3a059896368dd1"
}
],
"given_name": null,
"key": "Shape.0117583609bbf6ddcd1b1c9586aca163c454ed9d",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
}'''
snapshots['test_pipeline_snap_all_props 1'] = '''{
"__class__": "PipelineSnapshot",
"config_schema_snapshot": {
"__class__": "ConfigSchemaSnapshot",
"all_config_snaps_by_key": {
"Any": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": "Any",
"key": "Any",
"kind": {
"__enum__": "ConfigTypeKind.ANY"
},
"scalar_kind": null,
"type_param_keys": null
},
"Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
]
},
"Array.String": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.String",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.String",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"String"
]
},
"Bool": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Bool",
"key": "Bool",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.BOOL"
},
"type_param_keys": null
},
"Float": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Float",
"key": "Float",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.FLOAT"
},
"type_param_keys": null
},
"Int": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Int",
"key": "Int",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.INT"
},
"type_param_keys": null
},
"ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Bool",
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59"
]
},
"ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Float",
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3"
]
},
"ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Int",
"Selector.a9799b971d12ace70a2d8803c883c863417d0725"
]
},
"ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"String",
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269"
]
},
"Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "forkserver",
"type_key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "spawn",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "disabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "enabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Int"
}
],
"given_name": null,
"key": "Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Bool"
}
],
"given_name": null,
"key": "Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Float"
}
],
"given_name": null,
"key": "Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "String"
}
],
"given_name": null,
"key": "Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e52fa3afbe531d9522fae1206f3ae9d248775742": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
}
],
"given_name": null,
"key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Any"
}
],
"given_name": null,
"key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "in_process",
"type_key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "multiprocess",
"type_key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7"
}
],
"given_name": null,
"key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.0bb49540f1708dcf5378009c9571eba999502e19": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "io_manager",
"type_key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7"
}
],
"given_name": null,
"key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4"
}
],
"given_name": null,
"key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.241ac489ffa5f718db6444bae7849fb86a62e441": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"INFO\\"",
"description": null,
"is_required": false,
"name": "log_level",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"dagster\\"",
"description": null,
"is_required": false,
"name": "name",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.32aa7ec6e7407e8a502d0a6094909a9365103a8e": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"in_process\\": {}}",
"description": null,
"is_required": false,
"name": "execution",
"type_key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "loggers",
"type_key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"io_manager\\": {}}",
"description": null,
"is_required": false,
"name": "resources",
"type_key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"noop_solid\\": {}}",
"description": null,
"is_required": false,
"name": "solids",
"type_key": "Shape.ba913521099bed4314e25592059869c8f3a3c96e"
}
],
"given_name": null,
"key": "Shape.32aa7ec6e7407e8a502d0a6094909a9365103a8e",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.3baab16166bacfaf4705811e64d356112fd733cb": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
}
],
"given_name": null,
"key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "result",
"type_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742"
}
],
"given_name": null,
"key": "Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Explicit modules to preload in the forkserver.",
"is_required": false,
"name": "preload_modules",
"type_key": "Array.String"
}
],
"given_name": null,
"key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "path",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.69ff9be621991cc7961ea5e667d43edaac9d2339": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "outputs",
"type_key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
}
],
"given_name": null,
"key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.743e47901855cb245064dd633e217bfcb49a11a7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
}
],
"given_name": null,
"key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "marker_to_close",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
}
],
"given_name": null,
"key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ba913521099bed4314e25592059869c8f3a3c96e": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "noop_solid",
"type_key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339"
}
],
"given_name": null,
"key": "Shape.ba913521099bed4314e25592059869c8f3a3c96e",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c"
}
],
"given_name": null,
"key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [],
"given_name": null,
"key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "0",
"description": null,
"is_required": false,
"name": "max_concurrent",
"type_key": "Int"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Select how subprocesses are created. Defaults to spawn.\\nWhen forkserver is selected, set_forkserver_preload will be called with either:\\n* the preload_modules list if provided by config\\n* the module containing the Job if it was loaded from a module\\n* dagster\\nhttps://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods",
"is_required": false,
"name": "start_method",
"type_key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47"
}
],
"given_name": null,
"key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "console",
"type_key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb"
}
],
"given_name": null,
"key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"String": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "String",
"key": "String",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.STRING"
},
"type_param_keys": null
}
}
},
"dagster_type_namespace_snapshot": {
"__class__": "DagsterTypeNamespaceSnapshot",
"all_dagster_type_snaps_by_key": {
"Any": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Any",
"is_builtin": true,
"key": "Any",
"kind": {
"__enum__": "DagsterTypeKind.ANY"
},
"loader_schema_key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Any",
"type_param_keys": []
},
"Bool": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Bool",
"is_builtin": true,
"key": "Bool",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Bool",
"type_param_keys": []
},
"Float": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Float",
"is_builtin": true,
"key": "Float",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Float",
"type_param_keys": []
},
"Int": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Int",
"is_builtin": true,
"key": "Int",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Int",
"type_param_keys": []
},
"Nothing": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Nothing",
"is_builtin": true,
"key": "Nothing",
"kind": {
"__enum__": "DagsterTypeKind.NOTHING"
},
"loader_schema_key": null,
"materializer_schema_key": null,
"name": "Nothing",
"type_param_keys": []
},
"String": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "String",
"is_builtin": true,
"key": "String",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "String",
"type_param_keys": []
}
}
},
"dep_structure_snapshot": {
"__class__": "DependencyStructureSnapshot",
"solid_invocation_snaps": [
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [],
"is_dynamic_mapped": false,
"solid_def_name": "noop_solid",
"solid_name": "noop_solid",
"tags": {}
}
]
},
"description": "desc",
"graph_def_name": "noop_pipeline",
"lineage_snapshot": null,
"mode_def_snaps": [
{
"__class__": "ModeDefSnap",
"description": null,
"logger_def_snaps": [
{
"__class__": "LoggerDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
},
"description": "The default colored console logger.",
"name": "console"
}
],
"name": "default",
"resource_def_snaps": [
{
"__class__": "ResourceDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"name": "io_manager"
}
],
"root_config_key": "Shape.32aa7ec6e7407e8a502d0a6094909a9365103a8e"
}
],
"name": "noop_pipeline",
"solid_definitions_snapshot": {
"__class__": "SolidDefinitionsSnapshot",
"composite_solid_def_snaps": [],
"solid_def_snaps": [
{
"__class__": "SolidDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"input_def_snaps": [],
"name": "noop_solid",
"output_def_snaps": [
{
"__class__": "OutputDefSnap",
"dagster_type_key": "Any",
"description": null,
"is_dynamic": false,
"is_required": true,
"name": "result"
}
],
"required_resource_keys": [],
"tags": {}
}
]
},
"tags": {
"key": "value"
}
}'''
snapshots['test_pipeline_snap_all_props 2'] = '694ecc99696f5f5578d02efbac52c36d91915ed9'
snapshots['test_two_invocations_deps_snap 1'] = '''{
"__class__": "PipelineSnapshot",
"config_schema_snapshot": {
"__class__": "ConfigSchemaSnapshot",
"all_config_snaps_by_key": {
"Any": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": "Any",
"key": "Any",
"kind": {
"__enum__": "ConfigTypeKind.ANY"
},
"scalar_kind": null,
"type_param_keys": null
},
"Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
]
},
"Array.String": {
"__class__": "ConfigTypeSnap",
"description": "List of Array.String",
"enum_values": null,
"fields": null,
"given_name": null,
"key": "Array.String",
"kind": {
"__enum__": "ConfigTypeKind.ARRAY"
},
"scalar_kind": null,
"type_param_keys": [
"String"
]
},
"Bool": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Bool",
"key": "Bool",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.BOOL"
},
"type_param_keys": null
},
"Float": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Float",
"key": "Float",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.FLOAT"
},
"type_param_keys": null
},
"Int": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "Int",
"key": "Int",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.INT"
},
"type_param_keys": null
},
"ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Bool",
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59"
]
},
"ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Float",
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3"
]
},
"ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"Int",
"Selector.a9799b971d12ace70a2d8803c883c863417d0725"
]
},
"ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": null,
"given_name": null,
"key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR_UNION"
},
"scalar_kind": null,
"type_param_keys": [
"String",
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269"
]
},
"Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "forkserver",
"type_key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "spawn",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "disabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "enabled",
"type_key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709"
}
],
"given_name": null,
"key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.a9799b971d12ace70a2d8803c883c863417d0725": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Int"
}
],
"given_name": null,
"key": "Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Bool"
}
],
"given_name": null,
"key": "Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Float"
}
],
"given_name": null,
"key": "Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e04723c9d9937e3ab21206435b22247cfbe58269": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "String"
}
],
"given_name": null,
"key": "Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.e52fa3afbe531d9522fae1206f3ae9d248775742": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
}
],
"given_name": null,
"key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "json",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "pickle",
"type_key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "value",
"type_key": "Any"
}
],
"given_name": null,
"key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "in_process",
"type_key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"config\\": {\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}}",
"description": null,
"is_required": false,
"name": "multiprocess",
"type_key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7"
}
],
"given_name": null,
"key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f",
"kind": {
"__enum__": "ConfigTypeKind.SELECTOR"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.0bb49540f1708dcf5378009c9571eba999502e19": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "io_manager",
"type_key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7"
}
],
"given_name": null,
"key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"max_concurrent\\": 0, \\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4"
}
],
"given_name": null,
"key": "Shape.21277960d85eafb5579d7a10d7a715e444c5a1f7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.241ac489ffa5f718db6444bae7849fb86a62e441": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"INFO\\"",
"description": null,
"is_required": false,
"name": "log_level",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "\\"dagster\\"",
"description": null,
"is_required": false,
"name": "name",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.3baab16166bacfaf4705811e64d356112fd733cb": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
}
],
"given_name": null,
"key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.41de0e2d7b75524510155d0bdab8723c6feced3b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "result",
"type_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742"
}
],
"given_name": null,
"key": "Shape.41de0e2d7b75524510155d0bdab8723c6feced3b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Explicit modules to preload in the forkserver.",
"is_required": false,
"name": "preload_modules",
"type_key": "Array.String"
}
],
"given_name": null,
"key": "Shape.45a8f1f21db73ecbfa5b4e07b9aedc1835cef1ef",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": true,
"name": "path",
"type_key": "String"
}
],
"given_name": null,
"key": "Shape.4b53b73df342381d0d05c5f36183dc99cb9676e2",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.57b350a3b96de4345480fa93fb0b7f37bd600d4f": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"in_process\\": {}}",
"description": null,
"is_required": false,
"name": "execution",
"type_key": "Selector.fd22b7b986baf6998a8c16e63e78f44dd5e3f78f"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "loggers",
"type_key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"io_manager\\": {}}",
"description": null,
"is_required": false,
"name": "resources",
"type_key": "Shape.0bb49540f1708dcf5378009c9571eba999502e19"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"one\\": {}, \\"two\\": {}}",
"description": null,
"is_required": false,
"name": "solids",
"type_key": "Shape.ba7fa03e7f2b7ee324ff5f3ed290c26cb2585795"
}
],
"given_name": null,
"key": "Shape.57b350a3b96de4345480fa93fb0b7f37bd600d4f",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.69ff9be621991cc7961ea5e667d43edaac9d2339": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "outputs",
"type_key": "Array.Shape.41de0e2d7b75524510155d0bdab8723c6feced3b"
}
],
"given_name": null,
"key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.743e47901855cb245064dd633e217bfcb49a11a7": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
}
],
"given_name": null,
"key": "Shape.743e47901855cb245064dd633e217bfcb49a11a7",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "marker_to_close",
"type_key": "String"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
}
],
"given_name": null,
"key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ba7fa03e7f2b7ee324ff5f3ed290c26cb2585795": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"field_aliases": {
"solids": "ops"
},
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "one",
"type_key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{}",
"description": null,
"is_required": false,
"name": "two",
"type_key": "Shape.69ff9be621991cc7961ea5e667d43edaac9d2339"
}
],
"given_name": null,
"key": "Shape.ba7fa03e7f2b7ee324ff5f3ed290c26cb2585795",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"retries\\": {\\"enabled\\": {}}}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.979b3d2fece4f3eb92e90f2ec9fb4c85efe9ea5c"
}
],
"given_name": null,
"key": "Shape.ca5906d9a0377218b4ee7d940ad55957afa73d1b",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [],
"given_name": null,
"key": "Shape.da39a3ee5e6b4b0d3255bfef95601890afd80709",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "0",
"description": null,
"is_required": false,
"name": "max_concurrent",
"type_key": "Int"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"enabled\\": {}}",
"description": null,
"is_required": false,
"name": "retries",
"type_key": "Selector.1bfb167aea90780aa679597800c71bd8c65ed0b2"
},
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": "Select how subprocesses are created. Defaults to spawn.\\nWhen forkserver is selected, set_forkserver_preload will be called with either:\\n* the preload_modules list if provided by config\\n* the module containing the Job if it was loaded from a module\\n* dagster\\nhttps://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods",
"is_required": false,
"name": "start_method",
"type_key": "Selector.0f5471adc2ad814d1c9fd94e2fa73c07217dea47"
}
],
"given_name": null,
"key": "Shape.e248cccc2d2206bf427e9bc9c2d22833f2aeb6d4",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c": {
"__class__": "ConfigTypeSnap",
"description": null,
"enum_values": null,
"fields": [
{
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "console",
"type_key": "Shape.3baab16166bacfaf4705811e64d356112fd733cb"
}
],
"given_name": null,
"key": "Shape.ebeaf4550c200fb540f2e1f3f2110debd8c4157c",
"kind": {
"__enum__": "ConfigTypeKind.STRICT_SHAPE"
},
"scalar_kind": null,
"type_param_keys": null
},
"String": {
"__class__": "ConfigTypeSnap",
"description": "",
"enum_values": null,
"fields": null,
"given_name": "String",
"key": "String",
"kind": {
"__enum__": "ConfigTypeKind.SCALAR"
},
"scalar_kind": {
"__enum__": "ConfigScalarKind.STRING"
},
"type_param_keys": null
}
}
},
"dagster_type_namespace_snapshot": {
"__class__": "DagsterTypeNamespaceSnapshot",
"all_dagster_type_snaps_by_key": {
"Any": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Any",
"is_builtin": true,
"key": "Any",
"kind": {
"__enum__": "DagsterTypeKind.ANY"
},
"loader_schema_key": "Selector.f2fe6dfdc60a1947a8f8e7cd377a012b47065bc4",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Any",
"type_param_keys": []
},
"Bool": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Bool",
"is_builtin": true,
"key": "Bool",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Bool-Selector.be5d518b39e86a43c5f2eecaf538c1f6c7711b59",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Bool",
"type_param_keys": []
},
"Float": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Float",
"is_builtin": true,
"key": "Float",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Float-Selector.d00a37e3807d37c9f69cc62997c4a5f4a176e5c3",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Float",
"type_param_keys": []
},
"Int": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Int",
"is_builtin": true,
"key": "Int",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.Int-Selector.a9799b971d12ace70a2d8803c883c863417d0725",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "Int",
"type_param_keys": []
},
"Nothing": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "Nothing",
"is_builtin": true,
"key": "Nothing",
"kind": {
"__enum__": "DagsterTypeKind.NOTHING"
},
"loader_schema_key": null,
"materializer_schema_key": null,
"name": "Nothing",
"type_param_keys": []
},
"String": {
"__class__": "DagsterTypeSnap",
"description": null,
"display_name": "String",
"is_builtin": true,
"key": "String",
"kind": {
"__enum__": "DagsterTypeKind.SCALAR"
},
"loader_schema_key": "ScalarUnion.String-Selector.e04723c9d9937e3ab21206435b22247cfbe58269",
"materializer_schema_key": "Selector.e52fa3afbe531d9522fae1206f3ae9d248775742",
"name": "String",
"type_param_keys": []
}
}
},
"dep_structure_snapshot": {
"__class__": "DependencyStructureSnapshot",
"solid_invocation_snaps": [
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [],
"is_dynamic_mapped": false,
"solid_def_name": "noop_solid",
"solid_name": "one",
"tags": {}
},
{
"__class__": "SolidInvocationSnap",
"input_dep_snaps": [],
"is_dynamic_mapped": false,
"solid_def_name": "noop_solid",
"solid_name": "two",
"tags": {}
}
]
},
"description": null,
"graph_def_name": "two_solid_pipeline",
"lineage_snapshot": null,
"mode_def_snaps": [
{
"__class__": "ModeDefSnap",
"description": null,
"logger_def_snaps": [
{
"__class__": "LoggerDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": true,
"default_value_as_json_str": "{\\"log_level\\": \\"INFO\\", \\"name\\": \\"dagster\\"}",
"description": null,
"is_required": false,
"name": "config",
"type_key": "Shape.241ac489ffa5f718db6444bae7849fb86a62e441"
},
"description": "The default colored console logger.",
"name": "console"
}
],
"name": "default",
"resource_def_snaps": [
{
"__class__": "ResourceDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"name": "io_manager"
}
],
"root_config_key": "Shape.57b350a3b96de4345480fa93fb0b7f37bd600d4f"
}
],
"name": "two_solid_pipeline",
"solid_definitions_snapshot": {
"__class__": "SolidDefinitionsSnapshot",
"composite_solid_def_snaps": [],
"solid_def_snaps": [
{
"__class__": "SolidDefSnap",
"config_field_snap": {
"__class__": "ConfigFieldSnap",
"default_provided": false,
"default_value_as_json_str": null,
"description": null,
"is_required": false,
"name": "config",
"type_key": "Any"
},
"description": null,
"input_def_snaps": [],
"name": "noop_solid",
"output_def_snaps": [
{
"__class__": "OutputDefSnap",
"dagster_type_key": "Any",
"description": null,
"is_dynamic": false,
"is_required": true,
"name": "result"
}
],
"required_resource_keys": [],
"tags": {}
}
]
},
"tags": {}
}'''
snapshots['test_two_invocations_deps_snap 2'] = 'b444acd938a29a58d9eae6667b067a6106a09185'
# tests/test_task_definitions.py (mikedingjan/ecs-deplojo, MIT)

import os
from ecs_deplojo import task_definitions
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
def test_generate_task_definition(tmpdir):
task_data = """
{
"family": "default",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "${image}",
"essential": true,
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [
{
"containerPort": 8080,
"hostPort": 0
}
]
}
]
}
""".strip()
filename = tmpdir.join("task_definition.json")
filename.write(task_data)
task_definition = task_definitions.generate_task_definition(
filename.strpath,
environment={},
template_vars={"image": "my-docker-image:1.0"},
overrides={},
name="my-task-def",
)
expected = task_definitions.TaskDefinition(
{
"family": "my-task-def",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "my-docker-image:1.0",
"essential": True,
"command": ["hello", "world"],
"hostname": "my-task-def",
"memory": 256,
"cpu": 0,
"portMappings": [{"containerPort": 8080, "hostPort": 0}],
"environment": {},
}
],
"tags": [{"key": "createdBy", "value": "ecs-deplojo"}],
}
)
assert task_definition == expected
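# The test above relies on the "${image}" placeholder in the JSON template
# being filled in from template_vars. The sketch below illustrates the assumed
# substitution semantics using the stdlib string.Template; the actual
# ecs-deplojo renderer may differ, and this test name is hypothetical, not
# part of the library.
def test_template_substitution_sketch():
    import string

    # "${image}" is replaced; the surrounding JSON braces are left untouched
    # because string.Template only interprets "$"-prefixed placeholders.
    rendered = string.Template('{"image": "${image}"}').substitute(
        image="my-docker-image:1.0"
    )
    assert rendered == '{"image": "my-docker-image:1.0"}'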
def test_generate_task_definition_overrides(tmpdir):
task_data = """
{
"family": "default",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "${image}",
"essential": true,
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [
{
"containerPort": 8080,
"hostPort": 0
}
]
}
]
}
""".strip()
filename = tmpdir.join("task_definition.json")
filename.write(task_data)
task_definition = task_definitions.generate_task_definition(
filename.strpath,
environment={},
template_vars={"image": "my-docker-image:1.0"},
overrides={
"default": {
"memory": 512,
"memoryReservation": 128,
"portMappings": [{"hostPort": 80, "containerPort": 9000}],
}
},
name="my-task-def",
)
expected = task_definitions.TaskDefinition(
{
"family": "my-task-def",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "my-docker-image:1.0",
"essential": True,
"command": ["hello", "world"],
"hostname": "my-task-def",
"memory": 512,
"memoryReservation": 128,
"cpu": 0,
"portMappings": [
{"containerPort": 8080, "hostPort": 0},
{"containerPort": 9000, "hostPort": 80},
],
"environment": {},
}
],
"tags": [{"key": "createdBy", "value": "ecs-deplojo"}],
}
)
assert task_definition == expected
def test_generate_multiple_task_definitions(tmpdir):
task_data = """
{
"family": "default",
"volumes": [],
"containerDefinitions": [
{
"name": "web-1",
"image": "${image}",
"essential": true,
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [
{
"containerPort": 8080,
"hostPort": 0
}
]
},
{
"name": "web-2",
"image": "${image}",
"essential": true,
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [
{
"containerPort": 8080,
"hostPort": 0
}
]
}
]
}
""".strip()
filename = tmpdir.join("task_definition.json")
filename.write(task_data)
config = {
"environment": {"DATABASE_URL": "postgresql://"},
"environment_groups": {
"group-1": {"ENV_CODE": "group-1"},
"group-2": {"ENV_CODE": "group-2"},
},
"task_definitions": {
"task-def-1": {
"template": filename.strpath,
"environment_group": "group-1",
"overrides": {"web-1": {"memory": 512}},
},
"task-def-2": {
"template": filename.strpath,
"environment_group": "group-2",
"overrides": {"web-1": {"memory": 512}},
},
},
}
result = task_definitions.generate_task_definitions(
config, template_vars={"image": "my-docker-image:1.0"}, base_path=None
)
expected = {
"task-def-1": task_definitions.TaskDefinition(
{
"family": "task-def-1",
"volumes": [],
"containerDefinitions": [
{
"name": "web-1",
"image": "my-docker-image:1.0",
"essential": True,
"command": ["hello", "world"],
"hostname": "task-def-1-web-1",
"memory": 512,
"cpu": 0,
"portMappings": [{"containerPort": 8080, "hostPort": 0}],
"environment": {
"DATABASE_URL": "postgresql://",
"ENV_CODE": "group-1",
},
},
{
"name": "web-2",
"image": "my-docker-image:1.0",
"essential": True,
"command": ["hello", "world"],
"hostname": "task-def-1-web-2",
"memory": 256,
"cpu": 0,
"portMappings": [{"containerPort": 8080, "hostPort": 0}],
"environment": {
"DATABASE_URL": "postgresql://",
"ENV_CODE": "group-1",
},
},
],
"tags": [{"key": "createdBy", "value": "ecs-deplojo"}],
}
),
"task-def-2": task_definitions.TaskDefinition(
{
"family": "task-def-2",
"volumes": [],
"containerDefinitions": [
{
"name": "web-1",
"image": "my-docker-image:1.0",
"essential": True,
"hostname": "task-def-2-web-1",
"command": ["hello", "world"],
"memory": 512,
"cpu": 0,
"portMappings": [{"containerPort": 8080, "hostPort": 0}],
"environment": {
"DATABASE_URL": "postgresql://",
"ENV_CODE": "group-2",
},
},
{
"name": "web-2",
"image": "my-docker-image:1.0",
"hostname": "task-def-2-web-2",
"essential": True,
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [{"containerPort": 8080, "hostPort": 0}],
"environment": {
"DATABASE_URL": "postgresql://",
"ENV_CODE": "group-2",
},
},
],
"tags": [{"key": "createdBy", "value": "ecs-deplojo"}],
}
),
}
assert result == expected
def test_generate_task_definitions_write_output(tmpdir):
task_data = """
{
"family": "default",
"volumes": [],
"containerDefinitions": [
{
"name": "web-1",
"image": "${image}",
"essential": true,
"command": ["hello", "world"],
"memory": 256,
"logConfiguration": {},
"cpu": 0,
"portMappings": [
{
"containerPort": 8080,
"hostPort": 0
}
]
}
]
}
""".strip()
base_path = tmpdir.join("input").mkdir()
filename = base_path.join("task_definition.json")
filename.write(task_data)
config = {
"environment": {"DATABASE_URL": "postgresql://"},
"task_definitions": {
"task-def-1": {
"template": "task_definition.json",
"overrides": {
"web-1": {
"memory": 512,
"logConfiguration": {
"logDriver": "awslogs",
"options": {
"awslogs-group": "default",
"awslogs-region": "eu-west-1",
},
},
}
},
}
},
}
task_definitions.generate_task_definitions(
config,
template_vars={"image": "my-docker-image:1.0"},
base_path=base_path.strpath,
output_path=tmpdir.join("output").mkdir().strpath,
)
assert tmpdir.join("output").join("task-def-1.json").exists()
def test_generate_task_definition_with_task_role_arn(tmpdir):
task_data = """
{
"family": "default",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "${image}",
"essential": true,
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [
{
"containerPort": 8080,
"hostPort": 0
}
]
}
]
}
""".strip()
filename = tmpdir.join("task_definition.json")
filename.write(task_data)
result = task_definitions.generate_task_definition(
filename.strpath,
environment={},
task_role_arn="arn:my-task-role",
template_vars={"image": "my-docker-image:1.0"},
overrides={},
name="my-task-def",
)
expected = task_definitions.TaskDefinition(
{
"family": "my-task-def",
"taskRoleArn": "arn:my-task-role",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "my-docker-image:1.0",
"essential": True,
"hostname": "my-task-def",
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [{"containerPort": 8080, "hostPort": 0}],
"environment": {},
}
],
"tags": [{"key": "createdBy", "value": "ecs-deplojo"}],
}
)
assert result == expected
def test_generate_task_definition_with_execution_role_arn(tmpdir):
task_data = """
{
"family": "default",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "${image}",
"essential": true,
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [
{
"containerPort": 8080,
"hostPort": 0
}
]
}
]
}
""".strip()
filename = tmpdir.join("task_definition.json")
filename.write(task_data)
result = task_definitions.generate_task_definition(
filename.strpath,
environment={},
execution_role_arn="arn:my-task-execution-role",
template_vars={"image": "my-docker-image:1.0"},
overrides={},
name="my-task-def",
)
expected = task_definitions.TaskDefinition(
{
"family": "my-task-def",
"executionRoleArn": "arn:my-task-execution-role",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "my-docker-image:1.0",
"essential": True,
"hostname": "my-task-def",
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [{"containerPort": 8080, "hostPort": 0}],
"environment": {},
}
],
"tags": [{"key": "createdBy", "value": "ecs-deplojo"}],
}
)
assert result == expected
def test_generate_task_definition_with_secrets(tmpdir):
task_data = """
{
"family": "default",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "${image}",
"essential": true,
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [
{
"containerPort": 8080,
"hostPort": 0
}
]
}
]
}
""".strip()
filename = tmpdir.join("task_definition.json")
filename.write(task_data)
result = task_definitions.generate_task_definition(
filename.strpath,
environment={},
task_role_arn="arn:my-task-role",
template_vars={"image": "my-docker-image:1.0"},
overrides={},
name="my-task-def",
secrets={
"SUPER_SECRET_ENV_VAR": "/path/in/param/store",
"SUPER_SECRET_ENV_VAR2": "/other/path/in/param/store",
},
)
expected = task_definitions.TaskDefinition(
{
"family": "my-task-def",
"taskRoleArn": "arn:my-task-role",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "my-docker-image:1.0",
"essential": True,
"hostname": "my-task-def",
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [{"containerPort": 8080, "hostPort": 0}],
"environment": {},
"secrets": {
"SUPER_SECRET_ENV_VAR": "/path/in/param/store",
"SUPER_SECRET_ENV_VAR2": "/other/path/in/param/store",
},
}
],
"tags": [{"key": "createdBy", "value": "ecs-deplojo"}],
}
)
assert result == expected
def test_generate_task_definition_awsvpc(tmpdir):
task_data = """
{
"family": "default",
"networkMode": "awsvpc",
"volumes": [],
"containerDefinitions": [
{
"name": "default",
"image": "${image}",
"essential": true,
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [
{
"containerPort": 8080,
"hostPort": 0
}
]
}
]
}
""".strip()
filename = tmpdir.join("task_definition.json")
filename.write(task_data)
task_definition = task_definitions.generate_task_definition(
filename.strpath,
environment={},
template_vars={"image": "my-docker-image:1.0"},
overrides={},
name="my-task-def",
)
expected = task_definitions.TaskDefinition(
{
"family": "my-task-def",
"volumes": [],
"networkMode": "awsvpc",
"containerDefinitions": [
{
"name": "default",
"image": "my-docker-image:1.0",
"essential": True,
"command": ["hello", "world"],
"memory": 256,
"cpu": 0,
"portMappings": [{"containerPort": 8080, "hostPort": 0}],
"environment": {},
}
],
"tags": [{"key": "createdBy", "value": "ecs-deplojo"}],
}
)
assert task_definition == expected
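The tests above all feed a template containing `${image}` through `generate_task_definition` with `template_vars={"image": "my-docker-image:1.0"}`. A minimal stand-alone sketch of that substitution step, assuming `string.Template`-style `${...}` interpolation (the helper name `render_template` is hypothetical, not ecs-deplojo's actual API):

```python
import json
from string import Template

def render_template(task_data: str, template_vars: dict) -> dict:
    # Hypothetical helper: expand ${...} placeholders in the raw JSON
    # text first, then parse the result into a task-definition dict.
    rendered = Template(task_data).substitute(template_vars)
    return json.loads(rendered)

task_data = '{"containerDefinitions": [{"name": "default", "image": "${image}"}]}'
result = render_template(task_data, {"image": "my-docker-image:1.0"})
print(result["containerDefinitions"][0]["image"])  # my-docker-image:1.0
```

Substituting before JSON parsing (rather than after) is what lets a placeholder appear anywhere in the document, as in the templates above.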
| 29.407917 | 81 | 0.408053 | 1,217 | 17,086 | 5.599014 | 0.086278 | 0.029792 | 0.047402 | 0.080863 | 0.894188 | 0.86469 | 0.820076 | 0.813913 | 0.807015 | 0.800558 | 0 | 0.028684 | 0.436849 | 17,086 | 580 | 82 | 29.458621 | 0.679485 | 0 | 0 | 0.641264 | 1 | 0 | 0.426899 | 0.019314 | 0 | 0 | 0 | 0 | 0.01487 | 1 | 0.01487 | false | 0 | 0.003717 | 0 | 0.018587 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
91386fdb938c92a3c9ff8fe321c07a8f6f13f931 | 89 | py | Python | python/testData/psi/PatternMatchingRecoveryIllegalExpressionInSequencePatternItem.py | 06needhamt/intellij-community | 63d7b8030e4fdefeb4760e511e289f7e6b3a5c5b | [
"Apache-2.0"
] | null | null | null | python/testData/psi/PatternMatchingRecoveryIllegalExpressionInSequencePatternItem.py | 06needhamt/intellij-community | 63d7b8030e4fdefeb4760e511e289f7e6b3a5c5b | [
"Apache-2.0"
] | null | null | null | python/testData/psi/PatternMatchingRecoveryIllegalExpressionInSequencePatternItem.py | 06needhamt/intellij-community | 63d7b8030e4fdefeb4760e511e289f7e6b3a5c5b | [
"Apache-2.0"
] | null | null | null | match x:
case (1, 2 + (3 / 4)):
pass
case (1 << (2 / 3), 4):
pass | 17.8 | 27 | 0.337079 | 14 | 89 | 2.142857 | 0.571429 | 0.333333 | 0.4 | 0.466667 | 0.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0.460674 | 89 | 5 | 28 | 17.8 | 0.458333 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.4 | 0 | null | null | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
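The file above is parser-recovery test data: the arithmetic inside the sequence patterns is deliberately illegal. CPython rejects the same forms, since a literal pattern only admits a real-plus-imaginary sum like `1 + 2j`, never general arithmetic such as `2 + (3 / 4)`. That can be checked directly with `compile()`:

```python
# Compiling the fixture's first case should fail: `(3 / 4)` is not a
# valid pattern, so `2 + (3 / 4)` cannot be a literal pattern either.
src = """
match x:
    case (1, 2 + (3 / 4)):
        pass
"""
try:
    compile(src, "<pattern-test>", "exec")
    rejected = False
except SyntaxError:
    rejected = True
print(rejected)  # True
```

On interpreters older than 3.10 the `match` statement itself fails to parse, so the compile still raises `SyntaxError` either way.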
91475483c47e0ba3bff21aa6c849eb7b2f1b4eaa | 1,443 | py | Python | myapp/tests/test_classic_xunit_style.py | thinkAmi-sandbox/Django_pytest_setup_teardown_fixture-sample | f47f9dcf35f7b401ed173ce9535308ea6b4f455e | [
"Unlicense"
] | 2 | 2019-02-19T09:38:01.000Z | 2019-09-29T11:07:41.000Z | myapp/tests/test_classic_xunit_style.py | thinkAmi-sandbox/Django_pytest_setup_teardown_fixture-sample | f47f9dcf35f7b401ed173ce9535308ea6b4f455e | [
"Unlicense"
] | null | null | null | myapp/tests/test_classic_xunit_style.py | thinkAmi-sandbox/Django_pytest_setup_teardown_fixture-sample | f47f9dcf35f7b401ed173ce9535308ea6b4f455e | [
"Unlicense"
] | null | null | null | from django.test import TestCase
import sys
def setup_module(module):
print('classic xunit style - setup > {}'.format(sys._getframe().f_code.co_name))
def teardown_module(module):
print('classic xunit style - teardown > {}'.format(sys._getframe().f_code.co_name))
def setup_function(function):
print('classic xunit style - setup > {}'.format(sys._getframe().f_code.co_name))
def teardown_function(function):
print('classic xunit style - teardown > {}'.format(sys._getframe().f_code.co_name))
class Test_ClassicXunitStyle(TestCase):
@classmethod
def setup_class(cls):
print('classic xunit style - setup > {}'.format(sys._getframe().f_code.co_name))
@classmethod
def teardown_class(cls):
print('classic xunit style - teardown > {}'.format(sys._getframe().f_code.co_name))
def setup_method(self, method):
print('classic xunit style - setup > {}'.format(sys._getframe().f_code.co_name))
def teardown_method(self, method):
print('classic xunit style - teardown > {}'.format(sys._getframe().f_code.co_name))
# test method
def testSpam(self):
print('classic xunit style > [{}]'.format(sys._getframe().f_code.co_name))
assert True
def testHam(self):
print('classic xunit style > [{}]'.format(sys._getframe().f_code.co_name))
assert True | 36.075 | 92 | 0.63964 | 177 | 1,443 | 4.99435 | 0.175141 | 0.135747 | 0.192308 | 0.248869 | 0.829186 | 0.829186 | 0.747738 | 0.711538 | 0.711538 | 0.711538 | 0 | 0 | 0.221067 | 1,443 | 40 | 93 | 36.075 | 0.786477 | 0.007623 | 0 | 0.518519 | 0 | 0 | 0.229885 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 1 | 0.37037 | false | 0 | 0.074074 | 0 | 0.481481 | 0.37037 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
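The `setup_*`/`teardown_*` hooks above wrap each test in a before/after pair; pytest's yield-fixture style expresses the same wrap-around with a single generator. A minimal stand-alone sketch of that ordering, driving the generator by hand so it runs without pytest:

```python
calls = []

def around_each():
    # Equivalent of setup_method / teardown_method as one yield fixture:
    # code before `yield` is the setup half, code after is the teardown half.
    calls.append("setup")
    yield
    calls.append("teardown")

gen = around_each()
next(gen)                  # run the setup half
calls.append("test body")  # the test itself would run here
try:
    next(gen)              # resume the generator to run the teardown half
except StopIteration:
    pass
print(calls)  # ['setup', 'test body', 'teardown']
```

Under pytest, decorating `around_each` with `@pytest.fixture(autouse=True)` makes the framework do this driving for every test in scope.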
e67c893dfa9befd932dd52321a7043c0065d5aa9 | 17,057 | py | Python | tests/chainer_tests/test_function_and_function_node.py | zaltoprofen/chainer | 3b03f9afc80fd67f65d5e0395ef199e9506b6ee1 | [
"MIT"
] | 3,705 | 2017-06-01T07:36:12.000Z | 2022-03-30T10:46:15.000Z | tests/chainer_tests/test_function_and_function_node.py | zaltoprofen/chainer | 3b03f9afc80fd67f65d5e0395ef199e9506b6ee1 | [
"MIT"
] | 5,998 | 2017-06-01T06:40:17.000Z | 2022-03-08T01:42:44.000Z | tests/chainer_tests/test_function_and_function_node.py | zaltoprofen/chainer | 3b03f9afc80fd67f65d5e0395ef199e9506b6ee1 | [
"MIT"
] | 1,150 | 2017-06-02T03:39:46.000Z | 2022-03-29T02:29:32.000Z | import unittest
import numpy
import chainer
from chainer import testing
import chainer.testing.backend
import chainerx
def _get_expected_xp(backend_config, is_function):
# Returns a pair of xp's expected in forward() and backward() respectively.
xp = backend_config.xp
if xp is chainerx:
forward_xp = backend_config.device.fallback_device.xp
else:
forward_xp = xp
if is_function:
# chainer.Function
backward_xp = forward_xp
else:
# chainer.FunctionNode
backward_xp = xp
return forward_xp, backward_xp
@testing.parameterize(*testing.product({
'function_node': [True, False],
}))
@testing.backend.inject_backend_tests(
None,
[
# CPU
{},
# CUDA
{'use_cuda': True, 'cuda_device': 0},
{'use_cuda': True, 'cuda_device': 1},
# ChainerX
{'use_chainerx': True, 'chainerx_device': 'native:0'},
{'use_chainerx': True, 'chainerx_device': 'cuda:0'},
{'use_chainerx': True, 'chainerx_device': 'cuda:1'},
])
class TestFunctionBackprop(unittest.TestCase):
def call_func_function(self, backend_config, x1, x2, x3):
forward_xp, backward_xp = _get_expected_xp(backend_config, True)
class Func(chainer.Function):
def __init__(self):
self.array_init = backend_config.device.send(
numpy.array([3], numpy.float32))
def forward(self, inputs):
# Inputs
assert isinstance(inputs, tuple)
# x1, x3: float32
# x2: int32
x1, x2, x3 = inputs
assert isinstance(x1, forward_xp.ndarray)
assert isinstance(x2, forward_xp.ndarray)
assert isinstance(x3, forward_xp.ndarray)
# attribute fallback
assert isinstance(self.array_init, forward_xp.ndarray)
self.array_forward = forward_xp.array([2], numpy.float32)
assert isinstance(self.array_forward, forward_xp.ndarray)
y1 = x2 - 1 # int32
y2 = x1 * x3 + x2.astype(x1.dtype)
y3 = x1 + x3
self.retain_inputs((0, 2))
self.retain_outputs((0, 1,))
return y1, y2, y3
def backward(self, inputs, grad_outputs):
# Retained inputs
assert isinstance(inputs, tuple)
x1, x2, x3 = inputs
assert isinstance(x1, backward_xp.ndarray)
assert x2 is None # not retained
assert isinstance(x3, backward_xp.ndarray)
# Output gradients
assert isinstance(grad_outputs, tuple)
gy1, gy2, gy3 = grad_outputs
assert gy1 is None # y1 is int32
# y3 is disconnected
# TODO(niboshi): Expression after "or" is workaround for
# chainerx. ChainerX backward should return None for
# disconnected output and this workaround should be removed.
assert (gy3 is None
or (float(gy3.max()) == 0
and float((-gy3).max()) == 0))
# Retained outputs
output_data = self.output_data
assert isinstance(output_data, tuple)
y1, y2, y3 = output_data
assert isinstance(y1, backward_xp.ndarray)
assert isinstance(y2, backward_xp.ndarray)
assert y3 is None
# attribute fallback
assert isinstance(self.array_init, backward_xp.ndarray)
assert isinstance(self.array_forward, backward_xp.ndarray)
self.array_backward = backward_xp.array([4], numpy.float32)
assert isinstance(self.array_backward, backward_xp.ndarray)
gx1 = x3 * gy2 # + gy3
gx2 = None
gx3 = x1 * gy2 # + gy3
return gx1, gx2, gx3
return Func()(x1, x2, x3)
def call_func_function_node(self, backend_config, x1, x2, x3):
forward_xp, backward_xp = _get_expected_xp(backend_config, False)
class Func(chainer.FunctionNode):
def __init__(self):
self.array_init = backend_config.device.send(
numpy.array([3], numpy.float32))
def forward(self, inputs):
# Inputs
# x1, x3: float32
# x2: int32
x1, x2, x3 = inputs
assert isinstance(x1, forward_xp.ndarray)
assert isinstance(x2, forward_xp.ndarray)
assert isinstance(x3, forward_xp.ndarray)
# attribute fallback
assert isinstance(self.array_init, forward_xp.ndarray)
self.array_forward = forward_xp.array([2], numpy.float32)
assert isinstance(self.array_forward, forward_xp.ndarray)
y1 = x2 - 1 # int32
y2 = x1 * x3 + x2.astype(x1.dtype)
y3 = x1 + x3
self.retain_inputs((0, 2))
self.retain_outputs((0, 1,))
return y1, y2, y3
def backward(self, input_indexes, grad_outputs):
# Input indexes
assert isinstance(input_indexes, tuple)
assert input_indexes == (0, 2)
# Retained inputs
retained_inputs = self.get_retained_inputs()
assert isinstance(retained_inputs, tuple)
x1, x3 = retained_inputs
assert isinstance(x1.array, backward_xp.ndarray)
assert isinstance(x3.array, backward_xp.ndarray)
# Output gradients
assert isinstance(grad_outputs, tuple)
gy1, gy2, gy3 = grad_outputs
assert gy1 is None # y1 is int32
assert isinstance(gy2.array, backward_xp.ndarray)
# y3 is disconnected
# TODO(niboshi): Expression after "or" is workaround for
# chainerx. ChainerX backward should return None for
# disconnected output and this workaround should be removed.
assert (gy3 is None
or (float(gy3.array.max()) == 0
and float((-gy3.array).max()) == 0))
# Retained outputs
retained_outputs = self.get_retained_outputs()
assert isinstance(retained_outputs, tuple)
y1, y2, = retained_outputs
assert isinstance(y1.array, backward_xp.ndarray)
assert isinstance(y2.array, backward_xp.ndarray)
# attribute fallback
assert isinstance(self.array_init, backward_xp.ndarray)
assert isinstance(self.array_forward, backward_xp.ndarray)
self.array_backward = backward_xp.array([4], numpy.float32)
assert isinstance(self.array_backward, backward_xp.ndarray)
gx1 = x3 * gy2 # + gy3
gx2 = None
gx3 = x1 * gy2 # + gy3
return gx1, gx2, gx3
return Func().apply((x1, x2, x3))
def call_func(self, backend_config, x1, x2, x3):
if self.function_node:
return self.call_func_function_node(backend_config, x1, x2, x3)
else:
return self.call_func_function(backend_config, x1, x2, x3)
def test_backprop(self, backend_config):
x1_arr = numpy.array([2, 3], numpy.float32)
x2_arr = numpy.array([3, 1], numpy.int32)
x3_arr = numpy.array([5, 2], numpy.float32)
gy2_arr = numpy.array([2, 4], numpy.float32)
x1_arr, x2_arr, x3_arr, gy2_arr = backend_config.get_array(
(x1_arr, x2_arr, x3_arr, gy2_arr))
x1 = chainer.Variable(x1_arr)
x2 = chainer.Variable(x2_arr, requires_grad=False)
x3 = chainer.Variable(x3_arr)
# Forward
y1, y2, y3 = self.call_func(backend_config, x1, x2, x3)
assert isinstance(y1.array, backend_config.xp.ndarray)
assert isinstance(y2.array, backend_config.xp.ndarray)
assert isinstance(y3.array, backend_config.xp.ndarray)
# Backward
y2.grad = gy2_arr
y2.backward()
assert isinstance(x1.grad, backend_config.xp.ndarray)
assert x2.grad is None
assert isinstance(x3.grad, backend_config.xp.ndarray)
@testing.parameterize(*testing.product({
'function_node': [True, False],
}))
@testing.backend.inject_backend_tests(
None,
[
# CPU
{},
# CUDA
{'use_cuda': True, 'cuda_device': 0},
{'use_cuda': True, 'cuda_device': 1},
# ChainerX
{'use_chainerx': True, 'chainerx_device': 'native:0'},
{'use_chainerx': True, 'chainerx_device': 'cuda:0'},
{'use_chainerx': True, 'chainerx_device': 'cuda:1'},
])
class TestFunctionInputNone(unittest.TestCase):
def call_func_function(self, backend_config, x2):
forward_xp, backward_xp = _get_expected_xp(backend_config, True)
class Func(chainer.Function):
def forward(self, inputs):
# Inputs
assert isinstance(inputs, tuple)
x1, x2, x3 = inputs
assert x1 is None
assert isinstance(x2, forward_xp.ndarray)
assert x3 is None
y1 = x2 * 3
self.retain_inputs((1, 2))
self.retain_outputs(())
return y1,
def backward(self, inputs, grad_outputs):
# Retained inputs
assert isinstance(inputs, tuple)
x1, x2, x3 = inputs
assert x1 is None
assert isinstance(x2, backward_xp.ndarray)
assert x3 is None
# Output gradients
assert isinstance(grad_outputs, tuple)
gy1, = grad_outputs
assert isinstance(gy1, backward_xp.ndarray)
# Retained outputs
output_data = self.output_data
assert isinstance(output_data, tuple)
y1, = output_data
assert y1 is None
gx2 = 3 * gy1
return None, gx2, None
return Func()(None, x2, None),
def call_func_function_node(self, backend_config, x2):
forward_xp, backward_xp = _get_expected_xp(backend_config, False)
class Func(chainer.FunctionNode):
def forward(self, inputs):
# Inputs
x1, x2, x3 = inputs
assert x1 is None
assert isinstance(x2, forward_xp.ndarray)
assert x3 is None
y1 = x2 * 3
self.retain_inputs((1, 2))
self.retain_outputs(())
return y1,
def backward(self, input_indexes, grad_outputs):
# Input indexes
assert isinstance(input_indexes, tuple)
assert input_indexes == (1,)
# Retained inputs
retained_inputs = self.get_retained_inputs()
assert isinstance(retained_inputs, tuple)
x2, x3 = retained_inputs
assert isinstance(x2.array, backward_xp.ndarray)
assert x3 is None
# Output grads
assert isinstance(grad_outputs, tuple)
gy1, = grad_outputs
assert isinstance(gy1.array, backward_xp.ndarray)
# Retained outputs
retained_outputs = self.get_retained_outputs()
assert retained_outputs == ()
gx2 = 3 * gy1
return None, gx2, None
return Func().apply((None, x2, None))
def call_func(self, backend_config, x1):
if self.function_node:
return self.call_func_function_node(backend_config, x1)
else:
return self.call_func_function(backend_config, x1)
def test_backprop(self, backend_config):
x2_arr = numpy.array([2, 3], numpy.float32)
gy1_arr = numpy.array([2, 4], numpy.float32)
x2_arr, gy1_arr = backend_config.get_array((x2_arr, gy1_arr))
x2 = chainer.Variable(x2_arr, requires_grad=True)
# Forward
y1, = self.call_func(backend_config, x2)
assert isinstance(y1.array, backend_config.xp.ndarray)
# Backward
y1.grad = gy1_arr
y1.backward()
assert isinstance(x2.grad, backend_config.xp.ndarray)
@testing.parameterize(*testing.product({
'function_node': [True, False],
}))
@testing.backend.inject_backend_tests(
None,
[
# CPU
{},
# CUDA
{'use_cuda': True, 'cuda_device': 0},
{'use_cuda': True, 'cuda_device': 1},
# ChainerX
{'use_chainerx': True, 'chainerx_device': 'native:0'},
{'use_chainerx': True, 'chainerx_device': 'cuda:0'},
{'use_chainerx': True, 'chainerx_device': 'cuda:1'},
])
class TestFunctionOutputNone(unittest.TestCase):
def call_func_function(self, backend_config, x1):
forward_xp, backward_xp = _get_expected_xp(backend_config, True)
class Func(chainer.Function):
def forward(self, inputs):
# Inputs
assert isinstance(inputs, tuple)
x1, = inputs
assert isinstance(x1, forward_xp.ndarray)
y2 = x1 * 3 + 2
self.retain_inputs(())
self.retain_outputs((1, 2,))
return None, y2, None
def backward(self, inputs, grad_outputs):
# Retained inputs
assert isinstance(inputs, tuple)
x1, = inputs
assert x1 is None
# Output gradients
assert isinstance(grad_outputs, tuple)
gy1, gy2, gy3 = grad_outputs
assert gy1 is None
assert isinstance(gy2, backward_xp.ndarray)
assert gy3 is None
# Retained outputs
output_data = self.output_data
assert isinstance(output_data, tuple)
assert len(output_data) == 3
y1, y2, y3 = output_data
assert y1 is None
assert isinstance(y2, backward_xp.ndarray)
assert y3 is None
gx1 = 3 * gy2
return gx1,
return Func()(x1)
def call_func_function_node(self, backend_config, x1):
forward_xp, backward_xp = _get_expected_xp(backend_config, False)
class Func(chainer.FunctionNode):
def forward(self, inputs):
# Inputs
x1, = inputs
assert isinstance(x1, forward_xp.ndarray)
y2 = x1 * 3 + 2
self.retain_outputs((1, 2))
return None, y2, None
def backward(self, input_indexes, grad_outputs):
# Input indexes
assert isinstance(input_indexes, tuple)
assert input_indexes == (0,)
# Retained inputs
retained_inputs = self.get_retained_inputs()
assert isinstance(retained_inputs, tuple)
assert retained_inputs == ()
# Output grads
assert isinstance(grad_outputs, tuple)
gy1, gy2, gy3 = grad_outputs
assert gy1 is None
assert isinstance(gy2.array, backward_xp.ndarray)
assert gy3 is None
# Retained outputs
retained_outputs = self.get_retained_outputs()
assert isinstance(retained_outputs, tuple)
y2, y3 = retained_outputs
assert y3 is None
assert isinstance(y2.array, backward_xp.ndarray)
gx1 = 3 * gy2
return gx1,
return Func().apply((x1,))
def call_func(self, backend_config, x1):
if self.function_node:
return self.call_func_function_node(backend_config, x1)
else:
return self.call_func_function(backend_config, x1)
def test_backprop(self, backend_config):
x1_arr = numpy.array([2, 3], numpy.float32)
gy2_arr = numpy.array([2, 4], numpy.float32)
x1_arr, gy2_arr = backend_config.get_array((x1_arr, gy2_arr))
x1 = chainer.Variable(x1_arr, requires_grad=True)
# Forward
y1, y2, y3 = self.call_func(backend_config, x1)
assert y1.array is None
assert isinstance(y2.array, backend_config.xp.ndarray)
assert y3.array is None
# Backward
y2.grad = gy2_arr
y2.backward()
assert isinstance(x1.grad, backend_config.xp.ndarray)
testing.run_module(__name__, __file__)
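The backward passes above return `gx1 = x3 * gy2` and `gx3 = x1 * gy2` for the forward output `y2 = x1 * x3 + x2`. With the arrays used in `test_backprop`, those chain-rule gradients can be checked with numpy alone, no chainer required:

```python
import numpy as np

# Inputs and output gradient from TestFunctionBackprop.test_backprop.
x1 = np.array([2, 3], np.float32)
x3 = np.array([5, 2], np.float32)
gy2 = np.array([2, 4], np.float32)

# d(y2)/d(x1) = x3 and d(y2)/d(x3) = x1, so by the chain rule:
gx1 = x3 * gy2
gx3 = x1 * gy2
print(gx1)  # [10.  8.]
print(gx3)  # [ 4. 12.]
```

`x2` contributes `d(y2)/d(x2) = 1` but is `int32` with `requires_grad=False`, which is why the tests assert `x2.grad is None` and the backward methods return `gx2 = None`.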
| 34.31992 | 79 | 0.554025 | 1,851 | 17,057 | 4.910859 | 0.063749 | 0.121452 | 0.043014 | 0.030363 | 0.90286 | 0.876018 | 0.846315 | 0.816062 | 0.777888 | 0.723542 | 0 | 0.037285 | 0.360028 | 17,057 | 496 | 80 | 34.389113 | 0.795438 | 0.068066 | 0 | 0.722561 | 0 | 0 | 0.028815 | 0 | 0 | 0 | 0 | 0.002016 | 0.304878 | 1 | 0.082317 | false | 0 | 0.018293 | 0 | 0.204268 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e6b4ecbc6a9296c0ad92cae47610b7d7d7f80a32 | 503,841 | py | Python | fuzz_drivers/fuzz/quickfuzz/test_fuzz_repos.py | gustavopinto/entente | 19b65d8cafd77c198c9c441f4f5e01503360309b | [
"BSD-2-Clause"
] | 5 | 2018-03-20T21:53:38.000Z | 2018-12-28T21:08:47.000Z | fuzz_drivers/fuzz/quickfuzz/test_fuzz_repos.py | gustavopinto/entente | 19b65d8cafd77c198c9c441f4f5e01503360309b | [
"BSD-2-Clause"
] | 14 | 2018-04-09T20:16:00.000Z | 2019-06-11T12:31:10.000Z | fuzz_drivers/fuzz/quickfuzz/test_fuzz_repos.py | gustavopinto/entente | 19b65d8cafd77c198c9c441f4f5e01503360309b | [
"BSD-2-Clause"
] | 12 | 2018-04-06T00:52:24.000Z | 2018-07-10T19:44:16.000Z | import os
import pytest
from fuzz_drivers import * #pylint: disable=W0614
from jsfuzz.fuzzer.validator import validate
from jsfuzz.utils import multicall, constants
from subprocess import call
# @pytest.mark.skip(reason="temporarily disabling")
def test_repos():
path_name = os.path.join(constants.seeds_dir, 'repos')
projects = [
os.path.join(path_name, project_name)
for project_name in os.listdir(path_name)
]
for project_path in projects:
multicall.multicall_directories(
project_path, fuzzer='quickfuzz', validator=validate
)
# def test_repos_100_javascript_projects():
# path_name = os.path.join(constants.seeds_dir, 'repos', '100-javascript-projects')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_12_javascript_quirks():
# path_name = os.path.join(constants.seeds_dir, 'repos', '12-javascript-quirks')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_123_Essential_JavaScript_Interview_Questions():
# path_name = os.path.join(constants.seeds_dir, 'repos', '123-Essential-JavaScript-Interview-Questions')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_30_seconds_of_code():
# path_name = os.path.join(constants.seeds_dir, 'repos', '30-seconds-of-code')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_30DaysOfJavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', '30DaysOfJavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_30daysJavascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', '30daysJavascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_33_js_concepts():
# path_name = os.path.join(constants.seeds_dir, 'repos', '33-js-concepts')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_4bottle():
# path_name = os.path.join(constants.seeds_dir, 'repos', '4bottle')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_500lines():
# path_name = os.path.join(constants.seeds_dir, 'repos', '500lines')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ADE():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ADE')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AMD_feature():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AMD-feature')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_APE_JSF():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'APE_JSF')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_API():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'API')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_APIs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'APIs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Absolution():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Absolution')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AdNauseamV1():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AdNauseamV1')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AdminLTE():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AdminLTE')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AgileDwarf():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AgileDwarf')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Amanda():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Amanda')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Amplitude_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Amplitude-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AndEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AndEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AngularAdmin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AngularAdmin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AngularJS_Animation_Article():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AngularJS-Animation-Article')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AngularJS_SEO_Article():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AngularJS-SEO-Article')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AngularJs_UI():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AngularJs-UI')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AngularOverlay():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AngularOverlay')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AngularSlideables():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AngularSlideables')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AnimatedFrameSlideshow():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AnimatedFrameSlideshow')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AppScroll_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AppScroll.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Apprise():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Apprise')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Apricot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Apricot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AtomicGameEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AtomicGameEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AudioKeys():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AudioKeys')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AuthenticationAngularJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AuthenticationAngularJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Awesome_Design_Tools():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Awesome-Design-Tools')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Awesome_JavaScript_Interviews():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Awesome-JavaScript-Interviews')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_AwesomeChartJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'AwesomeChartJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Babylon_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Babylon.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Backbone_Mediator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Backbone-Mediator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Backbone_Rel():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Backbone.Rel')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Balloon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Balloon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_BananaBread():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'BananaBread')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Behaviour_Assertion_Sheets():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Behaviour-Assertion-Sheets')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Better_Autocomplete():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Better-Autocomplete')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_BitGoJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'BitGoJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Bitcoin_JavaScript_Miner():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Bitcoin-JavaScript-Miner')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_BlogEngine_NET():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'BlogEngine.NET')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Boostnote():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Boostnote')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Bootstrap_Scroll_Modal():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Bootstrap-Scroll-Modal')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Boxer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Boxer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Brackets_InteractiveLinter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Brackets-InteractiveLinter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Brewr_Site():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Brewr-Site')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Broadway():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Broadway')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_BrowserQuest():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'BrowserQuest')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_CPS_OCR_Engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'CPS-OCR-Engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Chakra_Samples():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Chakra-Samples')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ChakraCore():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ChakraCore')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ChariTi():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ChariTi')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Chart_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Chart.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Chatty():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Chatty')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Chroma_Hash():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Chroma-Hash')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ChromeAppHeroes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ChromeAppHeroes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Cinderblock():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Cinderblock')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Citrus_Engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Citrus-Engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_CodeMirror():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'CodeMirror')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_CoderDeck():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'CoderDeck')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Codestrong():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Codestrong')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Cookies():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Cookies')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Cordova():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Cordova')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Crafty():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Crafty')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Croppie():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Croppie')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Crunch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Crunch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_CtCI_6th_Edition_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'CtCI-6th-Edition-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_CtCI_6th_Edition_JavaScript_ES2015():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'CtCI-6th-Edition-JavaScript-ES2015')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_CuraEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'CuraEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_D3xter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'D3xter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_DEPRECATED_node_wit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'DEPRECATED-node-wit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_DEPRECATED_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'DEPRECATED.javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Daily_Interview_Question():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Daily-Interview-Question')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Datatables_Bootstrap3():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Datatables-Bootstrap3')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Datejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Datejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Demo_for_National_Geographic_Forest_Giant():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Demo-for-National-Geographic-Forest-Giant')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Design_Patterns_in_Javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Design-Patterns-in-Javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Device_Art_Generator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Device-Art-Generator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_DoFler():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'DoFler')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_DoloresLabsTechTalk():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'DoloresLabsTechTalk')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Donatello():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Donatello')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Dory():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Dory')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_DragonBonesJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'DragonBonesJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Dualx():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Dualx')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_DynamicGrid():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'DynamicGrid')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ES5_DOM_SHIM():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ES5-DOM-SHIM')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_EasyWebsocket():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'EasyWebsocket')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Editr_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Editr.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Effect_Games():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Effect-Games')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Efficient_Mobile_Web_FE_Development():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Efficient-Mobile-Web-FE-Development')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ElastiStack():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ElastiStack')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Eloquent_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Eloquent-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_EmberSockets():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'EmberSockets')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Emmet_codaplugin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Emmet.codaplugin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Espruino():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Espruino')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_EtherSheet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'EtherSheet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_EventEmitter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'EventEmitter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_FLAnimatedImage():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'FLAnimatedImage')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Fable():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Fable')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Face_Detection_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Face-Detection-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Face_Recognition_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Face-Recognition-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_FaustCplus():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'FaustCplus')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Fe():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Fe')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_FlappyBird_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'FlappyBird-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Flickable_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Flickable.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Flotr2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Flotr2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Font_Awesome():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Font-Awesome')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Force_com_JavaScript_REST_Toolkit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Force.com-JavaScript-REST-Toolkit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_FramerTeachExamples():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'FramerTeachExamples')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Front_End_Checklist():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Front-End-Checklist')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Full_Stack_JavaScript_Engineering():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Full-Stack-JavaScript-Engineering')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Functional_Light_JS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Functional-Light-JS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Fuse():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Fuse')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_G6():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'G6')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_GLOW():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'GLOW')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_GSAP():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'GSAP')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_GameBoy_Online():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'GameBoy-Online')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Garbochess_JS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Garbochess-JS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_GeoMap():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'GeoMap')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Ghost():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Ghost')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_GitHubPopular_SJ():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'GitHubPopular-SJ')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Glisse_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Glisse.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_GloveBox():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'GloveBox')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Google_Earth_Engine_JavaScript_Examples():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Google-Earth-Engine-JavaScript-Examples')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_GraphEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'GraphEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Grid():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Grid')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Guiders_JS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Guiders-JS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_HTML5_Asteroids():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'HTML5-Asteroids')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Hardy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Hardy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Hasher():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Hasher')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Hazel():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Hazel')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Head_First_JavaScript_Programming():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Head-First-JavaScript-Programming')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_HoneyProxy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'HoneyProxy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Honeypot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Honeypot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Hyperglot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Hyperglot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Hyperlapse_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Hyperlapse.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Illustrator_Layer_Exporter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Illustrator-Layer-Exporter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ImageResolver():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ImageResolver')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ImmersiveEngineering():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ImmersiveEngineering')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Interpose():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Interpose')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Isomorphism_react_todomvc():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Isomorphism-react-todomvc')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JQuery_Combinators():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JQuery-Combinators')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JQuery_Mobile_Slide_Menu():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JQuery-Mobile-Slide-Menu')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JS_Interpreter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JS-Interpreter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JS_humanize():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JS-humanize')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JSARToolKit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JSARToolKit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JSIL():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JSIL')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JSLint():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JSLint')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JSMin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JSMin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JSON_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JSON-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JSONloops():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JSONloops')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JSPatch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JSPatch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JSVerbalExpressions():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JSVerbalExpressions')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JSbooks():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JSbooks')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JZoopraxiscope():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JZoopraxiscope')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_21_Days_Challenge():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-21-Days-Challenge')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Algorithms():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Algorithms')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Applications():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Applications')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Canvas_to_Blob():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Canvas-to-Blob')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Completions():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Completions')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_DOM_Tutorial():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-DOM-Tutorial')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Data_Structure():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Data-Structure')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Data_Structures():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Data-Structures')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Demos():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Demos')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Design_Patterns():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Design-Patterns')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Equality_Table():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Equality-Table')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Fundamentals():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Fundamentals')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Garden():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Garden')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_I():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-I')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_ID3_Reader():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-ID3-Reader')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_II():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-II')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_III():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-III')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_IV():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-IV')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Koans():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Koans')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Lessons():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Lessons')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Load_Image():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Load-Image')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_MD5():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-MD5')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_OOP():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-OOP')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Particle_System():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Particle-System')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Playing_Cards():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Playing-Cards')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Quiz():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Quiz')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Scope_Context_Coloring():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Scope-Context-Coloring')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Snake():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Snake')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Templates():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Templates')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_Utilities():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-Utilities')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_autoComplete():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-autoComplete')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_cheat_sheet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-cheat-sheet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_for_Everyone():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-for-Everyone')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript_snippets():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript-snippets')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript1():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript1')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript3():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript3')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript30():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript30')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScript30_liyuechun():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScript30-liyuechun')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptAlgorithms():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptAlgorithms')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptBridge():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptBridge')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptCore_Demo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptCore-Demo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptCore_iOS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptCore-iOS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptEngineSwitcher():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptEngineSwitcher')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptEnhancements():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptEnhancements')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptIssuesStudy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptIssuesStudy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptServices():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptServices')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptStudy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptStudy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptTraining():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptTraining')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptTutorials():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptTutorials')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScriptViewEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScriptViewEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavaScripter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavaScripter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Javascript_Backdoor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Javascript-Backdoor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Javascript_Equal_Height_Responsive_Rows():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Javascript-Equal-Height-Responsive-Rows')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Javascript_Keylogger():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Javascript-Keylogger')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Javascript_Undo_Manager():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Javascript-Undo-Manager')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Javascript_Voronoi():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Javascript-Voronoi')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Javascript_the_Good_Parts_notes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Javascript-the-Good-Parts-notes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Javascript_Net():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Javascript.Net')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JavascriptSubtitlesOctopus():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JavascriptSubtitlesOctopus')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Jcrop():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Jcrop')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Jquery_Price_Format():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Jquery-Price-Format')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JsBridge():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JsBridge')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JsFormat():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JsFormat')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JsSIP():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JsSIP')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_JsSpeechRecognizer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'JsSpeechRecognizer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Jsome():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Jsome')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Jsource():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Jsource')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Juicer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Juicer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Kalendae():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Kalendae')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Kalm():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Kalm')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Kizzy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Kizzy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Kojak():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Kojak')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_LABjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'LABjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_LJSON():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'LJSON')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_LLJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'LLJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Leaflet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Leaflet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_LeapJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'LeapJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Learn_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Learn-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Leo_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Leo-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_LiteAccordion():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'LiteAccordion')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_LokiJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'LokiJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Lottery():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Lottery')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_LumixEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'LumixEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_MGTwitterEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'MGTwitterEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Maple_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Maple.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_MarkdownPresenter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'MarkdownPresenter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Marketing_for_Engineers():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Marketing-for-Engineers')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_MegEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'MegEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Memeye():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Memeye')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_MeteorRider():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'MeteorRider')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Mock():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Mock')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Modern_JavaScript_Curriculum():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Modern-JavaScript-Curriculum')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Modernizr():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Modernizr')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Monorail_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Monorail.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Motrix():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Motrix')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_NG6_todomvc_starter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'NG6-todomvc-starter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_NativeBase():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'NativeBase')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_New_Media_Image_Uploader():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'New-Media-Image-Uploader')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_NiL_JS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'NiL.JS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Numeral_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Numeral-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_OS_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'OS.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Object_observe():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Object.observe')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Octosplit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Octosplit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_OfflineMbTiles():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'OfflineMbTiles')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Oimo_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Oimo.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Openframe():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Openframe')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_OverReact():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'OverReact')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PHP_Vars_To_Js_Transformer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PHP-Vars-To-Js-Transformer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PNGDrive():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PNGDrive')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Parse_SDK_JS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Parse-SDK-JS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PexJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PexJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PhantomXHR():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PhantomXHR')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PhoneNumber_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PhoneNumber.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Phonegap_SQLitePlugin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Phonegap-SQLitePlugin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PhotoSwipe():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PhotoSwipe')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PhysicsJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PhysicsJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PixelJihad():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PixelJihad')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PowerBI_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PowerBI-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PptxGenJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PptxGenJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Presenteer_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Presenteer.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PreventSpider():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PreventSpider')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Programing_In_Javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Programing-In-Javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Proton():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Proton')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PubSubJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PubSubJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Pumpkin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Pumpkin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Pure_JavaScript_HTML5_Parser():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Pure-JavaScript-HTML5-Parser')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_PureSlider():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'PureSlider')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Push_It():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Push-It')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_QuickJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'QuickJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_QuoJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'QuoJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_RCSS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'RCSS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_RN_NavigationExperimental_Redux_Example():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'RN-NavigationExperimental-Redux-Example')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ROMManagerManifest():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ROMManagerManifest')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Radio():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Radio')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_RazorEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'RazorEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Reasons_Craft():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Reasons-Craft')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ReplayLastGoal():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ReplayLastGoal')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_RequireJS_Backbone_Starter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'RequireJS-Backbone-Starter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Revenant():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Revenant')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Rocket_Chat():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Rocket.Chat')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Rucksack():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Rucksack')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_RxJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'RxJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SJSJ():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SJSJ')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ScriptCommunicator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ScriptCommunicator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ScriptCraft():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ScriptCraft')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ScrollMagic():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ScrollMagic')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Scrolld_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Scrolld.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Scroller():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Scroller')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SeetaFaceEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SeetaFaceEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Selecter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Selecter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Semantic_UI():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Semantic-UI')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SendBird_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SendBird-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Serious_Engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Serious-Engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SilkJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SilkJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Sketch_Layer_Tools():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Sketch-Layer-Tools')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SketchGit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SketchGit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SketchSquares():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SketchSquares')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SketchToSwift():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SketchToSwift')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SlickGrid():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SlickGrid')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Snake_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Snake-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Snap_svg():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Snap.svg')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SocialFeed_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SocialFeed.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Sortable():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Sortable')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SpaceEngineers():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SpaceEngineers')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Sparky_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Sparky.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SpeechToText_WebSockets_Javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SpeechToText-WebSockets-Javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Sprint_Challenge__JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Sprint-Challenge--JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Sprint_Challenge_Applied_Javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Sprint-Challenge-Applied-Javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Starling_Framework():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Starling-Framework')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_StatusPage():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'StatusPage')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Stockfish():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Stockfish')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Streamus():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Streamus')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Strelki_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Strelki.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SublimeRubyMotionBuilder():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SublimeRubyMotionBuilder')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_SublimeTextSetupWiki():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'SublimeTextSetupWiki')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Switcheroo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Switcheroo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Syte2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Syte2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_TOMODOkorz():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'TOMODOkorz')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_TableTools():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'TableTools')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Tangle():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Tangle')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Tangram_base():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Tangram-base')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Tangram_component():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Tangram-component')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Tangram2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Tangram2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Tasks():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Tasks')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_TemplateBinding():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'TemplateBinding')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_The_complete_guide_to_modern_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'The-complete-guide-to-modern-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_TheAmazingAudioEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'TheAmazingAudioEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ThreeNodes_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ThreeNodes.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Throttle():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Throttle')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_TiIconicFont():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'TiIconicFont')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_TimelineJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'TimelineJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Titanium_Tools():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Titanium-Tools')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Todo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Todo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_TopLevel():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'TopLevel')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_TouchyBP():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'TouchyBP')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Tracker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Tracker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_TransformJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'TransformJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Tuiter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Tuiter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_TypeScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'TypeScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_UIWebView_TS_JavaScriptContext():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'UIWebView-TS_JavaScriptContext')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_URI_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'URI.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_UTiL():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'UTiL')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_UglifyJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'UglifyJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_UglifyJS2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'UglifyJS2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_UnrealEnginePython():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'UnrealEnginePython')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_V2EX_Vue():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'V2EX-Vue')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_V8():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'V8')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Validator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Validator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Vanilla_JavaScript_Calculator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Vanilla-JavaScript-Calculator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ViewerJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ViewerJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_VvvebJs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'VvvebJs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_WKWebViewJavascriptBridge():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'WKWebViewJavascriptBridge')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Walkable_App():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Walkable-App')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_WalletGenerator_net():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'WalletGenerator.net')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_WasAPlayer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'WasAPlayer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_WeApp_Workflow():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'WeApp-Workflow')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_WebCola():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'WebCola')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_WebODF():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'WebODF')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_WebViewJavascriptBridge():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'WebViewJavascriptBridge')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Webiny():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Webiny')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Webplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Webplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_WickedEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'WickedEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_WickedGrid():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'WickedGrid')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_You_Dont_Know_JS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'You-Dont-Know-JS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_You_Dont_Need_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'You-Dont-Need-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_You_Dont_Need_jQuery():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'You-Dont-Need-jQuery')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_YouCompleteMe():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'YouCompleteMe')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_Zoombox():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'Zoombox')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_a_triangle_everyday():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'a-triangle-everyday')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_abba():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'abba')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_acc_wizard():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'acc-wizard')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ace():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ace')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_acorn():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'acorn')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_acs_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'acs-engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_activejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'activejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_adblock_to_bitcoin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'adblock-to-bitcoin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_adminjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'adminjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_advanced_jquery_boilerplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'advanced-jquery-boilerplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_aima_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'aima-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_airbrake_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'airbrake-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_airtable_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'airtable.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_aks_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'aks-engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_alertify_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'alertify.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_algoliasearch_client_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'algoliasearch-client-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_algorithm_visualizer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'algorithm-visualizer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_allora():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'allora')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_alphabeta():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'alphabeta')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_amazeui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'amazeui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_amazon_cognito_identity_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'amazon-cognito-identity-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_amcharts3():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'amcharts3')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ammo_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ammo.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_amphtml():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'amphtml')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_amplify_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'amplify-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_amqp_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'amqp-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_anchor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'anchor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_aside():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-aside')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_block_ui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-block-ui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_bootstrap_switch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-bootstrap-switch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_chartjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-chartjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_clipboard():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-clipboard')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_collection():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-collection')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_d3_demo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-d3-demo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_deferred_bootstrap():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-deferred-bootstrap')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_directive_g_signin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-directive.g-signin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_drop():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-drop')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_electron():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-electron')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_ellipsis():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-ellipsis')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_examples():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-examples')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_fabric():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-fabric')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_formly():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-formly')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_google_maps():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-google-maps')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_guide_zh():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-guide-zh')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_gulp_browserify_livereload_boilerplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-gulp-browserify-livereload-boilerplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_hateoas():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-hateoas')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_history():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-history')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_indexedDB():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-indexedDB')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_load():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-load')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_md5():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-md5')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_mobile_ui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-mobile-ui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_multiselect():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-multiselect')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_notifications():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-notifications')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_oauth():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-oauth')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_once():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-once')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_parse():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-parse')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_patternfly():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-patternfly')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_promise_buttons():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-promise-buttons')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_react_native_seed():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-react-native-seed')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_redactor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-redactor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_responsive():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-responsive')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_sails_bind():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-sails-bind')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_socket_io_im():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-socket-io-im')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_timeago():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-timeago')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_toArrayFilter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-toArrayFilter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_typeahead():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-typeahead')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_ui_tour():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-ui-tour')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_validation_match():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-validation-match')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_validator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-validator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_video_bg():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-video-bg')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_virtual_scroll():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-virtual-scroll')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_webstorage():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular-webstorage')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular1_systemjs_seed():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular1-systemjs-seed')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular1_webpack_starter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular1-webpack-starter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular2_now():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular2-now')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angular2_the_new_horizon_sample():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angular2-the-new-horizon-sample')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularJS_CafeTownsend():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularJS-CafeTownsend')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs_FlightDashboard():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs-FlightDashboard')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs_geolocation():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs-geolocation')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs_imageupload_directive():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs-imageupload-directive')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs_modal_service():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs-modal-service')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs_performance_tips():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs-performance-tips')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs_periscope():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs-periscope')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs_requirejs_lazy_controllers():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs-requirejs-lazy-controllers')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs_seed_repo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs-seed-repo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs_server():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs-server')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs_utilities():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs-utilities')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_angularjs1():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'angularjs1')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_anime():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'anime')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ansi_canvas():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ansi-canvas')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ant_design():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ant-design')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_apejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'apejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_apexcharts_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'apexcharts.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_app_id_sanity():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'app-id-sanity')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_appengine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'appengine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_appframework():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'appframework')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_aprendendo_padroes_de_projeto_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'aprendendo-padroes-de-projeto-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_archived_morkdown():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'archived-morkdown')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_art_template():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'art-template')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_asch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'asch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_assemblies():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'assemblies')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_asteroid():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'asteroid')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_async():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'async')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_async_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'async-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_async_javascript_cheatsheet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'async-javascript-cheatsheet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_async_javascript_workshop():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'async-javascript-workshop')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_atmosphere_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'atmosphere-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_atom():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'atom')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_atom_javascript_snippets():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'atom-javascript-snippets')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_atom_react_snippets():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'atom-react-snippets')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_atom_turbo_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'atom-turbo-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_atomus():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'atomus')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_aurora_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'aurora.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_auth0_javascript_samples():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'auth0-javascript-samples')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_authenticator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'authenticator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_autobahn_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'autobahn-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_automaton():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'automaton')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_autopolyfiller():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'autopolyfiller')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_autoprefixer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'autoprefixer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_autoprefixer_loader():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'autoprefixer-loader')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ava():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ava')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ava_spec():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ava-spec')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_avsc():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'avsc')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_awesome_chaos_engineering():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'awesome-chaos-engineering')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_awesome_data_engineering():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'awesome-data-engineering')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_awesome_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'awesome-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_awesome_javascript_cn():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'awesome-javascript-cn')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_awesome_javascript_learning():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'awesome-javascript-learning')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_awesome_mac():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'awesome-mac')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_awesome_react_native():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'awesome-react-native')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_awesome_reverse_engineering():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'awesome-reverse-engineering')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_awesome_selfhosted():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'awesome-selfhosted')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_awesome_vscode():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'awesome-vscode')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# Disabled seed-repo fuzz tests, expressed as one parametrized test over the
# repository names instead of one near-identical function per repo
# (pytest's parametrize is assumed; behavior per repo is unchanged).
# _SEED_REPOS = [
#     'awponent', 'aws-lambda-debugger', 'aws2js', 'awsdetailedbilling',
#     'axios', 'babel', 'babel-plugin-webpack-loaders',
#     'babel-webpack-tree-shaking', 'backbone', 'backbone-express-spa',
#     'backbone-jquerymobile', 'backbone-mobile-search', 'backbone-pouch',
#     'backbone.analytics', 'backbone.directives', 'backbone.geppetto',
#     'backbone.googlemaps', 'backbone.memento', 'backbone.modal',
#     'backbonefire', 'backtick', 'bank', 'bbGrid', 'bcoin',
#     'be-MEAN-resources', 'beamjs', 'beatdetektor', 'beginner-javascript',
#     'bem-bl', 'bem-tools', 'benjamin', 'better.js', 'between', 'between.js',
#     'bgiframe', 'bic_calendar', 'big.js', 'bitaddress.org', 'bitcoinjs-lib',
#     'bitcore-lib', 'blessed-contrib', 'bliss', 'blockparty', 'blog.swift',
#     'bluebird', 'bn.javascript.info', 'bn.js', 'boa', 'boilerplatejs',
#     'boilerstrap', 'bonescript', 'boom', 'boombox.js', 'bootstrap',
#     'bootstrap-datepaginator', 'bootstrap-file-input', 'bootstrap-ios7',
#     'bootstrap-tldr', 'bootstrap-toggle-buttons', 'bootstrap-xtra',
#     'bootstrap_isotope', 'botui', 'bower', 'box2d', 'box2d-js', 'bpipe',
#     'bpmn-engine', 'brackets', 'brain.js', 'brfv4_javascript_examples',
#     'brisket', 'broadway', 'browser_pwn', 'browserify', 'brozula',
#     'bselect', 'bubbletree', 'bui-default', 'building-products-with-js',
#     'bus.io', 'bwip-js', 'c3netmon-public', 'cactbot',
#     'calendarHTML-Javascript', 'canibekikked', 'cannon.js', 'canvg',
#     'capt', 'car-lease-demo', 'carbon', 'caribou', 'carto.js', 'cash',
#     'casual', 'cats', 'ccss', 'ccxt', 'cdir', 'cedar', 'chain.js',
#     'chalk', 'chancejs', 'channel.js', 'chaosocket', 'chapters',
#     'charlie.js', 'chartkick', 'chatter', 'cheat-engine', 'cheatsheets',
#     'checkboxes.js', 'checkerboard', 'checkpoint-javascript', 'cheerio',
#     'chessboardjs', 'chevrotain', 'chinese-poetry', 'chroma.js',
#     'chrome-nfc', 'chromecast-gb', 'chrono',
#     'circleci-demo-javascript-express', 'clarifai-javascript',
#     'clarifyjs', 'classnames', 'clean-code-javascript',
#     'clean-code-javascript-tr', 'clean-code-js', 'cleverstack-cli',
#     'cli', 'client-js', 'clipboard.js', 'cloak', 'closure-compiler',
#     'closure-library', 'cloudboost', 'cloudinary_js', 'cms.js',
#     'co-express', 'co-mocha', 'co-request', 'cocos2d-html5',
#     'cocos2d-javascript', 'codaslider', 'code-editor-app',
#     'code-splitting-react-webpack', 'codeblock.js', 'codem-transcode',
#     'codemirror-movie', 'codesurgeon', 'coffeescript', 'coffin',
#     'coloor', 'color', 'color.js', 'colovely', 'combinatorics.js',
#     'command-and-conquer', 'commander.js', 'commonmark.js',
#     'compass-bootstrap', 'complete', 'complete-javascript-course',
#     'complexity-report', 'component-installer', 'composition-examples',
#     'compressorjs', 'computer-science-in-javascript', 'conductor.js',
#     'congo', 'connect-js', 'connect-mongodb', 'contracts', 'converse.js',
#     'coquette', 'cordova-plugin-console',
#     'cordova-plugin-wkwebview-engine', 'couchpubtato', 'coverify',
#     'covert', 'create-react-app', 'creditcard_js', 'creditly', 'cropit',
#     'cropperjs', 'crossbow-sites', 'crossroads.js', 'cruncher',
#     'crypto-js',
# ]
#
#
# @pytest.mark.parametrize('repo_name', _SEED_REPOS)
# def test_repos(repo_name):
#     """Fuzz one seeded repository directory with quickfuzz."""
#     path_name = os.path.join(constants.seeds_dir, 'repos', repo_name)
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_crypto_pouch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'crypto-pouch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_csonv_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'csonv.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_cspjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'cspjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_css_modules_demos():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'css-modules-demos')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_css_modules_require_hook():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'css-modules-require-hook')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_css_regions_polyfill():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'css-regions-polyfill')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_css_reporter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'css-reporter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_css3_mediaqueries_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'css3-mediaqueries-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_cssConsole():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'cssConsole')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_csscritic():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'csscritic')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_cssfilterlab():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'cssfilterlab')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_cst():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'cst')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_cubism():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'cubism')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_cucumber_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'cucumber-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_cult():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'cult')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_currency_io():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'currency.io')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_curso_definitivo_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'curso-definitivo-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_curso_javascript_avanzado():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'curso-javascript-avanzado')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_curso_javascript_ninja():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'curso-javascript-ninja')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_curso_javascript_projeto_usuarios():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'curso-javascript-projeto-usuarios')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_curso_sistemas_web_com_spring_javascript_bootstrap():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'curso-sistemas-web-com-spring-javascript-bootstrap')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_cyclejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'cyclejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_cypress():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'cypress')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_d3():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'd3')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_d3_cloud():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'd3-cloud')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_d3_dot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'd3-dot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_d3_react_squared():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'd3-react-squared')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_d3_starterkit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'd3-starterkit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_d3AngularIntegration():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'd3AngularIntegration')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_d3talk():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'd3talk')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dagre():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dagre')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_darkstripes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'darkstripes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dat_gui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dat.gui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_data_projector():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'data-projector')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_data_structures_and_algorithms_using_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'data_structures_and_algorithms_using_javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dataflow():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dataflow')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_datavore():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'datavore')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_date_fns():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'date-fns')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_daterangepicker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'daterangepicker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_datui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'datui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dayjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dayjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_db():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'db')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_deamdify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'deamdify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_debug_http():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'debug-http')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_decaffeinate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'decaffeinate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_decimal_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'decimal.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_decking():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'decking')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_def_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'def.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_delaunay_fast():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'delaunay-fast')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_delorean():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'delorean')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_demopack():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'demopack')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_demystifying_js_engines():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'demystifying-js-engines')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_den():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'den')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_deno():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'deno')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_denodeify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'denodeify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_deprecated_electrode_archetype_react_app():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'deprecated-electrode-archetype-react-app')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_desantapp():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'desantapp')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_descartes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'descartes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_destiny():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'destiny')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_devoops():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'devoops')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dffptch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dffptch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dialogflow_javascript_client():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dialogflow-javascript-client')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_director():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'director')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_directory_backbone_bootstrap():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'directory-backbone-bootstrap')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_discord_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'discord.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dive_into_python3():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dive-into-python3')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dizzy_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dizzy.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_django_chosen():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'django-chosen')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_django_drf_react_quickstart():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'django-drf-react-quickstart')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_do():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'do')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_doT():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'doT')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_docblockr():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'docblockr')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_docker_node():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'docker-node')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_docker_parse_server_git_deploy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'docker-parse-server-git-deploy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_docker_private_registry():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'docker-private-registry')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dockunit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dockunit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_doclets():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'doclets')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_docs2epub():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'docs2epub')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_doctestjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'doctestjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_documentation():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'documentation')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_docusaurus():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'docusaurus')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dom_elements():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dom-elements')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dotjs_addon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dotjs-addon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dotty():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dotty')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_download():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'download')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_downworthy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'downworthy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dpicker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dpicker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_draft_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'draft-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_draggable():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'draggable')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_draggable_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'draggable.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dragula():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dragula')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_drive_dredit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'drive-dredit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_drive_zipextractor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'drive-zipextractor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_driver_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'driver.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_drools():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'drools')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dropbox_sdk_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dropbox-sdk-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dropchop():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dropchop')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dropfile():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dropfile')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dropmocks():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dropmocks')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dropzone():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dropzone')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dsa_js_data_structures_algorithms_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dsa.js-data-structures-algorithms-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_duktape():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'duktape')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_duktape_android():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'duktape-android')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dva():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dva')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dynamics_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dynamics.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dynamo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dynamo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_dynode():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'dynode')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_earthengine_api():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'earthengine-api')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_easyModal_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'easyModal.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ec2_fleet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ec2-fleet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_echojs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'echojs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_echowaves():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'echowaves')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_eclipse_zencoding():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'eclipse-zencoding')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ect():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ect')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_effective_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'effective-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_egg():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'egg')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_egghead_react_flux_example():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'egghead-react-flux-example')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ej2_javascript_ui_controls():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ej2-javascript-ui-controls')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_elastic():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'elastic')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_elasticsearch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'elasticsearch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_electrode_server():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'electrode-server')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_electron():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'electron')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_electron_accelerator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'electron-accelerator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_elixirscript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'elixirscript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_elliptic():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'elliptic')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_elm_hot_loader():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'elm-hot-loader')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_eloquente_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'eloquente-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_animate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-animate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_async_button():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-async-button')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_auth():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-auth')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_burger_menu():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-burger-menu')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_cli_i18n():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-cli-i18n')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_cli_pace():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-cli-pace')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_cli_pagination():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-cli-pagination')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_cpm():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-cpm')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_crud():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-crud')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_data_django_rest_adapter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-data-django-rest-adapter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_data_route():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-data-route')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_forms():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-forms')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_graphql_adapter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-graphql-adapter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_islands():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-islands')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_json_api():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-json-api')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_parse_adapter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-parse-adapter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_resource():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-resource')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ember_rest():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ember-rest')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# Seed repositories under constants.seeds_dir/'repos' to fuzz with quickfuzz.
# The per-repo stubs all repeat one pattern, so they are expressed as a single
# parametrized test over the repo names (assumes pytest; import it when enabling).
# import pytest
#
# _REPOS = (
#     'ember-restless', 'ember-select-2', 'ember-skeleton', 'ember-touch',
#     'ember.js', 'en.javascript.info', 'encog-javascript', 'engine',
#     'engine.io', 'engineer-manager', 'engineercms', 'engineering-blogs',
#     'engineering-management', 'enigma.js', 'enquire.js', 'enzyme',
#     'epf', 'es-papp', 'es.javascript.info', 'es5-shim',
#     'es6-babel-browserify-boilerplate', 'es6-design-patterns',
#     'es6-project-starter-kit', 'es6-react-mixins',
#     'es6tutorial', 'esbuild', 'escargot', 'esdoc',
#     'esercizi-di-programmazione-javascript', 'esformatter-jsx',
#     'eslint', 'eslint-config-defaults',
#     'eslint-formatter-pretty', 'esperanto', 'espree', 'essage',
#     'essential-javascript-links', 'essential-js-design-patterns',
#     'evee.js', 'evernote-sdk-js',
#     'ews-javascript-api', 'ex-navigator', 'example-backbone-app', 'example-node',
#     'excel-builder.js', 'excellentexport', 'execjs', 'exercises',
#     'exokit', 'exoskeleton', 'express', 'express-angular',
#     'express-di', 'express-happiness', 'express-partials', 'express-train',
#     'exterminate', 'eyeballs.js', 'f8app', 'fabric.js',
#     'facebook-circles', 'facebook-js-sdk', 'faced', 'fairy',
#     'faker.js', 'falcor', 'falkor-archived', 'fancy-zoom',
#     'fann.js', 'fantasy-land', 'fastify', 'fbt',
#     'fe.javascript', 'feather', 'feature-engineering-book', 'feelingrestful-theme',
#     'fela', 'felt', 'fetch', 'fhir.js',
#     'fibjs', 'fieldval-js', 'filepond', 'fileupload',
#     'finitio', 'firebase-angular-starter-pack', 'fireloop.io', 'firequery',
#     'fireunit', 'fireworks.js', 'fishbone.js', 'fiveby',
#     'fixto', 'flaskr-tdd', 'flatpickr', 'flexibility',
#     'flight', 'flipcountdown', 'flipload', 'flot',
#     'flotsam', 'flow-jsdoc', 'flowable-engine', 'flowy',
#     'flux', 'flux-router-component', 'fmt-obj', 'fn.js',
#     'font-awesome-webpack', 'forest', 'formacao-javascript-mestre-jedi', 'formaline',
#     'formhub', 'formio.js', 'formspree', 'fourk.js',
#     'foxjs', 'frame.js', 'framer-sketch-boilerplate', 'framer-templates',
#     'framework', 'framework7', 'framework7-react-base', 'freeCodeCamp',
#     'front-end-interview-handbook', 'front-end-separate', 'front-ui', 'frozen',
#     'frpjs', 'fruitmachine', 'fseditor', 'fuckitjs',
#     'fullPage.js', 'fullproof', 'fullstack', 'fullstack-javascript',
#     'fullstack-javascript-architecture', 'functional-javascript',
#     'functional-javascript-workshop', 'functional-programming-javascript',
#     'fuzzilli', 'galleria', 'gamblers-dice', 'gameQuery',
#     'ganache-cli', 'ganon', 'gantt', 'gatsby',
#     'gauge.js', 'geierlein', 'generator-angular-go-martini', 'generator-angulpify',
#     'generator-jhipster', 'generator-jhipster-react',
#     'generator-jquery-boilerplate', 'generator-phaser',
#     'generator-polymer', 'generator-redux', 'geo-googledocs', 'geohash-js',
#     'getdocs', 'getting-started-with-javascript', 'gh-emoji', 'ghcjs',
#     'ghost-town', 'gif.js', 'gijgo', 'gilded-rose-javascript',
#     'git-watcher', 'git.js', 'gitbook', 'gitflowanimated',
#     'github-notetaker-egghead', 'github-s3-deploy', 'gitlet', 'gl-matrix',
#     'gloria', 'gmail.js', 'gnode', 'go-duktape',
#     'go-for-javascript-developers', 'go-v8', 'goangular', 'godot',
#     'godot-docs', 'goodnight', 'google-api-javascript-client',
# )
#
# @pytest.mark.parametrize('repo', _REPOS)
# def test_repos(repo):
#     path_name = os.path.join(constants.seeds_dir, 'repos', repo)
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_google_plus_extension_jsapi():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'google-plus-extension-jsapi')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_google_spreadsheet_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'google-spreadsheet-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_google_tts():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'google-tts')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gplus_quickstart_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gplus-quickstart-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gpu_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gpu.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_graphitejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'graphitejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_graphql_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'graphql-engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_graphql_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'graphql-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gremlin_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gremlin-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_growl4rails():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'growl4rails')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_bake():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-bake')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_bower_requirejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-bower-requirejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_browserify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-browserify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_closure_compiler():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-closure-compiler')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_contrib_compress():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-contrib-compress')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_contrib_csslint():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-contrib-csslint')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_css():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-css')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_email_boilerplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-email-boilerplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_express_server():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-express-server')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_git():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-git')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_githooks():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-githooks')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_html_snapshot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-html-snapshot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_html2js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-html2js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_hub():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-hub')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_imagine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-imagine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_includes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-includes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_inline_css():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-inline-css')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_jekyll():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-jekyll')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_markdown():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-markdown')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_nodemon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-nodemon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_phantomas():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-phantomas')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_premailer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-premailer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_reload():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-reload')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_requirejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-requirejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_s3():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-s3')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_grunt_text_replace():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'grunt-text-replace')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gua_game_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gua.game.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_guitar_bro():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'guitar_bro')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gulp():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gulp')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gulp_awspublish():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gulp-awspublish')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gulp_filter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gulp-filter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gulp_ignore():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gulp-ignore')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gulp_jscs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gulp-jscs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gulp_ng_config():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gulp-ng-config')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gulp_react():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gulp-react')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gulpfiction():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gulpfiction')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gulpman():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gulpman')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_gury():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'gury')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_h5ive_DEPRECATED():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'h5ive-DEPRECATED')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hackathon_casperjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hackathon-casperjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hackathon_starter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hackathon-starter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hacker_scripts():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hacker-scripts')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hacking_with_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hacking-with-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hackynote():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hackynote')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_haloword():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'haloword')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hammer_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hammer.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_handlebars_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'handlebars.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_handsontable():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'handsontable')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hapi_universal_redux():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hapi-universal-redux')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_haskell_ide_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'haskell-ide-engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_haunt():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'haunt')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_headtrackr():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'headtrackr')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_heapbox():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'heapbox')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_heartbeat_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'heartbeat.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_heatmap_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'heatmap.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hello_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hello-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_henryyan_github_com():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'henryyan.github.com')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hermes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hermes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hexo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hexo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_highcharts():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'highcharts')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_highlight():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'highlight')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_highlight_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'highlight.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hiring_engineers():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hiring-engineers')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hiring_without_whiteboards():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hiring-without-whiteboards')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_history():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'history')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_history_of_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'history-of-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hitagi_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hitagi.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hivemind():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hivemind')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hls_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hls.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hn_ng2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hn-ng2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hn_reader():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hn-reader')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_holla():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'holla')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_homebridge():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'homebridge')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hoodie():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hoodie')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hookbox():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hookbox')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_horizon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'horizon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hotdot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hotdot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hotspots():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hotspots')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_how_javascript_works():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'how-javascript-works')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_how_to_sane():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'how-to-sane')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_howler_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'howler.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hpm():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hpm')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_html_minifier():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'html-minifier')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_html2canvas():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'html2canvas')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_html2jade_website():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'html2jade-website')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_html5_boilerplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'html5-boilerplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_htmljs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'htmljs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_http_client():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'http-client')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hubot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hubot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hummingbird():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hummingbird')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_huntsman():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'huntsman')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_husky():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'husky')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hyper_pokemon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hyper-pokemon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hyperglue():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hyperglue')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hyperscript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hyperscript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_hypher():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'hypher')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iClient_JavaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iClient-JavaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iCreator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iCreator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iD():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iD')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iMessageWebClient():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iMessageWebClient')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iOS_HTML5_Tethering():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iOS-HTML5-Tethering')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iOSAppReverseEngineering():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iOSAppReverseEngineering')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ice():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ice')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_icons():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'icons')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iconv_lite():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iconv-lite')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_idiomatic_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'idiomatic.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ie8():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ie8')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iioEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iioEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_imaskjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'imaskjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_imgr():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'imgr')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_imgsible():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'imgsible')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_immer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'immer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_immstruct():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'immstruct')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_immutable_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'immutable-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_impresionante_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'impresionante-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_impress_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'impress.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_incubator_echarts():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'incubator-echarts')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_inferno():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'inferno')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_infiniScroll_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'infiniScroll.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_infiniwall():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'infiniwall')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_inline_manifest_webpack_plugin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'inline-manifest-webpack-plugin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_inscribe_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'inscribe.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_instachrome():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'instachrome')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_instafeed_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'instafeed.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_instagram_javascript_sdk():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'instagram-javascript-sdk')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_intermediate_javascript_assessment():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'intermediate-javascript-assessment')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_interview_questions_in_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'interview-questions-in-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_intl_tel_input():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'intl-tel-input')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_intro_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'intro-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_inu():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'inu')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ioBroker_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ioBroker.javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ionic_demo_resort_app():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ionic-demo-resort-app')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ionic_modal_select():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ionic-modal-select')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ionic_ocr_example():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ionic-ocr-example')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iota_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iota.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iptv():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iptv')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ipython_vimception():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ipython-vimception')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_irecord():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'irecord')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iroha_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iroha-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iron_cli():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iron-cli')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_is_loading():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'is-loading')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_is_up_cli():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'is-up-cli')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_iscroll():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'iscroll')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_isomorphic_lab():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'isomorphic-lab')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_isomorphic_redux():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'isomorphic-redux')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_isotope_perfectmasonry():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'isotope-perfectmasonry')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_isparta_loader():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'isparta-loader')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jCryption():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jCryption')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQote2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQote2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Chrono():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Chrono')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Custom_File_Input():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Custom-File-Input')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Mobile_Boilerplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Mobile-Boilerplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Parse():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Parse')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Shadow():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Shadow')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Simple_Timer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Simple-Timer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Smart_Auto_Complete():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Smart-Auto-Complete')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Stickem():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Stickem')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Timepicker_Addon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Timepicker-Addon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Validation_Engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Validation-Engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_Verbose_Calendar():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-Verbose-Calendar')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_basic_arithmetic_plugin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery-basic-arithmetic-plugin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_loadScroll():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery.loadScroll')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jQuery_stayInWebApp():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jQuery.stayInWebApp')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jWorkflow():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jWorkflow')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jaadi_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jaadi.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jarallax():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jarallax')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jarves():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jarves')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jasmine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jasmine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jasmine_fixture():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jasmine-fixture')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jasmine_sinon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jasmine-sinon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jasmine_async():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jasmine.async')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javaScript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javaScript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javaee_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javaee-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_1_afternoon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-1-afternoon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_101():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-101')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_2_afternoon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-2-afternoon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_2_afternoon_2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-2-afternoon-2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_2_afternoon_3():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-2-afternoon-3')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_3_afternoon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-3-afternoon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_5_lodash():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-5-lodash')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_action():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-action')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_airbnb():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-airbnb')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_algorithms():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-algorithms')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_allonge():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-allonge')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_arithmetic_lab_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-arithmetic-lab-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_arrays_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-arrays-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_arrays_js_intro_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-arrays-js-intro-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_arrays_lab_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-arrays-lab-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_astar():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-astar')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_barcode():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-barcode')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_basic_assessment():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-basic-assessment')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_basics():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-basics')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_biginteger():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-biginteger')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_bignum():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-bignum')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_boilerplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-boilerplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_bootcamp():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-bootcamp')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_challenges():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-challenges')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_challenges_book():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-challenges-book')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_conferences():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-conferences')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_crypto_library():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-crypto-library')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_datastructures_algorithms():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-datastructures-algorithms')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_debug():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-debug')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_decorators():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-decorators')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_design_patterns_for_humans():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-design-patterns-for-humans')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_detect_element_resize():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-detect-element-resize')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_development_environment():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-development-environment')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_dom():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-dom')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_drones():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-drones')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_ebooks():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-ebooks')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_empire():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-empire')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_enlightenment():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-enlightenment')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_error_logging():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-error-logging')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_errors_notifier():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-errors-notifier')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_exercises():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-exercises')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_fetch_lab():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-fetch-lab')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_fix_the_scope_lab_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-fix-the-scope-lab-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_fix_the_scope_lab_js_apply_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-fix-the-scope-lab-js-apply-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_fix_the_scope_lab_js_intro_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-fix-the-scope-lab-js-intro-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_for_cats():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-for-cats')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_foundations():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-foundations')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_gauntlet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-gauntlet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_guessing_game():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-guessing-game')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_hide_and_seek_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-hide-and-seek-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_in_14_minutes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-in-14-minutes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_in_one_pic():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-in-one-pic')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_inspirate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-inspirate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_interview():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-interview')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_interview_questions():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-interview-questions')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_intro_to_functions_lab_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-intro-to-functions-lab-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_intro_to_functions_lab_js_apply_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-intro-to-functions-lab-js-apply-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_intro_to_looping_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-intro-to-looping-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_intro_to_looping_js_intro_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-intro-to-looping-js-intro-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_journey():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-journey')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_jpeg_encoder():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-jpeg-encoder')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_jquery_ruble():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-jquery.ruble')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_kit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-kit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_koans():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-koans')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_last_fm_api():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-last.fm-api')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_lessons():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-lessons')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_libraries_syntax_vim():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-libraries-syntax.vim')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_linkify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-linkify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_logging_lab_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-logging-lab-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_logging_lab_js_intro_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-logging-lab-js-intro-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_malware_collection():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-malware-collection')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_mobile_desktop_geolocation():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-mobile-desktop-geolocation')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_natural_sort():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-natural-sort')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_notes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-notes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_obfuscator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-obfuscator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_obfuscator_ui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-obfuscator-ui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_objects_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-objects-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_objects_js_intro_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-objects-js-intro-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_objects_lab_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-objects-lab-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_opentimestamps():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-opentimestamps')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_path():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-path')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_patterns():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-patterns')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_piano():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-piano')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_pong():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-pong')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_profesional():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-profesional')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_professional():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-professional')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_questions():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-questions')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_quiz():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-quiz')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_racer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-racer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_risingstars():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-risingstars')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_robotics():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-robotics')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_rock_dodger_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-rock-dodger-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_rsa():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-rsa')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_samples():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-samples')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_sandbox_console():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-sandbox-console')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_sdk():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-sdk')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_sdk_design():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-sdk-design')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_simon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-simon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_snakes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-snakes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_starter_course():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-starter-course')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_state_machine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-state-machine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_strings_lab_js_apply_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-strings-lab-js-apply-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_strings_lab_js_intro_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-strings-lab-js-intro-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_style_guide():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-style-guide')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_test_reporter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-test-reporter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_testing_best_practices():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-testing-best-practices')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_tests():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-tests')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_tetris():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-tetris')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_time_ago():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-time-ago')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_tiny_platformer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-tiny-platformer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_tips_and_tidbits():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-tips-and-tidbits')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_to_purescript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-to-purescript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_todo_list_tutorial():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-todo-list-tutorial')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_tools_tmbundle():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-tools.tmbundle')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_tutorial():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-tutorial')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_tutorial_cn_old():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-tutorial-cn-old')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_tutorial_ru():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-tutorial-ru')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_typescript_langserver():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-typescript-langserver')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_unit_testing_with_mocha():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-unit-testing-with-mocha')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_videos_ru_2018():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-videos-ru-2018')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_web():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-web')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_web_srv():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-web-srv')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_winwheel():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-winwheel')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_workbook():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-workbook')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_zh():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript-zh')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_patterns():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript.patterns')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_tmbundle():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript.tmbundle')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript101():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript101')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript5_mini():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript5-mini')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript6_examples():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript6_examples')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_computer_science_exercises():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript_computer_science_exercises')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_curriculum():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript_curriculum')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascript_playground():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascript_playground')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascriptcookbook():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascriptcookbook')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascripting():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascripting')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascriptstuff_db():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascriptstuff-db')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_javascriptvisualizer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'javascriptvisualizer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jerryscript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jerryscript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jest():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jest')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jest_webdriver():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jest-webdriver')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jinja():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jinja')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jint():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jint')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jison():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jison')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_johnny_five():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'johnny-five')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_joint():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'joint')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_joplin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'joplin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_joshfire_framework():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'joshfire-framework')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jotted():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jotted')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jqm_pagination():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jqm-pagination')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jqmobile_metro_theme():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jqmobile-metro-theme')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_approach():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-approach')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_countdown():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-countdown')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_deserialize():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-deserialize')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_facebook_multi_friend_selector():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-facebook-multi-friend-selector')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_fastLiveFilter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-fastLiveFilter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_form_builder_plugin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-form-builder-plugin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_html5_upload():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-html5-upload')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_inlog():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-inlog')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_intelligist():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-intelligist')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_lightbox():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-lightbox')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_pjax():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-pjax')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_postmessage():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-postmessage')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_requestAnimationFrame():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-requestAnimationFrame')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_scrollintoview():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-scrollintoview')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_serialize_object():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-serialize-object')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_simple_slider():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-simple-slider')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_timepicker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-timepicker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_timing():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-timing')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_video_extend():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery-video-extend')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_backgroundSize_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery.backgroundSize.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_diamonds_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery.diamonds.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_entwine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery.entwine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_eventsource():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery.eventsource')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_inlineedit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery.inlineedit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_resizeend():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery.resizeend')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_selection():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery.selection')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_serialScroll():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery.serialScroll')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_snapscroll():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery.snapscroll')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jquery_tweetable_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jquery.tweetable.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jqueryrotate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jqueryrotate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jrac():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jrac')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_assignments():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-assignments')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_base64():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-base64')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_beautify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-beautify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_bits():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-bits')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_by_examples():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-by-examples')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_cookie():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-cookie')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_git():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-git')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_iso8601():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-iso8601')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_mind():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-mind')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_mindmap():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-mindmap')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_must_watch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-must-watch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_plugin_circliful():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-plugin-circliful')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_projects():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-projects')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_stack_from_scratch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-stack-from-scratch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_testing_boilerplates():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-testing-boilerplates')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_training():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-training')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_vuln_db():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-vuln-db')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_yaml():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js-yaml')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js_org():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js.org')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_js2coffee():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'js2coffee')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsPDF():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsPDF')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsStudy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsStudy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsTag():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsTag')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsandbox():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsandbox')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsbeautify_for_chrome():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsbeautify-for-chrome')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsbin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsbin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jscodeshift():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jscodeshift')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsctags():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsctags')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsdiff():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsdiff')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsdoc():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsdoc')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsduck():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsduck')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsep():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsep')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsfeat():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsfeat')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsfuck():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsfuck')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsgif():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsgif')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jslogo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jslogo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsmind():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsmind')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsmpeg():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsmpeg')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsmvc_pres():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsmvc-pres')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsnes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsnes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_json_server():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'json-server')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsondiffpatch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsondiffpatch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsonpath():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsonpath')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jspm_react():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jspm-react')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsqrcode():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsqrcode')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsrepl():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsrepl')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsrt():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsrt')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jstat():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jstat')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jstorm():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jstorm')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsts():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsts')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jstutorial():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jstutorial')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jsvu():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jsvu')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jszip():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jszip')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_juggernaut_plugin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'juggernaut_plugin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jumly():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jumly')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jungle():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jungle')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_just_not_sorry():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'just-not-sorry')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_jxcore():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'jxcore')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kademlia():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kademlia')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kadoh():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kadoh')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_karma():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'karma')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kbengine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kbengine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kd_tree_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kd-tree-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kept():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kept')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ketchup_plugin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ketchup-plugin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_keycodes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'keycodes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kickstart_meteor_react():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kickstart-meteor-react')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kickstart_meteor_react_router():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kickstart-meteor-react-router')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kickup():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kickup')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kinetic():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kinetic')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kitchensink():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kitchensink')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kiwi():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kiwi')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kline():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kline')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kmdjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kmdjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ko_javascript_info():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ko.javascript.info')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_koa():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'koa')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_koa_middlewares():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'koa-middlewares')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_koa_project_tpl():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'koa-project-tpl')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_koa_proxy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'koa-proxy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_koa_resource_router():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'koa-resource-router')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kopi_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kopi.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kotojs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kotojs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kratko_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kratko.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kss_rails():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kss-rails')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_kubernetes_engine_samples():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'kubernetes-engine-samples')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lab_es6_javascript_koans():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lab-es6-javascript-koans')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lab_javascript_advanced_algorithms():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lab-javascript-advanced-algorithms')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lab_javascript_basic_algorithms():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lab-javascript-basic-algorithms')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lab_javascript_chronometer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lab-javascript-chronometer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lab_javascript_clue():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lab-javascript-clue')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lab_javascript_functions_and_arrays():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lab-javascript-functions-and-arrays')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lab_javascript_greatest_movies():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lab-javascript-greatest-movies')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lab_javascript_koans():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lab-javascript-koans')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lab_javascript_memory_game():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lab-javascript-memory-game')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lab_javascript_vikings():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lab-javascript-vikings')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_labelmask():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'labelmask')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ladda_angular():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ladda-angular')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lambda_complex():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lambda-complex')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lambda_packager():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lambda-packager')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_language_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'language-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_laraflat():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'laraflat')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_laravel_blade_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'laravel-blade-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_laravel_jsvalidation():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'laravel-jsvalidation')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_layui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'layui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lazyload():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lazyload')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lazypipe():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lazypipe')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lazysizes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lazysizes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_leapjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'leapjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_learn_fullstack_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'learn-fullstack-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_learn_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'learn-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_learnGitBranching():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'learnGitBranching')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lectric():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lectric')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_leetcode():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'leetcode')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_leetcode_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'leetcode-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lell():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lell')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lerna():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lerna')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_less_js_middleware():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'less.js-middleware')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lets_code_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lets_code_javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_letsrate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'letsrate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_libcanvas():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'libcanvas')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_libsignal_protocol_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'libsignal-protocol-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_libv8():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'libv8')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_libxmljs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'libxmljs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lifxjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lifxjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lightgallery_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lightgallery.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lighthouse():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lighthouse')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lightstep_tracer_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lightstep-tracer-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_limestone():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'limestone')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_linq():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'linq')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_liquid_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'liquid.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_liquidfun():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'liquidfun')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_listloading():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'listloading')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_live_cljs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'live-cljs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_live_log_analyzer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'live-log-analyzer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_livecss():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'livecss')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lively():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lively')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lmd():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lmd')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_loadrunner():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'loadrunner')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_localForage():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'localForage')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lodash():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lodash')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_log_sys():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'log-sys')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lookforward():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lookforward')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lottie_web():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lottie-web')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_loupe():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'loupe')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_love_webplayer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'love-webplayer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lrInfiniteScroll():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lrInfiniteScroll')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lvlDragDrop():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lvlDragDrop')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_lz_string():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'lz-string')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mach():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mach')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_machine_learning_for_software_engineers():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'machine-learning-for-software-engineers')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mage():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mage')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_magic_iterable():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'magic-iterable')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_magix_inspector():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'magix-inspector')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_magixjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'magixjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mailmao():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mailmao')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_make_me():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'make-me')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_manim():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'manim')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mantra_cli():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mantra-cli')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mantra_sample_blog_app():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mantra-sample-blog-app')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_map_stream():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'map-stream')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mapbox_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mapbox.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mapquery():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mapquery')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_maps_api_for_javascript_examples():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'maps-api-for-javascript-examples')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_markdown_here():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'markdown-here')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_markdown_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'markdown-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_markdown_live():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'markdown-live')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_marked():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'marked')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_markerclustererplus():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'markerclustererplus')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_marktext():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'marktext')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_marquette():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'marquette')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mastering_modular_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mastering-modular-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_matchmedia_ng():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'matchmedia-ng')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_material():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'material')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_material_theme_appbar():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'material-theme-appbar')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_material_ui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'material-ui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_material_ui_vue():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'material-ui-vue')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_materialize():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'materialize')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_materials():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'materials')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mathjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mathjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_matrix_react_sdk():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'matrix-react-sdk')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_md2react():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'md2react')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mechanic():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mechanic')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meeting_ticker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meeting-ticker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_megamanjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'megamanjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_melonJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'melonJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_memdiff():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'memdiff')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mermaid():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mermaid')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mers():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mers')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_chat_tutorial():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-chat-tutorial')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_collection_helpers():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-collection-helpers')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_ddp_analyzer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-ddp-analyzer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_pg():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-pg')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_polymer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-polymer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_react_layout():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-react-layout')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_react_router_ssr():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-react-router-ssr')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_rethinkdb():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-rethinkdb')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_spin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-spin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_tupperware():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-tupperware')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_typeahead():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-typeahead')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_vue():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-vue')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_webpack():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-webpack')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_meteor_webpack_react():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'meteor-webpack-react')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_metro():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'metro')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_metronome():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'metronome')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_micro_starter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'micro-starter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_microcosm():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'microcosm')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_midas():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'midas')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_minimal_gltf_loader():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'minimal-gltf-loader')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_minimatch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'minimatch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_minwidth_relocate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'minwidth-relocate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mithril_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mithril.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mixpanel_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mixpanel-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ml():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ml')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mobile_packages():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mobile-packages')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mobile_ui_patterns():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mobile-ui-patterns')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mobx():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mobx')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mobx_reactor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mobx-reactor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mocha():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mocha')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_modalbox():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'modalbox')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_modelizr():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'modelizr')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_modern_backbone_starterkit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'modern-backbone-starterkit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_modern_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'modern-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_modulargrid():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'modulargrid')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_modulejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'modulejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_modules():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'modules')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mojs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mojs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_moment():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'moment')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_monaco_editor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'monaco-editor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mongodb_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mongodb-engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mongoose():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mongoose')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mongoose_q():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mongoose-q')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_monocles():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'monocles')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_monorouter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'monorouter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_moobile_core():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'moobile-core')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mootools_bootstrap():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mootools-bootstrap')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mootools_mobile():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mootools-mobile')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mostly_adequate_guide():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mostly-adequate-guide')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mousetrap():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mousetrap')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mout():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mout')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_move_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'move.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_movie_board():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'movie-board')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_movies_javascript_bolt():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'movies-javascript-bolt')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_moving_things_with_javascript_bootcamp_prep_000():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'moving-things-with-javascript-bootcamp-prep-000')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mpvue():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mpvue')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mr_doc():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mr-doc')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_msgpack_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'msgpack-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_msgpack_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'msgpack-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_msgpack_lite():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'msgpack-lite')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_msgraph_sdk_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'msgraph-sdk-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mtg_sdk_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mtg-sdk-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mtui_react():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mtui-react')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mullet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mullet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_multilevel():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'multilevel')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_multiline():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'multiline')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_must_watch_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'must-watch-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mustache_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mustache.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mux():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mux')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_mvi_example():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'mvi-example')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_napajs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'napajs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_napp_alloy_adapter_restsql():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'napp.alloy.adapter.restsql')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nash():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nash')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_naturalScroll():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'naturalScroll')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ndm():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ndm')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ndu():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ndu')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nearley():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nearley')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nectarjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nectarjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_neo4j_javascript_driver():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'neo4j-javascript-driver')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_neo4js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'neo4js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_neocortex():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'neocortex')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nerve():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nerve')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_neunode():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'neunode')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_neuron():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'neuron')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_new_relic_boxes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'new-relic-boxes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_newbie_training():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'newbie-training')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_newsmonger():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'newsmonger')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_next_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'next.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ng_classy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ng-classy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ng_simplePagination():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ng-simplePagination')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ng_youtube_embed():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ng-youtube-embed')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ngForce():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ngForce')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ngMeteor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ngMeteor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ngMidwayTester():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ngMidwayTester')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ngVideo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ngVideo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nightmare():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nightmare')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_amf():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-amf')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_cloudfiles():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-cloudfiles')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_codein():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-codein')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_comment_macros():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-comment-macros')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_csswring():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-csswring')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_dbmon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-dbmon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_dronestream():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-dronestream')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_errno():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-errno')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_fast():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-fast')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_google_distance():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-google-distance')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_hashish():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-hashish')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_host():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-host')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_icalendar():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-icalendar')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_int64():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-int64')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_jscs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-jscs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_jsonwebtoken():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-jsonwebtoken')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_lessons():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-lessons')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_libvirt():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-libvirt')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_markdown():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-markdown')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_memcache():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-memcache')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_millenium():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-millenium')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_mime():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-mime')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_modules():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-modules')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_mongolian():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-mongolian')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_neo4j_template():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-neo4j-template')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_paperboy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-paperboy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_persistence():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-persistence')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_quickcheck():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-quickcheck')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_romulus():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-romulus')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_rtc_peer_connection():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-rtc-peer-connection')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_serialport():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-serialport')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_tar_gz():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-tar.gz')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_term_list():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-term-list')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_twilio():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-twilio')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_xml():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-xml')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_xml2js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-xml2js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_ytdl_core():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node-ytdl-core')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_dbslayer_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node.dbslayer.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nodeRunner():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nodeRunner')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_node_alipay():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'node_alipay')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nodebestpractices():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nodebestpractices')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nodecms():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nodecms')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nodejs_intro():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nodejs-intro')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nodemailer_smtp_transport():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nodemailer-smtp-transport')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nodember():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nodember')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nodemon():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nodemon')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_noderce():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'noderce')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nodeshot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nodeshot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nodewiki():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nodewiki')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nodrr():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nodrr')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_noflo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'noflo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nools():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nools')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_normalizr():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'normalizr')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_normalizr_immutable():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'normalizr-immutable')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_notes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'notes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_notifer_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'notifer.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nprogress():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nprogress')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nssocket():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nssocket')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nucleus():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nucleus')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nude_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nude.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_numeric():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'numeric')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_numjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'numjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_numscrubberjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'numscrubberjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nutella_scrape():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nutella-scrape')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nuxt_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nuxt.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nw_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nw.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nwmatcher():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nwmatcher')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nya_bootstrap_select():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nya-bootstrap-select')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nylas_mail():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nylas-mail')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_nyroModal():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'nyroModal')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_o_O():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'o_O')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_obey():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'obey')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_objgrep():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'objgrep')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ocrad_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ocrad.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_octotree():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'octotree')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_odoo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'odoo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_olcPixelGameEngine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'olcPixelGameEngine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_on_media_query():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'on-media-query')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_onejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'onejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_opal():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'opal')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_open_bounty():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'open-bounty')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_open_source_search_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'open-source-search-engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_opencvjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'opencvjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_openpgpjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'openpgpjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_opentracing_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'opentracing-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_opentype_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'opentype.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_orb():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'orb')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_orbited2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'orbited2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_orca():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'orca')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_oriento():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'oriento')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_osgjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'osgjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_otto():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'otto')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_over_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'over-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_owt_client_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'owt-client-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_p2_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'p2.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_paho_mqtt_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'paho.mqtt.javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pangu_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pangu.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_parallax():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'parallax')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_parcel():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'parcel')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pareidoloop():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pareidoloop')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_parse_angular_patch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'parse-angular-patch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_parse_server():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'parse-server')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_particle_excess_demo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'particle-excess-demo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_particles_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'particles.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_passport():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'passport')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pastalog():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pastalog')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pathmenu_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pathmenu.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pavlov():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pavlov')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pdf_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pdf.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pdfkit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pdfkit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pdfmake():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pdfmake')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_peerdium():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'peerdium')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pegjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pegjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_percolatestudio_com():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'percolatestudio.com')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_perk():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'perk')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_permission_site():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'permission.site')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_phantom_jasmine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'phantom-jasmine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_phantom_render_stream():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'phantom-render-stream')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_phaser():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'phaser')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_phoneformat_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'phoneformat.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_phonegap():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'phonegap')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_phonegap_desktop():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'phonegap-desktop')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_photobooth_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'photobooth-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_picard():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'picard')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pin_cushion():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pin-cushion')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pipelines_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pipelines-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pipelines_javascript_docker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pipelines-javascript-docker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pithy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pithy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pixastic():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pixastic')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pixel_picker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pixel-picker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pl():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pl')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_placeholder():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'placeholder')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_placeholder_enhanced():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'placeholder-enhanced')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_planck_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'planck.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_plexus_form():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'plexus-form')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_plotly_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'plotly.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ploy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ploy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_plv8():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'plv8')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_plyr():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'plyr')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pm2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pm2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pm2_webshell():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pm2-webshell')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pnotify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pnotify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pocketsphinx_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pocketsphinx.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_poi():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'poi')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_polycrypt():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'polycrypt')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_polyfill():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'polyfill')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_polyfill_webcomponents():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'polyfill-webcomponents')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_polymer_dev():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'polymer-dev')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_polymer_tutorial():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'polymer-tutorial')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_popcorntime_smarttv():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'popcorntime-smarttv')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_popmotion():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'popmotion')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_popper_core():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'popper-core')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_post_forking():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'post-forking')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_postcss():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'postcss')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_postcss_icss_values():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'postcss-icss-values')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_postmark_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'postmark.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pq():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pq')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_practical_modern_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'practical-modern-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pre3d():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pre3d')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_preact():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'preact')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_prepack():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'prepack')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_prettier():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'prettier')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_prettyCheckable():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'prettyCheckable')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_prismic_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'prismic-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_processing_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'processing-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_profvis():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'profvis')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_project_guidelines():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'project-guidelines')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_projector():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'projector')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_promises_book():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'promises-book')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_promptu_menu():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'promptu-menu')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_propel():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'propel')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_protobuf_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'protobuf.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_protographql():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'protographql')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_prototype():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'prototype')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_proxyquireify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'proxyquireify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pug():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pug')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pull_to_reload():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pull-to-reload')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pulse():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pulse')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pumpify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pumpify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_puppeteer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'puppeteer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_purescript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'purescript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pusher_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pusher-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_put_selector():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'put-selector')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pynYNAB():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pynYNAB')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_pypyjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'pypyjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_python_vs_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'python-vs-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_q():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'q')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_qooxdoo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'qooxdoo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_qr_scanner():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'qr-scanner')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_qrcodejs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'qrcodejs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_qss():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'qss')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_quaggaJS():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'quaggaJS')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_quail():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'quail')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_querystring():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'querystring')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_queuer_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'queuer.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_quick_javascript_switcher():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'quick-javascript-switcher')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_quick_ng_repeat():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'quick-ng-repeat')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_quickblox_javascript_sdk():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'quickblox-javascript-sdk')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_quill():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'quill')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_qunit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'qunit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_r_token():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'r-token')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_r2d2b2g():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'r2d2b2g')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rabbot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'rabbot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_racket():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'racket')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_radium_grid():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'radium-grid')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ragadjust():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ragadjust')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rainbow():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'rainbow')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ramda():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ramda')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_raphael():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'raphael')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_raphael_svg_import():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'raphael-svg-import')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_raphael_serialize():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'raphael.serialize')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_raphy_charts():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'raphy-charts')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ratchet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ratchet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rave():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'rave')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_razzle():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'razzle')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rd3():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'rd3')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_reD3():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'reD3')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_boilerplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-boilerplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_bootstrap():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-bootstrap')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_canvas():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-canvas')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_component_boilerplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-component-boilerplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_dapp_boilerplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-dapp-boilerplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_decorators():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-decorators')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_developer_roadmap():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-developer-roadmap')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_echarts_modules():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-echarts-modules')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_flight():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-flight')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_fullstack_skeleton():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-fullstack-skeleton')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_hooks():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-hooks')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_hot_loader_loader():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-hot-loader-loader')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_imation():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-imation')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_inline_grid():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-inline-grid')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_intercom():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-intercom')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_javascript_to_typescript_transform():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-javascript-to-typescript-transform')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_json_editor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-json-editor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_lightning_design_system():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-lightning-design-system')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_look():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-look')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_magician():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-magician')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_modules():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-modules')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_motion():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-motion')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_clean_form():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-clean-form')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_complex_nav():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-complex-nav')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_elements():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-elements')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_gift_app():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-gift-app')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_hot_redux_starter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-hot-redux-starter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_loading_container():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-loading-container')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_refresh_infinite_listview():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-refresh-infinite-listview')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_selectablesectionlistview():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-selectablesectionlistview')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_smart_scroll_view():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-smart-scroll-view')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_swiper_animated():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-swiper-animated')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_tabbar():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-tabbar')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_web():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-web')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_native_webview_bridge():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-native-webview-bridge')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_organism():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-organism')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_page_transitions():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-page-transitions')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_pagify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-pagify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_pivot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-pivot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_pledge():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-pledge')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_redux():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-redux')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_redux_example():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-redux-example')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_router():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-router')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_ruby_china():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-ruby-china')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_select():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-select')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_semantify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-semantify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_showroom_client():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-showroom-client')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_spring():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-spring')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_stampit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-stampit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_starter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-starter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_starter_kit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-starter-kit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_static_plate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-static-plate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_stylesheet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-stylesheet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_testing_mocha_jsdom():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-testing-mocha-jsdom')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_time():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-time')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_transform_catch_errors():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-transform-catch-errors')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_tutorial():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-tutorial')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_tutorial_todos():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-tutorial-todos')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_validation_mixin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-validation-mixin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_virtualized():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-virtualized')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_way_getting_started():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-way-getting-started')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_way_immutable_flux():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-way-immutable-flux')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_webpack_example():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-webpack-example')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_react_with_hooks():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'react-with-hooks')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_reactcards():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'reactcards')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_readium_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'readium-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_real_world_javascript_interview_questions():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'real-world-javascript-interview-questions')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_really_simple_color_picker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'really-simple-color-picker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_realtime_playground():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'realtime-playground')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_realworld():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'realworld')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rebound_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'rebound-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_reclare():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'reclare')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_recompose():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'recompose')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_red_dwarf():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'red-dwarf')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redash():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redash')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redis_node():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redis-node')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redmine_lightbox():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redmine_lightbox')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux_act_async():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux-act-async')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux_await():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux-await')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux_easy_app():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux-easy-app')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux_fractal():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux-fractal')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux_friendlist_demo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux-friendlist-demo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux_react_navigation_demos():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux-react-navigation-demos')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux_requests():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux-requests')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux_saga():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux-saga')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux_saga_tester():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux-saga-tester')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_redux_webpack_es6_boilerplate():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'redux-webpack-es6-boilerplate')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_refluxjs_todo():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'refluxjs-todo')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_refraction():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'refraction')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_refunk():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'refunk')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_registry():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'registry')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_regression_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'regression-js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_relate():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'relate')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_relay():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'relay')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_relay_sink():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'relay-sink')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rellax():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rellax')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_remtail():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'remtail')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_render_markdown_javascript():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'render-markdown-javascript')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_replpad():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'replpad')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_reporting_engine():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'reporting-engine')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_request():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'request')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_require_analyzer():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'require-analyzer')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_requirejs():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'requirejs')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_requirejs_library():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'requirejs-library')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rereduce():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rereduce')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_reselect():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'reselect')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_resin():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'resin')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_resourceful():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'resourceful')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_responsive_mockups():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'responsive_mockups')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rest():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rest')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rest_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rest.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_restmvc_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'restmvc.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_resume_github_com():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'resume.github.com')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_reveal_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'reveal.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_reverse_engineering():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'reverse-engineering')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_revue():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'revue')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rfc():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rfc')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_riak_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'riak-js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rickshaw():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rickshaw')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_riot():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'riot')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_riot_isomorphic():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'riot-isomorphic')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_riotjs_startkit():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'riotjs-startkit')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rndr_me():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rndr.me')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rocambole():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rocambole')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rollerblade():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rollerblade')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rollup():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rollup')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rome():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rome')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_roslibjs():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'roslibjs')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rts():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rts')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ru_javascript_info():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'ru.javascript.info')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ruby_javascript_data_viz():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'ruby_javascript_data_viz')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_run_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'run-js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_runloop():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'runloop')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rxdb():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rxdb')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_rxjs():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'rxjs')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_s3upload_coffee_javascript():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 's3upload-coffee-javascript')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sa_sdk_javascript():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sa-sdk-javascript')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sailng():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sailng')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sails():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sails')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sails_auth():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sails-auth')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sails_hook_autoreload():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sails-hook-autoreload')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sails_react_example():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sails-react-example')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sammy():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sammy')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sample_arti():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sample-arti')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sample_hapi_rest_api():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sample-hapi-rest-api')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sample_app_rails_4():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sample_app_rails_4')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_san():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'san')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sanctuary_def():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sanctuary-def')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sass_color_picker():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sass-color-picker')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_satisfy():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'satisfy')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_savepublishing():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'savepublishing')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sc_crud_sample():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sc-crud-sample')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_scala_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'scala-js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_scooch():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'scooch')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_scoreunder():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'scoreunder')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_scotch():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'scotch')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_screencap():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'screencap')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_screw_unit():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'screw-unit')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_scribbletune():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'scribbletune')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_script_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'script.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_scriptular():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'scriptular')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_scrollport_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'scrollport-js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_scrollreveal():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'scrollreveal')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_search_source():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'search-source')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_searx():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'searx')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_select2():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'select2')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sennajs_com():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sennajs.com')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sense_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sense-js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sense_old():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sense_old')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sentry_javascript():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sentry-javascript')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sequelize():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sequelize')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_serialize_javascript():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'serialize-javascript')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_serverless():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'serverless')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_services():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'services')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_services_engineering():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'services-engineering')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_servo():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'servo')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_servo_shell():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'servo-shell')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_shaka_player():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'shaka-player')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_shapado():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'shapado')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_shape_form():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'shape-form')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_shapesmith():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'shapesmith')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sharp():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sharp')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sheetjs():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sheetjs')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sherlogjs():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sherlogjs')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_shift_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'shift-js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_shore():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'shore')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_short_and_sweet():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'short-and-sweet')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_shower():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'shower')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sigma_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sigma.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_simpl():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'simpl')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_simple_frontend_boilerplate():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'simple-frontend-boilerplate')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_simple_statistics():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'simple-statistics')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_simpleCRM():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'simpleCRM')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_simpledb():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'simpledb')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sinon():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sinon')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sixflix():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sixflix')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sizzle():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sizzle')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sjcl():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sjcl')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sketch_data_studio():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sketch-data-studio')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sketch_relabel_button():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sketch-relabel-button')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sketch_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sketch.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_skills_based_javascript_intro_to_flow_control_bootcamp_prep_000():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'skills-based-javascript-intro-to-flow-control-bootcamp-prep-000')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_skills_based_javascript_intro_to_flow_control_js_intro_000():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'skills-based-javascript-intro-to-flow-control-js-intro-000')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_skrollr_decks():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'skrollr-decks')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_skulpt():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'skulpt')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_slack_news():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'slack-news')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_slate():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'slate')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_slick():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'slick')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_slush_angular():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'slush-angular')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_smart_contract():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'smart_contract')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_smile():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'smile')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_smokescreen():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'smokescreen')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_smokestack():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'smokestack')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_smoosh():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'smoosh')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_snabbt_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'snabbt.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_snippets():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'snippets')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_snowplow_javascript_tracker():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'snowplow-javascript-tracker')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_soca():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'soca')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_social_engineer_toolkit():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'social-engineer-toolkit')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_socket_io():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'socket.io')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_socket_io_titanium():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'socket.io-titanium')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_socketcluster_client():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'socketcluster-client')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sockjs_client():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sockjs-client')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sodajs():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sodajs')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_solidityx_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'solidityx-js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_soundcloud_javascript():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'soundcloud-javascript')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_space_tweet():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'space-tweet')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_spacedrop():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'spacedrop')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_spark():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'spark')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sparky():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sparky')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_spazcore():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'spazcore')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_speak_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'speak.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_specter():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'specter')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_spectrum():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'spectrum')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_speech_javascript_sdk():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'speech-javascript-sdk')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_speed_monitor():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'speed-monitor')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_spicetify_cli():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'spicetify-cli')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_spine():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'spine')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_spine_todos():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'spine.todos')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_spm2():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'spm2')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_spmx():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'spmx')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sqip():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sqip')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sql_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'sql.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_squel():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'squel')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_srcset_polyfill():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'srcset-polyfill')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stackedit():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stackedit')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_standard():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'standard')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_starter_javascript_exercicios():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'starter-javascript-exercicios')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stat_distributions_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stat-distributions-js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stats_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stats.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_statsd():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'statsd')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_steal():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'steal')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_steamSummerMinigame():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'steamSummerMinigame')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stellarium_web_engine():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stellarium-web-engine')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_step_by_step_frontend():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'step-by-step-frontend')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stimulus():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stimulus')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stochator():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stochator')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stockfish_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stockfish.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_storybook_addon_jest():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'storybook-addon-jest')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_storybook_addon_knobs():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'storybook-addon-knobs')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_storyshots():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'storyshots')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stp_pediff():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stp.pediff')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_strapi():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'strapi')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_strapi_sdk_javascript():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'strapi-sdk-javascript')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stream_handbook():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stream-handbook')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stream_spec():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stream-spec')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_streamgraph_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'streamgraph.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_streamie():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'streamie')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_strftime():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'strftime')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stride():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stride')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_string_js():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'string.js')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stripe_meteor():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stripe-meteor')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_strman():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'strman')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_strong_pm():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'strong-pm')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_structuring_backbone_with_requirejs_and_marionette():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'structuring-backbone-with-requirejs-and-marionette')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stuhome():
#     path_name = os.path.join(constants.seeds_dir, 'repos', 'stuhome')
#     multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_styled_components():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'styled-components')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_styled_theme():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'styled-theme')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_styler():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'styler')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_stylis_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'stylis.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sublime_javascript_snippets():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'sublime-javascript-snippets')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sublime_text_refactor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'sublime-text-refactor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_substance_text():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'substance-text')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_substituteteacher_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'substituteteacher.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_suggest():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'suggest')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sunnybaby():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'sunnybaby')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_superagent():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'superagent')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_supergrep():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'supergrep')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_superherojs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'superherojs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_supertest_as_promised():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'supertest-as-promised')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_superui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'superui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_superviews_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'superviews.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_survey_library():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'survey-library')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_svelte():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'svelte')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_svgo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'svgo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_swagger_ui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'swagger-ui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_swaggerize_express():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'swaggerize-express')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sweet_core():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'sweet-core')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sweet_justice():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'sweet-justice')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sweetalert():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'sweetalert')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_swf2js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'swf2js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_swig():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'swig')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_swiper():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'swiper')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_symbol_sdk_typescript_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'symbol-sdk-typescript-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_sync_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'sync-engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_systemjs_seed():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'systemjs-seed')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_syze():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'syze')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_t1_runtime():
# path_name = os.path.join(constants.seeds_dir, 'repos', 't1-runtime')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tEmbO():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tEmbO')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_taberareloo():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'taberareloo')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tabler():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tabler')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tabulator():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tabulator')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tap_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tap.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tapchat():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tapchat')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_taro():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'taro')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_task_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'task.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_teamchatviz():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'teamchatviz')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tech_interview_handbook():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tech-interview-handbook')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tedit():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tedit')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_telepat_api():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'telepat-api')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_template_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'template.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_template7():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'template7')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_templayed_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'templayed.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_term_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'term.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_terminus():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'terminus')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tern_meteor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tern-meteor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_terrain():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'terrain')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_terse_webpack():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'terse-webpack')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tesserace():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tesserace')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tesseract():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tesseract')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tesseract_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tesseract.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_testable_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'testable-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_testing_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'testing-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_text_mask():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'text-mask')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tfjs_core():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tfjs-core')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_the_javascript_curriculum():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'the-javascript-curriculum')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_the_super_tiny_compiler():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'the-super-tiny-compiler')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_thedaywefightback_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'thedaywefightback.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_thejsway():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'thejsway')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_themoviedb_javascript_library():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'themoviedb-javascript-library')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_thingiview_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'thingiview.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_thinky():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'thinky')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_thorax_seed():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'thorax-seed')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_three_orbit_controls():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'three-orbit-controls')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_three_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'three.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_thumbbot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'thumbbot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_thumbd():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'thumbd')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_thunderbird():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'thunderbird')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tic_tac_toe_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tic-tac-toe-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tie():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tie')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tile5():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tile5')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_timeframe():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'timeframe')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_timezone():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'timezone')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tiny_slider():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tiny-slider')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tinymce_rails_imageupload():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tinymce-rails-imageupload')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tip_cards():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tip_cards')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tips():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tips')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tire():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tire')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_titanium_facebook_slide_menu():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'titanium-facebook-slide-menu')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_titanium_developer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'titanium_developer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_titanium_mobile():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'titanium_mobile')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tmlib_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tmlib.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_toastr():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'toastr')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_todomvc():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'todomvc')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_toe_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'toe.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_token_based_auth_frontend():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'token-based-auth-frontend')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tonal():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tonal')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_topup():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'topup')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tor_fingerprint():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tor-fingerprint')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_touche():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'touche')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tpl_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tpl.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tr8n():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tr8n')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_traceur_compiler():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'traceur-compiler')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_transducers_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'transducers-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_transformer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'transformer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_travel_RN():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'travel-RN')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tree_grid_directive():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tree-grid-directive')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_trek():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'trek')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_trello_calendar():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'trello-calendar')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_trial_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'trial-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_trilha_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'trilha-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_trophymanager():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'trophymanager')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ttf_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ttf.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tufte_graph():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tufte-graph')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tui_calendar():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tui.calendar')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tumblr_downloader():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tumblr-downloader')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_turbine_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'turbine.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_turbulenz_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'turbulenz_engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_turf():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'turf')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_tween_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'tween.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_twostroke():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'twostroke')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_twss_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'twss.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_typeahead_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'typeahead.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_typed_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'typed.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_typeofnan_javascript_quizzes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'typeofnan-javascript-quizzes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_typist_jquery():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'typist-jquery')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_uBlock():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'uBlock')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ud549():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ud549')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ui_progress_bar():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ui-progress-bar')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ui_tinymce():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ui-tinymce')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_uiji():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'uiji')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_uilayer():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'uilayer')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_underscore():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'underscore')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_underscore_inflection():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'underscore.inflection')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_underscore_string():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'underscore.string')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_understanding_npm():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'understanding-npm')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_unexpected_react():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'unexpected-react')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_unexpected_react_shallow():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'unexpected-react-shallow')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_uni_app():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'uni-app')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_unicorn():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'unicorn')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_unify():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'unify')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_universal_react_tutorial():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'universal-react-tutorial')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_universal_redux_template():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'universal-redux-template')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_university():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'university')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_unprecedented_midwife():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'unprecedented-midwife')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_unsplash_source_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'unsplash-source-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_unused_My_Wallet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'unused-My-Wallet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_uppy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'uppy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_uri_templates():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'uri-templates')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_use_amd():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'use-amd')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_utils():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'utils')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_uuid():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'uuid')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_v2ex_ext():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'v2ex.ext')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_v3_utility_library():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'v3-utility-library')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_v7():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'v7')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_v8():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'v8')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_v86():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'v86')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_v8eval():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'v8eval')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_v8js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'v8js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_v8n():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'v8n')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_v8pp():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'v8pp')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vagueTime_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vagueTime.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_validate_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'validate.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_validator_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'validator.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vant_weapp():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vant-weapp')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vector_river_map():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vector-river-map')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vektor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vektor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_velocity():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'velocity')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_velositey():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'velositey')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_veria():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'veria')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_verlet_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'verlet-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_video_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'video.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_viewer_javascript_tutorial():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'viewer-javascript-tutorial')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_viewer_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'viewer.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_viewerjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'viewerjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vim_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vim-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vim_recipes():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vim-recipes')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vim_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vim.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vivus():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vivus')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vm_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vm.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_voca():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'voca')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vogue():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vogue')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_voie():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'voie')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vot_ar():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vot.ar')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_voxel_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'voxel-engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_voxelengine3():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'voxelengine3')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vscode_es7_javascript_react_snippets():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vscode-es7-javascript-react-snippets')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vscode_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vscode-javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vticker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vticker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vtree():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vtree')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vts_browser_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vts-browser-js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_autocomplete():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-autocomplete')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_cli():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-cli')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_clip():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-clip')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_devtools():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-devtools')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_file_upload_component():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-file-upload-component')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_mini_shop():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-mini-shop')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_region_picker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-region-picker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_router():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-router')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_server():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-server')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_smart_table():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-smart-table')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vue_starter():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vue-starter')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vuepress():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vuepress')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vuera():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vuera')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vuex():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vuex')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_vuln_javascript():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'vuln_javascript')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wagn():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wagn')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_walkabout_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'walkabout.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wax():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wax')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wdui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wdui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_weapp_session():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'weapp-session')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_web_bundle():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'web-bundle')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_web3_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'web3.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webRTCCopy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webRTCCopy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webSlide():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webSlide')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webkit_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webkit.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webpack():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webpack')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webpack_MultiplePage():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webpack-MultiplePage')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webpack_chrome_extension():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webpack-chrome-extension')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webpack_dashboard():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webpack-dashboard')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webpack_messages():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webpack-messages')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webpack_serve():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webpack-serve')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webpack_uglify_parallel():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webpack-uglify-parallel')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webpacker():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webpacker')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webservice_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webservice.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_webtorrent():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'webtorrent')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wechat_helper():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wechat-helper')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_weex_learning():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'weex-learning')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wekan():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wekan')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wepy():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wepy')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_weui_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'weui.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wheel_menu():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wheel-menu')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_whiskey():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'whiskey')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_whitewater_mobile_video():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'whitewater-mobile-video')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wikifetch():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wikifetch')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wind():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wind')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wink():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wink')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_winston():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'winston')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_with_react_hooks():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'with-react-hooks')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wizardry():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wizardry')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_workbox():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'workbox')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_workshop_js_funcional_free():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'workshop-js-funcional-free')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wp_calypso():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wp-calypso')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wpilot():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wpilot')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_ws():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'ws')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wtfjs():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wtfjs')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wujb():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wujb')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wx():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wx')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wxapp_devFrame():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wxapp-devFrame')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wysihat():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wysihat')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wysihat_engine():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wysihat-engine')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_wysiwyg_editor():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'wysiwyg-editor')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_x_spreadsheet():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'x-spreadsheet')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_xdomain():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'xdomain')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_xiaotiantian():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'xiaotiantian')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_xjst():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'xjst')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_xkcd_pixels():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'xkcd-pixels')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_xmpp_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'xmpp.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_xregexp():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'xregexp')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_xscroll():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'xscroll')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_xssor2():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'xssor2')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_xto():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'xto')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_xui():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'xui')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_yapi():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'yapi')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_yarn():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'yarn')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_youkuhtml5playerbookmark():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'youkuhtml5playerbookmark')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_yours_bitcoin():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'yours-bitcoin')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_yuidoc():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'yuidoc')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_z():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'z')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_zelect():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'zelect')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_zeroclickinfo_goodies():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'zeroclickinfo-goodies')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_zethos():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'zethos')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_zh_javascript_info():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'zh.javascript.info')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_zip_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'zip.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# def test_repos_zone_js():
# path_name = os.path.join(constants.seeds_dir, 'repos', 'zone.js')
# multicall.multicall_directories(path_name, fuzzer='quickfuzz', validator=validate)
# File: DataGenerator.py (repo: parkerhuynh/Speech-to-Text, license: Apache-2.0)
import pandas as pd
import numpy as np
import utils
from tensorflow.keras.preprocessing.sequence import pad_sequences
import tensorflow as tf
from tensorflow import keras
import config
from scipy.io import wavfile
import python_speech_features
class CleanDataGenerator(keras.utils.Sequence):
'Generates data for Keras'
def __init__(self, dataset, data_parameter = config.data_parameters,
chr_mapping = utils.chr_mapping()):
'Initialization'
self.dataset = dataset
self.batch_size = data_parameter["batch_size"]
self.char_mapping = chr_mapping
self.on_epoch_end()
def __len__(self):
'Denotes the number of batches per epoch'
return int(np.floor(len(self.dataset) / self.batch_size))
def __getitem__(self, index):
'Generate one batch of data'
# Generate indexes of the batch
indexes = self.indexes[index*self.batch_size:(index+1)*self.batch_size]
batch_data = [self.dataset.iloc[k] for k in indexes]
audios, labels = self.__data_generation(batch_data)
return audios, labels
'text processing'
def text_to_idx(self, text):
text = text.lower()
idx = []
for chr in text:
if chr in self.char_mapping :
idx.append(self.char_mapping[chr])
return idx
'normalize raw audio'
def normalize(self, audio):
gain = 1.0 / (np.max(np.abs(audio)) + 1e-5)
return audio * gain
'standardize FBANK'
def standardize(self,features):
mean = np.mean(features)
std = np.std(features)
return (features - mean) / std
'FBAnk processing'
def audio_to_features(self, audio):
sf, audio = wavfile.read(f"./data/LJSpeech-1.1/wavs/{audio}.wav")
audio = self.normalize(audio.astype(np.float32))
audio = (audio * np.iinfo(np.int16).max).astype(np.int16)
feat, energy = python_speech_features.fbank(
audio, nfilt=160, winlen=0.02,winstep=0.01, winfunc = np.hanning)
features = np.log(feat)
return self.standardize(features)
def on_epoch_end(self):
'Updates indexes after each epoch'
self.indexes = np.arange(len(self.dataset))
def __data_generation(self, batch_data):
audios = []
labels = []
label_len = []
audio_len = []
for filename, transcript in batch_data:
audio = self.audio_to_features(filename)
audios.append(audio)
audio_len.append(len(audio))
label = self.text_to_idx(transcript)
labels.append(label)
label_len.append(len(label))
max_audio_len = max(audio_len)
max_label_len = max(label_len)
audios = pad_sequences(audios, maxlen = max_audio_len, dtype='float32', value=0, padding='post')
labels = pad_sequences(labels, maxlen = max_label_len, value=28, padding='post')
return audios, labels
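# A minimal pure-Python sketch (illustrative names) of the batch padding that
# __data_generation delegates to keras' pad_sequences: every sequence in the
# batch is right-padded ("post") with a pad value up to the length of the
# longest sequence, e.g. value=28 for the label sequences above.

```python
def pad_post(sequences, pad_value):
    # Pad each sequence on the right with pad_value up to the batch maximum.
    max_len = max(len(s) for s in sequences)
    return [list(s) + [pad_value] * (max_len - len(s)) for s in sequences]

batch_labels = [[3, 7, 2], [5], [1, 4]]
padded = pad_post(batch_labels, 28)
# padded == [[3, 7, 2], [5, 28, 28], [1, 4, 28]]
```

# Padding to the per-batch maximum (rather than a global maximum) keeps each
# batch as small as its longest member, which is why audio_len and label_len
# are tracked per batch above.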
class NoiseDataGenerator(keras.utils.Sequence):
'Generates data for Keras'
    def __init__(self, dataset, noise_sound, batch_size=32,
                 noise_rate=config.data_parameters["noise_rate"],
                 chr_mapping=utils.chr_mapping()):
        'Initialization'
        self.dataset = dataset
        self.batch_size = batch_size
        self.char_mapping = chr_mapping
self.noise_sound = noise_sound
self.noise_rate = noise_rate
self.on_epoch_end()
def __len__(self):
'Denotes the number of batches per epoch'
return int(np.floor(len(self.dataset) / self.batch_size))
def __getitem__(self, index):
'Generate one batch of data'
# Generate indexes of the batch
indexes = self.indexes[index*self.batch_size:(index+1)*self.batch_size]
batch_data = [self.dataset.iloc[k] for k in indexes]
audios, labels = self.__data_generation(batch_data)
return audios, labels
'text processing'
def text_to_idx(self, text):
text = text.lower()
idx = []
for chr in text:
if chr in self.char_mapping :
idx.append(self.char_mapping[chr])
return idx
'normalize raw audio'
def normalize(self, audio):
gain = 1.0 / (np.max(np.abs(audio)) + 1e-5)
return audio * gain
'standardize FBANK'
def standardize(self,features):
mean = np.mean(features)
std = np.std(features)
return (features - mean) / std
'FBAnk processing'
def audio_to_features(self, audio):
#Load speech sound
sf, audio = wavfile.read(f"./LJSpeech-1.1/wavs/{audio}.wav")
#Trim battle sound
i = np.random.randint(int(len(self.noise_sound) - len(audio)))
try:
trim_audio = self.noise_sound[i:i+len(audio)][:, 0]
except:
trim_audio = self.noise_sound[i:i+len(audio)]
#Add noise sound to speech sound
audio = audio*(1-self.noise_rate) + trim_audio*self.noise_rate
audio = self.normalize(audio.astype(np.float32))
audio = (audio * np.iinfo(np.int16).max).astype(np.int16)
feat, energy = python_speech_features.fbank(
audio, nfilt=160, winlen=0.02,winstep=0.01, winfunc = np.hanning)
features = np.log(feat)
return self.standardize(features)
def on_epoch_end(self):
'Updates indexes after each epoch'
self.indexes = np.arange(len(self.dataset))
def __data_generation(self, batch_data):
audios = []
labels = []
label_len = []
audio_len = []
for filename, transcript in batch_data:
audio = self.audio_to_features(filename)
audios.append(audio)
audio_len.append(len(audio))
label = self.text_to_idx(transcript)
labels.append(label)
label_len.append(len(label))
max_audio_len = max(audio_len)
max_label_len = max(label_len)
audios = pad_sequences(audios, maxlen = max_audio_len, dtype='float32', value=0, padding='post')
labels = pad_sequences(labels, maxlen = max_label_len, value=28, padding='post')
        return audios, labels
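# A hedged, pure-Python sketch of the additive noise augmentation performed in
# NoiseDataGenerator.audio_to_features: a random window of the noise track,
# the same length as the speech, is blended sample-by-sample at noise_rate.
# Function and variable names here are illustrative, not part of the module.

```python
import random

def mix_noise(speech, noise, noise_rate, rng=None):
    # Pick a random window of the noise track matching the speech length,
    # then blend: (1 - noise_rate) * speech + noise_rate * noise.
    rng = rng or random.Random()
    i = rng.randrange(len(noise) - len(speech) + 1)
    window = noise[i:i + len(speech)]
    return [s * (1 - noise_rate) + n * noise_rate
            for s, n in zip(speech, window)]

speech = [1.0, 1.0, 1.0]
noise = [0.0] * 16
mixed = mix_noise(speech, noise, noise_rate=0.3, rng=random.Random(0))
# every sample: 1.0 * (1 - 0.3) + 0.0 * 0.3, i.e. about 0.7
```

# The try/except in audio_to_features handles the same window step for
# stereo noise (take channel 0) versus mono noise.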
# -*- coding: utf-8 -*-
# File: tests/test_model/test_backbone/test_bottleneck.py
# (repos: ZJCV/PyCls, likyoo/ZCls; license: Apache-2.0)
"""
@date: 2020/11/21 下午3:33
@file: test_bottleneck.py
@author: zj
@description:
"""
import torch
import torch.nn as nn
from zcls.model.backbones.resnet.bottleneck import Bottleneck
def test_bottleneck():
data = torch.randn(1, 256, 56, 56)
in_planes = 256
out_planes = 128
expansion = Bottleneck.expansion
    # no downsampling
stride = 1
down_sample = nn.Sequential(
nn.Conv2d(in_planes, out_planes * expansion, kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(out_planes * expansion),
)
model = Bottleneck(in_planes, out_planes, stride, down_sample)
print(model)
outputs = model(data)
print(outputs.shape)
assert outputs.shape == (1, 512, 56, 56)
    # with downsampling
stride = 2
down_sample = nn.Sequential(
nn.Conv2d(in_planes, out_planes * expansion, kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(out_planes * expansion),
)
model = Bottleneck(in_planes, out_planes, stride, down_sample)
print(model)
outputs = model(data)
print(outputs.shape)
assert outputs.shape == (1, 512, 28, 28)
# 32x4d
    # with downsampling
stride = 2
down_sample = nn.Sequential(
nn.Conv2d(in_planes, out_planes * expansion, kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(out_planes * expansion),
)
model = Bottleneck(in_planes, out_planes, stride, down_sample, 32, 4)
print(model)
outputs = model(data)
print(outputs.shape)
assert outputs.shape == (1, 512, 28, 28)
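# A small sketch of the shape arithmetic these assertions check (helper name
# is illustrative): a Bottleneck outputs out_planes * expansion channels
# (expansion = 4 here), and a stride-2 block halves the spatial size; for the
# 1x1 stride-s convolution in the downsample path, out = (in - 1) // s + 1.

```python
def bottleneck_out_shape(in_hw, out_planes, stride, expansion=4):
    # Channels expand by `expansion`; spatial size shrinks by `stride`.
    out_hw = (in_hw - 1) // stride + 1
    return (out_planes * expansion, out_hw, out_hw)

shape1 = bottleneck_out_shape(56, 128, stride=1)  # matches (512, 56, 56)
shape2 = bottleneck_out_shape(56, 128, stride=2)  # matches (512, 28, 28)
```

# This is why every variant below, grouped or not, asserts 512 output
# channels while only the stride-2 cases assert 28x28.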
def test_attention_bottleneck(attention_type='SqueezeAndExcitationBlock2D'):
with_attention = 1
reduction = 16
data = torch.randn(3, 256, 56, 56)
in_planes = 256
out_planes = 128
expansion = Bottleneck.expansion
    # no downsampling
stride = 1
down_sample = nn.Sequential(
nn.Conv2d(in_planes, out_planes * expansion, kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(out_planes * expansion),
)
model = Bottleneck(in_planes, out_planes, stride, down_sample)
print(model)
outputs = model(data)
print(outputs.shape)
assert outputs.shape == (3, 512, 56, 56)
    # with downsampling
stride = 2
down_sample = nn.Sequential(
nn.Conv2d(in_planes, out_planes * expansion, kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(out_planes * expansion),
)
model = Bottleneck(in_planes=in_planes,
out_planes=out_planes,
stride=stride,
down_sample=down_sample,
with_attention=with_attention,
reduction=reduction,
attention_type=attention_type
)
print(model)
outputs = model(data)
print(outputs.shape)
assert outputs.shape == (3, 512, 28, 28)
# 32x4d
    # with downsampling
stride = 2
down_sample = nn.Sequential(
nn.Conv2d(in_planes, out_planes * expansion, kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(out_planes * expansion),
)
model = Bottleneck(in_planes=in_planes,
out_planes=out_planes,
stride=stride,
down_sample=down_sample,
groups=32,
base_width=4,
with_attention=with_attention,
reduction=reduction,
attention_type=attention_type
)
print(model)
outputs = model(data)
print(outputs.shape)
assert outputs.shape == (3, 512, 28, 28)
def test_avg_bottleneck():
data = torch.randn(1, 256, 56, 56)
in_planes = 256
out_planes = 128
expansion = Bottleneck.expansion
    # no downsampling
stride = 1
down_sample = nn.Sequential(
nn.Conv2d(in_planes, out_planes * expansion, kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(out_planes * expansion),
)
model = Bottleneck(in_planes, out_planes, stride, down_sample, use_avg=False, fast_avg=False)
print(model)
outputs = model(data)
print(outputs.shape)
assert outputs.shape == (1, 512, 56, 56)
    # with downsampling, avg
stride = 2
down_sample = nn.Sequential(
nn.Conv2d(in_planes, out_planes * expansion, kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(out_planes * expansion),
)
model = Bottleneck(in_planes, out_planes, stride, down_sample, use_avg=True, fast_avg=False)
print(model)
outputs = model(data)
print(outputs.shape)
assert outputs.shape == (1, 512, 28, 28)
# 32x4d_fast_avg
    # with downsampling
stride = 2
down_sample = nn.Sequential(
nn.Conv2d(in_planes, out_planes * expansion, kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(out_planes * expansion),
)
model = Bottleneck(in_planes, out_planes, stride, down_sample, 32, 4, use_avg=True, fast_avg=True)
print(model)
outputs = model(data)
print(outputs.shape)
assert outputs.shape == (1, 512, 28, 28)
if __name__ == '__main__':
print('*' * 10 + ' bottleneck')
test_bottleneck()
print('*' * 10 + ' se bottleneck')
test_attention_bottleneck(attention_type='SqueezeAndExcitationBlock2D')
print('*' * 10 + ' nl bottleneck')
test_attention_bottleneck(attention_type='NonLocal2DEmbeddedGaussian')
print('*' * 10 + ' snl bottleneck')
test_attention_bottleneck(attention_type='SimplifiedNonLocal2DEmbeddedGaussian')
print('*' * 10 + ' gc bottleneck')
test_attention_bottleneck(attention_type='GlobalContextBlock2D')
print('*' * 10 + ' avg bottleneck')
test_avg_bottleneck()
# ---------------------------------------------------------------------------
# File: dfirtrack_config/tests/artifact/test_artifact_exporter_spreadsheet_xls_config_views.py
# Repo: thomas-kropeit/dfirtrack (Apache-2.0)
# ---------------------------------------------------------------------------
import urllib.parse
from django.contrib.auth.models import User
from django.contrib.messages import get_messages
from django.test import TestCase
from dfirtrack_artifacts.models import Artifactstatus
from dfirtrack_config.models import ArtifactExporterSpreadsheetXlsConfigModel
class ArtifactExporterSpreadsheetXlsConfigViewTestCase(TestCase):
"""artifact exporter spreadsheet XLS config view tests"""
@classmethod
def setUpTestData(cls):
# create user
User.objects.create_user(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# create objects
Artifactstatus.objects.create(
artifactstatus_name='artifactstatus_1',
artifactstatus_slug='artifactstatus_1',
)
Artifactstatus.objects.create(
artifactstatus_name='artifactstatus_2',
artifactstatus_slug='artifactstatus_2',
)
def test_artifact_exporter_spreadsheet_xls_config_not_logged_in(self):
"""test exporter view"""
# create url
destination = '/login/?next=' + urllib.parse.quote(
'/config/artifact/exporter/spreadsheet/xls/', safe=''
)
# get response
response = self.client.get(
'/config/artifact/exporter/spreadsheet/xls/', follow=True
)
# compare
self.assertRedirects(
response, destination, status_code=302, target_status_code=200
)
def test_artifact_exporter_spreadsheet_xls_config_logged_in(self):
"""test view"""
# login testuser
self.client.login(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# get response
response = self.client.get('/config/artifact/exporter/spreadsheet/xls/')
# compare
self.assertEqual(response.status_code, 200)
def test_artifact_exporter_spreadsheet_xls_config_template(self):
"""test exporter view"""
# login testuser
self.client.login(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# get response
response = self.client.get('/config/artifact/exporter/spreadsheet/xls/')
# compare
self.assertTemplateUsed(
response,
'dfirtrack_config/artifact/artifact_exporter_spreadsheet_xls_config_popup.html',
)
def test_artifact_exporter_spreadsheet_xls_config_get_user_context(self):
"""test exporter view"""
# login testuser
self.client.login(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# get response
response = self.client.get('/config/artifact/exporter/spreadsheet/xls/')
# compare
self.assertEqual(
str(response.context['user']),
'testuser_artifact_exporter_spreadsheet_xls_config',
)
def test_artifact_exporter_spreadsheet_xls_config_redirect(self):
"""test view"""
# login testuser
self.client.login(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# create url
destination = urllib.parse.quote(
'/config/artifact/exporter/spreadsheet/xls/', safe='/'
)
# get response
response = self.client.get(
'/config/artifact/exporter/spreadsheet/xls', follow=True
)
# compare
self.assertRedirects(
response, destination, status_code=301, target_status_code=200
)
def test_artifact_exporter_spreadsheet_xls_config_post_message(self):
"""test view"""
# login testuser
self.client.login(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# get objects
artifactstatus_1 = Artifactstatus.objects.get(
artifactstatus_name='artifactstatus_1'
).artifactstatus_id
artifactstatus_2 = Artifactstatus.objects.get(
artifactstatus_name='artifactstatus_2'
).artifactstatus_id
# create post data
data_dict = {
'artifactlist_xls_choice_artifactstatus': [
str(artifactstatus_1),
str(artifactstatus_2),
],
}
# get response
response = self.client.post(
'/config/artifact/exporter/spreadsheet/xls/', data_dict
)
# get messages
messages = list(get_messages(response.wsgi_request))
# compare
self.assertEqual(
str(messages[-1]), 'Artifact exporter spreadsheet XLS config changed'
)
def test_artifact_exporter_spreadsheet_xls_config_post_redirect(self):
"""test view"""
# login testuser
self.client.login(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# get objects
artifactstatus_1 = Artifactstatus.objects.get(
artifactstatus_name='artifactstatus_1'
).artifactstatus_id
artifactstatus_2 = Artifactstatus.objects.get(
artifactstatus_name='artifactstatus_2'
).artifactstatus_id
# create post data
data_dict = {
'artifactlist_xls_choice_artifactstatus': [
str(artifactstatus_1),
str(artifactstatus_2),
],
}
# get response
response = self.client.post(
'/config/artifact/exporter/spreadsheet/xls/', data_dict
)
# compare
self.assertEqual(response.status_code, 200)
def test_artifact_exporter_spreadsheet_xls_config_post_artifact_id_false(self):
"""test view"""
# login testuser
self.client.login(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# get objects
artifactstatus_1 = Artifactstatus.objects.get(
artifactstatus_name='artifactstatus_1'
).artifactstatus_id
artifactstatus_2 = Artifactstatus.objects.get(
artifactstatus_name='artifactstatus_2'
).artifactstatus_id
# create post data
data_dict = {
'artifactlist_xls_choice_artifactstatus': [
str(artifactstatus_1),
str(artifactstatus_2),
],
}
# get response
self.client.post('/config/artifact/exporter/spreadsheet/xls/', data_dict)
# get object
artifact_exporter_spreadsheet_xls_config_model = ArtifactExporterSpreadsheetXlsConfigModel.objects.get(
artifact_exporter_spreadsheet_xls_config_name='ArtifactExporterSpreadsheetXlsConfig'
)
# compare
self.assertFalse(
artifact_exporter_spreadsheet_xls_config_model.artifactlist_xls_artifact_id
)
def test_artifact_exporter_spreadsheet_xls_config_post_artifact_id_true(self):
"""test view"""
# login testuser
self.client.login(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# get objects
artifactstatus_1 = Artifactstatus.objects.get(
artifactstatus_name='artifactstatus_1'
).artifactstatus_id
artifactstatus_2 = Artifactstatus.objects.get(
artifactstatus_name='artifactstatus_2'
).artifactstatus_id
# create post data
data_dict = {
'artifactlist_xls_choice_artifactstatus': [
str(artifactstatus_1),
str(artifactstatus_2),
],
'artifactlist_xls_artifact_id': 'on',
}
# get response
self.client.post('/config/artifact/exporter/spreadsheet/xls/', data_dict)
# get object
artifact_exporter_spreadsheet_xls_config_model = ArtifactExporterSpreadsheetXlsConfigModel.objects.get(
artifact_exporter_spreadsheet_xls_config_name='ArtifactExporterSpreadsheetXlsConfig'
)
# compare
self.assertTrue(
artifact_exporter_spreadsheet_xls_config_model.artifactlist_xls_artifact_id
)
def test_artifact_exporter_spreadsheet_xls_config_post_invalid_reload(self):
"""test view"""
# login testuser
self.client.login(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# create post data
data_dict = {}
# get response
response = self.client.post(
'/config/artifact/exporter/spreadsheet/xls/', data_dict
)
# compare
self.assertEqual(response.status_code, 200)
def test_artifact_exporter_spreadsheet_xls_config_post_invalid_template(self):
"""test view"""
# login testuser
self.client.login(
username='testuser_artifact_exporter_spreadsheet_xls_config',
password='i3jLLnbrAEgel24sGs9i',
)
# create post data
data_dict = {}
# get response
response = self.client.post(
'/config/artifact/exporter/spreadsheet/xls/', data_dict
)
# compare
self.assertTemplateUsed(
response,
'dfirtrack_config/artifact/artifact_exporter_spreadsheet_xls_config_popup.html',
)
# ---------------------------------------------------------------------------
# File: test/option/taskmastertrace.py
# Repo: Valkatraz/scons (MIT)
# ---------------------------------------------------------------------------
#!/usr/bin/env python
] | 281 | 2017-12-01T23:48:38.000Z | 2022-03-31T15:25:44.000Z | #!/usr/bin/env python
#
# __COPYRIGHT__
#
# Permission is hereby granted, free of charge, to any person obtaining
# a copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to
# the following conditions:
#
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY
# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
# OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
# WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
__revision__ = "__FILE__ __REVISION__ __DATE__ __DEVELOPER__"
"""
Simple tests of the --taskmastertrace= option.
"""
import TestSCons
test = TestSCons.TestSCons()
test.write('SConstruct', """
DefaultEnvironment(tools=[])
env = Environment(tools=[])
# We name the files 'Tfile' so that they will sort after the SConstruct
# file regardless of whether the test is being run on a case-sensitive
# or case-insensitive system.
env.Command('Tfile.out', 'Tfile.mid', Copy('$TARGET', '$SOURCE'))
env.Command('Tfile.mid', 'Tfile.in', Copy('$TARGET', '$SOURCE'))
""")
test.write('Tfile.in', "Tfile.in\n")
expect_stdout = test.wrap_stdout("""\
Taskmaster: Looking for a node to evaluate
Taskmaster: Considering node <no_state 0 '.'> and its children:
Taskmaster: <no_state 0 'SConstruct'>
Taskmaster: <no_state 0 'Tfile.in'>
Taskmaster: <no_state 0 'Tfile.mid'>
Taskmaster: <no_state 0 'Tfile.out'>
Taskmaster: adjusted ref count: <pending 1 '.'>, child 'SConstruct'
Taskmaster: adjusted ref count: <pending 2 '.'>, child 'Tfile.in'
Taskmaster: adjusted ref count: <pending 3 '.'>, child 'Tfile.mid'
Taskmaster: adjusted ref count: <pending 4 '.'>, child 'Tfile.out'
Taskmaster: Considering node <no_state 0 'SConstruct'> and its children:
Taskmaster: Evaluating <pending 0 'SConstruct'>
Task.make_ready_current(): node <pending 0 'SConstruct'>
Task.prepare(): node <up_to_date 0 'SConstruct'>
Task.executed_with_callbacks(): node <up_to_date 0 'SConstruct'>
Task.postprocess(): node <up_to_date 0 'SConstruct'>
Task.postprocess(): removing <up_to_date 0 'SConstruct'>
Task.postprocess(): adjusted parent ref count <pending 3 '.'>
Taskmaster: Looking for a node to evaluate
Taskmaster: Considering node <no_state 0 'Tfile.in'> and its children:
Taskmaster: Evaluating <pending 0 'Tfile.in'>
Task.make_ready_current(): node <pending 0 'Tfile.in'>
Task.prepare(): node <up_to_date 0 'Tfile.in'>
Task.executed_with_callbacks(): node <up_to_date 0 'Tfile.in'>
Task.postprocess(): node <up_to_date 0 'Tfile.in'>
Task.postprocess(): removing <up_to_date 0 'Tfile.in'>
Task.postprocess(): adjusted parent ref count <pending 2 '.'>
Taskmaster: Looking for a node to evaluate
Taskmaster: Considering node <no_state 0 'Tfile.mid'> and its children:
Taskmaster: <up_to_date 0 'Tfile.in'>
Taskmaster: Evaluating <pending 0 'Tfile.mid'>
Task.make_ready_current(): node <pending 0 'Tfile.mid'>
Task.prepare(): node <executing 0 'Tfile.mid'>
Task.execute(): node <executing 0 'Tfile.mid'>
Copy("Tfile.mid", "Tfile.in")
Task.executed_with_callbacks(): node <executing 0 'Tfile.mid'>
Task.postprocess(): node <executed 0 'Tfile.mid'>
Task.postprocess(): removing <executed 0 'Tfile.mid'>
Task.postprocess(): adjusted parent ref count <pending 1 '.'>
Taskmaster: Looking for a node to evaluate
Taskmaster: Considering node <no_state 0 'Tfile.out'> and its children:
Taskmaster: <executed 0 'Tfile.mid'>
Taskmaster: Evaluating <pending 0 'Tfile.out'>
Task.make_ready_current(): node <pending 0 'Tfile.out'>
Task.prepare(): node <executing 0 'Tfile.out'>
Task.execute(): node <executing 0 'Tfile.out'>
Copy("Tfile.out", "Tfile.mid")
Task.executed_with_callbacks(): node <executing 0 'Tfile.out'>
Task.postprocess(): node <executed 0 'Tfile.out'>
Task.postprocess(): removing <executed 0 'Tfile.out'>
Task.postprocess(): adjusted parent ref count <pending 0 '.'>
Taskmaster: Looking for a node to evaluate
Taskmaster: Considering node <pending 0 '.'> and its children:
Taskmaster: <up_to_date 0 'SConstruct'>
Taskmaster: <up_to_date 0 'Tfile.in'>
Taskmaster: <executed 0 'Tfile.mid'>
Taskmaster: <executed 0 'Tfile.out'>
Taskmaster: Evaluating <pending 0 '.'>
Task.make_ready_current(): node <pending 0 '.'>
Task.prepare(): node <executing 0 '.'>
Task.execute(): node <executing 0 '.'>
Task.executed_with_callbacks(): node <executing 0 '.'>
Task.postprocess(): node <executed 0 '.'>
Taskmaster: Looking for a node to evaluate
Taskmaster: No candidate anymore.
""")
test.run(arguments='--taskmastertrace=- .', stdout=expect_stdout)
test.run(arguments='-c .')
expect_stdout = test.wrap_stdout("""\
Copy("Tfile.mid", "Tfile.in")
Copy("Tfile.out", "Tfile.mid")
""")
test.run(arguments='--taskmastertrace=trace.out .', stdout=expect_stdout)
expect_trace = """\
Taskmaster: Looking for a node to evaluate
Taskmaster: Considering node <no_state 0 '.'> and its children:
Taskmaster: <no_state 0 'SConstruct'>
Taskmaster: <no_state 0 'Tfile.in'>
Taskmaster: <no_state 0 'Tfile.mid'>
Taskmaster: <no_state 0 'Tfile.out'>
Taskmaster: adjusted ref count: <pending 1 '.'>, child 'SConstruct'
Taskmaster: adjusted ref count: <pending 2 '.'>, child 'Tfile.in'
Taskmaster: adjusted ref count: <pending 3 '.'>, child 'Tfile.mid'
Taskmaster: adjusted ref count: <pending 4 '.'>, child 'Tfile.out'
Taskmaster: Considering node <no_state 0 'SConstruct'> and its children:
Taskmaster: Evaluating <pending 0 'SConstruct'>
Task.make_ready_current(): node <pending 0 'SConstruct'>
Task.prepare(): node <up_to_date 0 'SConstruct'>
Task.executed_with_callbacks(): node <up_to_date 0 'SConstruct'>
Task.postprocess(): node <up_to_date 0 'SConstruct'>
Task.postprocess(): removing <up_to_date 0 'SConstruct'>
Task.postprocess(): adjusted parent ref count <pending 3 '.'>
Taskmaster: Looking for a node to evaluate
Taskmaster: Considering node <no_state 0 'Tfile.in'> and its children:
Taskmaster: Evaluating <pending 0 'Tfile.in'>
Task.make_ready_current(): node <pending 0 'Tfile.in'>
Task.prepare(): node <up_to_date 0 'Tfile.in'>
Task.executed_with_callbacks(): node <up_to_date 0 'Tfile.in'>
Task.postprocess(): node <up_to_date 0 'Tfile.in'>
Task.postprocess(): removing <up_to_date 0 'Tfile.in'>
Task.postprocess(): adjusted parent ref count <pending 2 '.'>
Taskmaster: Looking for a node to evaluate
Taskmaster: Considering node <no_state 0 'Tfile.mid'> and its children:
Taskmaster: <up_to_date 0 'Tfile.in'>
Taskmaster: Evaluating <pending 0 'Tfile.mid'>
Task.make_ready_current(): node <pending 0 'Tfile.mid'>
Task.prepare(): node <executing 0 'Tfile.mid'>
Task.execute(): node <executing 0 'Tfile.mid'>
Task.executed_with_callbacks(): node <executing 0 'Tfile.mid'>
Task.postprocess(): node <executed 0 'Tfile.mid'>
Task.postprocess(): removing <executed 0 'Tfile.mid'>
Task.postprocess(): adjusted parent ref count <pending 1 '.'>
Taskmaster: Looking for a node to evaluate
Taskmaster: Considering node <no_state 0 'Tfile.out'> and its children:
Taskmaster: <executed 0 'Tfile.mid'>
Taskmaster: Evaluating <pending 0 'Tfile.out'>
Task.make_ready_current(): node <pending 0 'Tfile.out'>
Task.prepare(): node <executing 0 'Tfile.out'>
Task.execute(): node <executing 0 'Tfile.out'>
Task.executed_with_callbacks(): node <executing 0 'Tfile.out'>
Task.postprocess(): node <executed 0 'Tfile.out'>
Task.postprocess(): removing <executed 0 'Tfile.out'>
Task.postprocess(): adjusted parent ref count <pending 0 '.'>
Taskmaster: Looking for a node to evaluate
Taskmaster: Considering node <pending 0 '.'> and its children:
Taskmaster: <up_to_date 0 'SConstruct'>
Taskmaster: <up_to_date 0 'Tfile.in'>
Taskmaster: <executed 0 'Tfile.mid'>
Taskmaster: <executed 0 'Tfile.out'>
Taskmaster: Evaluating <pending 0 '.'>
Task.make_ready_current(): node <pending 0 '.'>
Task.prepare(): node <executing 0 '.'>
Task.execute(): node <executing 0 '.'>
Task.executed_with_callbacks(): node <executing 0 '.'>
Task.postprocess(): node <executed 0 '.'>
Taskmaster: Looking for a node to evaluate
Taskmaster: No candidate anymore.
"""
test.must_match('trace.out', expect_trace, mode='r')
test.pass_test()
# Local Variables:
# tab-width:4
# indent-tabs-mode:nil
# End:
# vim: set expandtab tabstop=4 shiftwidth=4:
# ---------------------------------------------------------------------------
# File: forte/data/readers/__init__.py
# Repo: huzecong/forte (Apache-2.0)
# ---------------------------------------------------------------------------
from forte.data.readers.base_reader import *
from forte.data.readers.conll03_reader import *
from forte.data.readers.conllu_ud_reader import *
from forte.data.readers.ontonotes_reader import *
from forte.data.readers.plainsentence_txtgen_reader import *
from forte.data.readers.plaintext_reader import *
from forte.data.readers.prodigy_reader import *
from forte.data.readers.string_reader import *
# ---------------------------------------------------------------------------
# File: pysubgroup/tests/test_representations.py
# Repo: rovany706/pysubgroup (Apache-2.0)
# ---------------------------------------------------------------------------
import unittest
import numpy as np
import pandas as pd
import pysubgroup as ps
class TestRepresentation(unittest.TestCase):
def setUp(self):
self.A = np.array([0, 0, 1, 1, 0, 0, 1, 1, 1, 1], dtype=bool)
self.A1 = ps.EqualitySelector("columnA", True)
self.A0 = ps.EqualitySelector("columnA", False)
self.B = np.array(["A", "B", "C", "C", "B", "A", "D", "A", "A", "A"])
self.BA = ps.EqualitySelector("columnB", "A")
self.BC = ps.EqualitySelector("columnB", "C")
self.C = np.array([np.nan, np.nan, 1.1, 1.1, 2, 2, 2, 2, 2, 2])
self.CA = ps.EqualitySelector("columnC", 1.1)
self.CNan = ps.EqualitySelector("columnC", np.nan)
self.df = pd.DataFrame.from_dict({"columnA": self.A, "columnB": self.B, "columnC": self.C})
def test_BitSet(self):
with ps.BitSetRepresentation(self.df, [self.A1, self.A0, self.BA, self.BC, self.CA, self.CNan]) as representation:
np.testing.assert_array_equal(self.A1.representation, self.A) # pylint: disable=no-member
np.testing.assert_array_equal(self.A0.representation, np.logical_not(self.A)) # pylint: disable=no-member
np.testing.assert_array_equal(self.BA.representation, [1, 0, 0, 0, 0, 1, 0, 1, 1, 1]) # pylint: disable=no-member
np.testing.assert_array_equal(self.BC.representation, [0, 0, 1, 1, 0, 0, 0, 0, 0, 0]) # pylint: disable=no-member
np.testing.assert_array_equal(self.CA.representation, [0, 0, 1, 1, 0, 0, 0, 0, 0, 0]) # pylint: disable=no-member
np.testing.assert_array_equal(self.CNan.representation, [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]) # pylint: disable=no-member
np.testing.assert_array_equal(representation.Conjunction([self.BA, self.CNan]).representation, [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]) # pylint: disable=no-member
np.testing.assert_array_equal(representation.Disjunction([self.BA, self.BC]).representation, [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]) # pylint: disable=no-member
def test_Set(self):
with ps.SetRepresentation(self.df, [self.A1, self.A0, self.BA, self.BC, self.CA, self.CNan]) as representation:
self.assertEqual(self.A1.representation, {2, 3, 6, 7, 8, 9}) # pylint: disable=no-member
self.assertEqual(self.A0.representation, {0, 1, 4, 5}) # pylint: disable=no-member
self.assertEqual(self.BA.representation, {0, 5, 7, 8, 9}) # pylint: disable=no-member
self.assertEqual(self.BC.representation, {2, 3}) # pylint: disable=no-member
self.assertEqual(self.CA.representation, {2, 3}) # pylint: disable=no-member
self.assertEqual(self.CNan.representation, {0, 1}) # pylint: disable=no-member
self.assertEqual(representation.Conjunction([self.BA, self.CNan]).representation, {0}) # pylint: disable=no-member
self.assertEqual(representation.Conjunction([self.A0, self.CNan]).representation, {0, 1}) # pylint: disable=no-member
def test_NumpySet(self):
with ps.NumpySetRepresentation(self.df, [self.A1, self.A0, self.BA, self.BC, self.CA, self.CNan]) as representation:
np.testing.assert_array_equal(self.A1.representation, [2, 3, 6, 7, 8, 9]) # pylint: disable=no-member
np.testing.assert_array_equal(self.A0.representation, [0, 1, 4, 5]) # pylint: disable=no-member
np.testing.assert_array_equal(self.BA.representation, [0, 5, 7, 8, 9]) # pylint: disable=no-member
np.testing.assert_array_equal(self.BC.representation, [2, 3]) # pylint: disable=no-member
np.testing.assert_array_equal(self.CA.representation, [2, 3]) # pylint: disable=no-member
np.testing.assert_array_equal(self.CNan.representation, [0, 1]) # pylint: disable=no-member
np.testing.assert_array_equal(representation.Conjunction([self.BA, self.CNan]).representation, [0]) # pylint: disable=no-member
np.testing.assert_array_equal(representation.Conjunction([self.A0, self.CNan]).representation, [0, 1]) # pylint: disable=no-member
if __name__ == '__main__':
unittest.main()
# ---------------------------------------------------------------------------
# File: orquesta/tests/unit/composition/mistral/test_workflow_basic.py
# Repo: batk0/orquesta (Apache-2.0)
# ---------------------------------------------------------------------------
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from orquesta.tests.unit.composition.mistral import base
class BasicWorkflowComposerTest(base.MistralWorkflowComposerTest):
def test_sequential(self):
wf_name = 'sequential'
expected_wf_graph = {
'directed': True,
'graph': {},
'nodes': [
{
'id': 'task1'
},
{
'id': 'task2'
},
{
'id': 'task3'
}
],
'adjacency': [
[
{
'id': 'task2',
'key': 0,
'criteria': self.compose_seq_expr(
'task1',
condition='on-success'
)
}
],
[
{
'id': 'task3',
'key': 0,
'criteria': self.compose_seq_expr(
'task2',
condition='on-success'
)
}
],
[]
],
'multigraph': True
}
self.assert_compose_to_wf_graph(wf_name, expected_wf_graph)
expected_wf_ex_graph = {
'directed': True,
'graph': {},
'nodes': [
{
'id': 'task1',
'name': 'task1'
},
{
'id': 'task2',
'name': 'task2'
},
{
'id': 'task3',
'name': 'task3'
}
],
'adjacency': [
[
{
'id': 'task2',
'key': 0,
'criteria': self.compose_seq_expr(
'task1',
condition='on-success'
)
}
],
[
{
'id': 'task3',
'key': 0,
'criteria': self.compose_seq_expr(
'task2',
condition='on-success'
)
}
],
[]
],
'multigraph': True
}
self.assert_compose_to_wf_ex_graph(wf_name, expected_wf_ex_graph)
def test_parallel(self):
wf_name = 'parallel'
expected_wf_graph = {
'directed': True,
'graph': {},
'nodes': [
{
'id': 'task1'
},
{
'id': 'task2'
},
{
'id': 'task3'
},
{
'id': 'task4'
},
{
'id': 'task5'
},
{
'id': 'task6'
}
],
'adjacency': [
[
{
'id': 'task2',
'key': 0,
'criteria': self.compose_seq_expr(
'task1',
condition='on-success'
)
}
],
[
{
'id': 'task3',
'key': 0,
'criteria': self.compose_seq_expr(
'task2',
condition='on-success'
)
}
],
[],
[
{
'id': 'task5',
'key': 0,
'criteria': self.compose_seq_expr(
'task4',
condition='on-success'
)
}
],
[
{
'id': 'task6',
'key': 0,
'criteria': self.compose_seq_expr(
'task5',
condition='on-success'
)
}
],
[]
],
'multigraph': True
}
self.assert_compose_to_wf_graph(wf_name, expected_wf_graph)
expected_wf_ex_graph = {
'directed': True,
'graph': {},
'nodes': [
{
'id': 'task1',
'name': 'task1'
},
{
'id': 'task2',
'name': 'task2'
},
{
'id': 'task3',
'name': 'task3'
},
{
'id': 'task4',
'name': 'task4'
},
{
'id': 'task5',
'name': 'task5'
},
{
'id': 'task6',
'name': 'task6'
}
],
'adjacency': [
[
{
'id': 'task2',
'key': 0,
'criteria': self.compose_seq_expr(
'task1',
condition='on-success'
)
}
],
[
{
'id': 'task3',
'key': 0,
'criteria': self.compose_seq_expr(
'task2',
condition='on-success'
)
}
],
[],
[
{
'id': 'task5',
'key': 0,
'criteria': self.compose_seq_expr(
'task4',
condition='on-success'
)
}
],
[
{
'id': 'task6',
'key': 0,
'criteria': self.compose_seq_expr(
'task5',
condition='on-success'
)
}
],
[]
],
'multigraph': True
}
self.assert_compose_to_wf_ex_graph(wf_name, expected_wf_ex_graph)
def test_branching(self):
wf_name = 'branching'
expected_wf_graph = {
'directed': True,
'graph': {},
'nodes': [
{
'id': 'task1'
},
{
'id': 'task2'
},
{
'id': 'task3'
},
{
'id': 'task4'
},
{
'id': 'task5'
}
],
'adjacency': [
[
{
'id': 'task2',
'key': 0,
'criteria': self.compose_seq_expr(
'task1',
condition='on-success'
)
},
{
'id': 'task4',
'key': 0,
'criteria': self.compose_seq_expr(
'task1',
condition='on-success'
)
}
],
[
{
'id': 'task3',
'key': 0,
'criteria': self.compose_seq_expr(
'task2',
condition='on-success'
)
}
],
[],
[
{
'id': 'task5',
'key': 0,
'criteria': self.compose_seq_expr(
'task4',
condition='on-success'
)
}
],
[]
],
'multigraph': True
}
self.assert_compose_to_wf_graph(wf_name, expected_wf_graph)
expected_wf_ex_graph = {
'directed': True,
'graph': {},
'nodes': [
{
'id': 'task1',
'name': 'task1'
},
{
'id': 'task2',
'name': 'task2'
},
{
'id': 'task3',
'name': 'task3'
},
{
'id': 'task4',
'name': 'task4'
},
{
'id': 'task5',
'name': 'task5'
}
],
'adjacency': [
[
{
'id': 'task2',
'key': 0,
'criteria': self.compose_seq_expr(
'task1',
condition='on-success'
)
},
{
'id': 'task4',
'key': 0,
'criteria': self.compose_seq_expr(
'task1',
condition='on-success'
)
}
],
[
{
'id': 'task3',
'key': 0,
'criteria': self.compose_seq_expr(
'task2',
condition='on-success'
)
}
],
[],
[
{
'id': 'task5',
'key': 0,
'criteria': self.compose_seq_expr(
'task4',
condition='on-success'
)
}
],
[]
],
'multigraph': True
}
self.assert_compose_to_wf_ex_graph(wf_name, expected_wf_ex_graph)
def test_decision_tree(self):
wf_name = 'decision'
expected_wf_graph = {
'directed': True,
'graph': {},
'nodes': [
{
'id': 't1'
},
{
'id': 'a'
},
{
'id': 'b'
},
{
'id': 'c'
}
],
'adjacency': [
[
{
'id': 'a',
'key': 0,
'criteria': self.compose_seq_expr(
't1',
condition='on-success',
expr="<% ctx().which = 'a' %>"
)
},
{
'id': 'b',
'key': 0,
'criteria': self.compose_seq_expr(
't1',
condition='on-success',
expr="<% ctx().which = 'b' %>"
)
},
{
'id': 'c',
'key': 0,
'criteria': self.compose_seq_expr(
't1',
condition='on-success',
expr="<% not ctx().which in list(a, b) %>"
)
}
],
[],
[],
[]
],
'multigraph': True
}
self.assert_compose_to_wf_graph(wf_name, expected_wf_graph)
expected_wf_ex_graph = {
'directed': True,
'graph': {},
'nodes': [
{
'id': 't1',
'name': 't1'
},
{
'id': 'a',
'name': 'a'
},
{
'id': 'b',
'name': 'b'
},
{
'id': 'c',
'name': 'c'
}
],
'adjacency': [
[
{
'id': 'a',
'key': 0,
'criteria': self.compose_seq_expr(
't1',
condition='on-success',
expr="<% ctx().which = 'a' %>"
)
},
{
'id': 'b',
'key': 0,
'criteria': self.compose_seq_expr(
't1',
condition='on-success',
expr="<% ctx().which = 'b' %>"
)
},
{
'id': 'c',
'key': 0,
'criteria': self.compose_seq_expr(
't1',
condition='on-success',
expr="<% not ctx().which in list(a, b) %>"
)
}
],
[],
[],
[]
],
'multigraph': True
}
self.assert_compose_to_wf_ex_graph(wf_name, expected_wf_ex_graph)
| 29.31094 | 74 | 0.240849 | 785 | 15,271 | 4.500637 | 0.138854 | 0.029437 | 0.08831 | 0.117747 | 0.802717 | 0.802717 | 0.802717 | 0.801585 | 0.790546 | 0.790546 | 0 | 0.02314 | 0.657586 | 15,271 | 520 | 75 | 29.367308 | 0.652515 | 0.034117 | 0 | 0.590535 | 0 | 0 | 0.113855 | 0 | 0 | 0 | 0 | 0 | 0.016461 | 1 | 0.00823 | false | 0 | 0.002058 | 0 | 0.012346 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5da6936f8319312257e8a8eddb90380d2f9c81b9 | 29,929 | py | Python | sdk/python/pulumi_azure/appservice/app_service.py | kenny-wealth/pulumi-azure | e57e3a81f95bf622e7429c53f0bff93e33372aa1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure/appservice/app_service.py | kenny-wealth/pulumi-azure | e57e3a81f95bf622e7429c53f0bff93e33372aa1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure/appservice/app_service.py | kenny-wealth/pulumi-azure | e57e3a81f95bf622e7429c53f0bff93e33372aa1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import json
import warnings
import pulumi
import pulumi.runtime
from typing import Union
from .. import utilities, tables
class AppService(pulumi.CustomResource):
app_service_plan_id: pulumi.Output[str]
"""
The ID of the App Service Plan within which to create this App Service.
"""
app_settings: pulumi.Output[dict]
"""
A key-value pair of App Settings.
"""
auth_settings: pulumi.Output[dict]
"""
A `auth_settings` block as defined below.
* `activeDirectory` (`dict`)
* `allowedAudiences` (`list`)
* `client_id` (`str`)
* `client_secret` (`str`)
* `additionalLoginParams` (`dict`)
* `allowedExternalRedirectUrls` (`list`)
* `defaultProvider` (`str`)
* `enabled` (`bool`) - Is the App Service Enabled?
* `facebook` (`dict`)
* `app_id` (`str`)
* `appSecret` (`str`)
* `oauthScopes` (`list`)
* `google` (`dict`)
* `client_id` (`str`)
* `client_secret` (`str`)
* `oauthScopes` (`list`)
* `issuer` (`str`)
* `microsoft` (`dict`)
* `client_id` (`str`)
* `client_secret` (`str`)
* `oauthScopes` (`list`)
* `runtimeVersion` (`str`)
* `tokenRefreshExtensionHours` (`float`)
* `tokenStoreEnabled` (`bool`)
* `twitter` (`dict`)
* `consumerKey` (`str`)
* `consumerSecret` (`str`)
* `unauthenticatedClientAction` (`str`)
"""
backup: pulumi.Output[dict]
client_affinity_enabled: pulumi.Output[bool]
"""
Should the App Service send session affinity cookies, which route client requests in the same session to the same instance?
"""
client_cert_enabled: pulumi.Output[bool]
"""
Does the App Service require client certificates for incoming requests? Defaults to `false`.
"""
connection_strings: pulumi.Output[list]
"""
One or more `connection_string` blocks as defined below.
* `name` (`str`) - Specifies the name of the App Service. Changing this forces a new resource to be created.
* `type` (`str`)
* `value` (`str`)
"""
default_site_hostname: pulumi.Output[str]
"""
The Default Hostname associated with the App Service - such as `mysite.azurewebsites.net`
"""
enabled: pulumi.Output[bool]
"""
Is the App Service Enabled?
"""
https_only: pulumi.Output[bool]
"""
Can the App Service only be accessed via HTTPS? Defaults to `false`.
"""
identity: pulumi.Output[dict]
"""
A Managed Service Identity block as defined below.
* `identityIds` (`list`)
* `principalId` (`str`) - The Principal ID for the Service Principal associated with the Managed Service Identity of this App Service.
* `tenantId` (`str`) - The Tenant ID for the Service Principal associated with the Managed Service Identity of this App Service.
* `type` (`str`)
"""
location: pulumi.Output[str]
"""
Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created.
"""
logs: pulumi.Output[dict]
"""
A `logs` block as defined below.
* `applicationLogs` (`dict`)
* `azureBlobStorage` (`dict`)
* `level` (`str`)
* `retentionInDays` (`float`)
* `sasUrl` (`str`)
* `httpLogs` (`dict`)
* `azureBlobStorage` (`dict`)
* `retentionInDays` (`float`)
* `sasUrl` (`str`)
* `fileSystem` (`dict`)
* `retentionInDays` (`float`)
* `retentionInMb` (`float`)
"""
name: pulumi.Output[str]
"""
Specifies the name of the App Service. Changing this forces a new resource to be created.
"""
outbound_ip_addresses: pulumi.Output[str]
"""
A comma separated list of outbound IP addresses - such as `52.23.25.3,52.143.43.12`
"""
possible_outbound_ip_addresses: pulumi.Output[str]
"""
A comma separated list of outbound IP addresses - such as `52.23.25.3,52.143.43.12,52.143.43.17` - not all of which are necessarily in use. Superset of `outbound_ip_addresses`.
"""
resource_group_name: pulumi.Output[str]
"""
The name of the resource group in which to create the App Service.
"""
site_config: pulumi.Output[dict]
"""
A `site_config` block as defined below.
* `alwaysOn` (`bool`)
* `appCommandLine` (`str`)
* `cors` (`dict`)
* `allowedOrigins` (`list`)
* `supportCredentials` (`bool`)
* `defaultDocuments` (`list`)
* `dotnetFrameworkVersion` (`str`)
* `ftpsState` (`str`)
* `http2Enabled` (`bool`)
* `ipRestrictions` (`list`)
* `ipAddress` (`str`)
* `subnetMask` (`str`)
* `virtualNetworkSubnetId` (`str`)
* `javaContainer` (`str`)
* `javaContainerVersion` (`str`)
* `javaVersion` (`str`)
* `linuxFxVersion` (`str`)
* `localMysqlEnabled` (`bool`)
* `managedPipelineMode` (`str`)
* `minTlsVersion` (`str`)
* `phpVersion` (`str`)
* `pythonVersion` (`str`)
* `remoteDebuggingEnabled` (`bool`)
* `remoteDebuggingVersion` (`str`)
* `scmType` (`str`)
* `use32BitWorkerProcess` (`bool`)
* `virtualNetworkName` (`str`)
* `websocketsEnabled` (`bool`)
* `windowsFxVersion` (`str`)
"""
site_credential: pulumi.Output[dict]
"""
A `site_credential` block as defined below, which contains the site-level credentials used to publish to this App Service.
* `password` (`str`) - The password associated with the username, which can be used to publish to this App Service.
* `username` (`str`) - The username which can be used to publish to this App Service
"""
source_control: pulumi.Output[dict]
"""
A `source_control` block as defined below, which contains the Source Control information when `scm_type` is set to `LocalGit`.
* `branch` (`str`) - Branch name of the Git repository for this App Service.
* `repoUrl` (`str`) - URL of the Git repository for this App Service.
"""
storage_accounts: pulumi.Output[list]
"""
One or more `storage_account` blocks as defined below.
* `accessKey` (`str`)
* `accountName` (`str`)
* `mountPath` (`str`)
* `name` (`str`) - Specifies the name of the App Service. Changing this forces a new resource to be created.
* `shareName` (`str`)
* `type` (`str`)
"""
tags: pulumi.Output[dict]
"""
A mapping of tags to assign to the resource.
"""
def __init__(__self__, resource_name, opts=None, app_service_plan_id=None, app_settings=None, auth_settings=None, backup=None, client_affinity_enabled=None, client_cert_enabled=None, connection_strings=None, enabled=None, https_only=None, identity=None, location=None, logs=None, name=None, resource_group_name=None, site_config=None, storage_accounts=None, tags=None, __props__=None, __name__=None, __opts__=None):
"""
Manages an App Service (within an App Service Plan).
> **Note:** When using Slots - the `app_settings`, `connection_string` and `site_config` blocks on the `appservice.AppService` resource will be overwritten when promoting a Slot using the `appservice.ActiveSlot` resource.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] app_service_plan_id: The ID of the App Service Plan within which to create this App Service.
:param pulumi.Input[dict] app_settings: A key-value pair of App Settings.
:param pulumi.Input[dict] auth_settings: A `auth_settings` block as defined below.
:param pulumi.Input[bool] client_affinity_enabled: Should the App Service send session affinity cookies, which route client requests in the same session to the same instance?
:param pulumi.Input[bool] client_cert_enabled: Does the App Service require client certificates for incoming requests? Defaults to `false`.
:param pulumi.Input[list] connection_strings: One or more `connection_string` blocks as defined below.
:param pulumi.Input[bool] enabled: Is the App Service Enabled?
:param pulumi.Input[bool] https_only: Can the App Service only be accessed via HTTPS? Defaults to `false`.
:param pulumi.Input[dict] identity: A Managed Service Identity block as defined below.
:param pulumi.Input[str] location: Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created.
:param pulumi.Input[dict] logs: A `logs` block as defined below.
:param pulumi.Input[str] name: Specifies the name of the App Service. Changing this forces a new resource to be created.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the App Service.
:param pulumi.Input[dict] site_config: A `site_config` block as defined below.
:param pulumi.Input[list] storage_accounts: One or more `storage_account` blocks as defined below.
:param pulumi.Input[dict] tags: A mapping of tags to assign to the resource.
The **auth_settings** object supports the following:
* `activeDirectory` (`pulumi.Input[dict]`)
* `allowedAudiences` (`pulumi.Input[list]`)
* `client_id` (`pulumi.Input[str]`)
* `client_secret` (`pulumi.Input[str]`)
* `additionalLoginParams` (`pulumi.Input[dict]`)
* `allowedExternalRedirectUrls` (`pulumi.Input[list]`)
* `defaultProvider` (`pulumi.Input[str]`)
* `enabled` (`pulumi.Input[bool]`) - Is the App Service Enabled?
* `facebook` (`pulumi.Input[dict]`)
* `app_id` (`pulumi.Input[str]`)
* `appSecret` (`pulumi.Input[str]`)
* `oauthScopes` (`pulumi.Input[list]`)
* `google` (`pulumi.Input[dict]`)
* `client_id` (`pulumi.Input[str]`)
* `client_secret` (`pulumi.Input[str]`)
* `oauthScopes` (`pulumi.Input[list]`)
* `issuer` (`pulumi.Input[str]`)
* `microsoft` (`pulumi.Input[dict]`)
* `client_id` (`pulumi.Input[str]`)
* `client_secret` (`pulumi.Input[str]`)
* `oauthScopes` (`pulumi.Input[list]`)
* `runtimeVersion` (`pulumi.Input[str]`)
* `tokenRefreshExtensionHours` (`pulumi.Input[float]`)
* `tokenStoreEnabled` (`pulumi.Input[bool]`)
* `twitter` (`pulumi.Input[dict]`)
* `consumerKey` (`pulumi.Input[str]`)
* `consumerSecret` (`pulumi.Input[str]`)
* `unauthenticatedClientAction` (`pulumi.Input[str]`)
The **backup** object supports the following:
* `enabled` (`pulumi.Input[bool]`) - Is the App Service Enabled?
* `name` (`pulumi.Input[str]`) - Specifies the name of the App Service. Changing this forces a new resource to be created.
* `schedule` (`pulumi.Input[dict]`)
* `frequencyInterval` (`pulumi.Input[float]`)
* `frequencyUnit` (`pulumi.Input[str]`)
* `keepAtLeastOneBackup` (`pulumi.Input[bool]`)
* `retentionPeriodInDays` (`pulumi.Input[float]`)
* `startTime` (`pulumi.Input[str]`)
* `storageAccountUrl` (`pulumi.Input[str]`)
The **connection_strings** object supports the following:
* `name` (`pulumi.Input[str]`) - Specifies the name of the App Service. Changing this forces a new resource to be created.
* `type` (`pulumi.Input[str]`)
* `value` (`pulumi.Input[str]`)
The **identity** object supports the following:
* `identityIds` (`pulumi.Input[list]`)
* `principalId` (`pulumi.Input[str]`) - The Principal ID for the Service Principal associated with the Managed Service Identity of this App Service.
* `tenantId` (`pulumi.Input[str]`) - The Tenant ID for the Service Principal associated with the Managed Service Identity of this App Service.
* `type` (`pulumi.Input[str]`)
The **logs** object supports the following:
* `applicationLogs` (`pulumi.Input[dict]`)
* `azureBlobStorage` (`pulumi.Input[dict]`)
* `level` (`pulumi.Input[str]`)
* `retentionInDays` (`pulumi.Input[float]`)
* `sasUrl` (`pulumi.Input[str]`)
* `httpLogs` (`pulumi.Input[dict]`)
* `azureBlobStorage` (`pulumi.Input[dict]`)
* `retentionInDays` (`pulumi.Input[float]`)
* `sasUrl` (`pulumi.Input[str]`)
* `fileSystem` (`pulumi.Input[dict]`)
* `retentionInDays` (`pulumi.Input[float]`)
* `retentionInMb` (`pulumi.Input[float]`)
The **site_config** object supports the following:
* `alwaysOn` (`pulumi.Input[bool]`)
* `appCommandLine` (`pulumi.Input[str]`)
* `cors` (`pulumi.Input[dict]`)
* `allowedOrigins` (`pulumi.Input[list]`)
* `supportCredentials` (`pulumi.Input[bool]`)
* `defaultDocuments` (`pulumi.Input[list]`)
* `dotnetFrameworkVersion` (`pulumi.Input[str]`)
* `ftpsState` (`pulumi.Input[str]`)
* `http2Enabled` (`pulumi.Input[bool]`)
* `ipRestrictions` (`pulumi.Input[list]`)
* `ipAddress` (`pulumi.Input[str]`)
* `subnetMask` (`pulumi.Input[str]`)
* `virtualNetworkSubnetId` (`pulumi.Input[str]`)
* `javaContainer` (`pulumi.Input[str]`)
* `javaContainerVersion` (`pulumi.Input[str]`)
* `javaVersion` (`pulumi.Input[str]`)
* `linuxFxVersion` (`pulumi.Input[str]`)
* `localMysqlEnabled` (`pulumi.Input[bool]`)
* `managedPipelineMode` (`pulumi.Input[str]`)
* `minTlsVersion` (`pulumi.Input[str]`)
* `phpVersion` (`pulumi.Input[str]`)
* `pythonVersion` (`pulumi.Input[str]`)
* `remoteDebuggingEnabled` (`pulumi.Input[bool]`)
* `remoteDebuggingVersion` (`pulumi.Input[str]`)
* `scmType` (`pulumi.Input[str]`)
* `use32BitWorkerProcess` (`pulumi.Input[bool]`)
* `virtualNetworkName` (`pulumi.Input[str]`)
* `websocketsEnabled` (`pulumi.Input[bool]`)
* `windowsFxVersion` (`pulumi.Input[str]`)
The **storage_accounts** object supports the following:
* `accessKey` (`pulumi.Input[str]`)
* `accountName` (`pulumi.Input[str]`)
* `mountPath` (`pulumi.Input[str]`)
* `name` (`pulumi.Input[str]`) - Specifies the name of the App Service. Changing this forces a new resource to be created.
* `shareName` (`pulumi.Input[str]`)
* `type` (`pulumi.Input[str]`)
> This content is derived from https://github.com/terraform-providers/terraform-provider-azurerm/blob/master/website/docs/r/app_service.html.markdown.
"""
if __name__ is not None:
warnings.warn("explicit use of __name__ is deprecated", DeprecationWarning)
resource_name = __name__
if __opts__ is not None:
warnings.warn("explicit use of __opts__ is deprecated, use 'opts' instead", DeprecationWarning)
opts = __opts__
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = dict()
if app_service_plan_id is None:
raise TypeError("Missing required property 'app_service_plan_id'")
__props__['app_service_plan_id'] = app_service_plan_id
__props__['app_settings'] = app_settings
__props__['auth_settings'] = auth_settings
__props__['backup'] = backup
__props__['client_affinity_enabled'] = client_affinity_enabled
__props__['client_cert_enabled'] = client_cert_enabled
__props__['connection_strings'] = connection_strings
__props__['enabled'] = enabled
__props__['https_only'] = https_only
__props__['identity'] = identity
__props__['location'] = location
__props__['logs'] = logs
__props__['name'] = name
if resource_group_name is None:
raise TypeError("Missing required property 'resource_group_name'")
__props__['resource_group_name'] = resource_group_name
__props__['site_config'] = site_config
__props__['storage_accounts'] = storage_accounts
__props__['tags'] = tags
__props__['default_site_hostname'] = None
__props__['outbound_ip_addresses'] = None
__props__['possible_outbound_ip_addresses'] = None
__props__['site_credential'] = None
__props__['source_control'] = None
super(AppService, __self__).__init__(
'azure:appservice/appService:AppService',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name, id, opts=None, app_service_plan_id=None, app_settings=None, auth_settings=None, backup=None, client_affinity_enabled=None, client_cert_enabled=None, connection_strings=None, default_site_hostname=None, enabled=None, https_only=None, identity=None, location=None, logs=None, name=None, outbound_ip_addresses=None, possible_outbound_ip_addresses=None, resource_group_name=None, site_config=None, site_credential=None, source_control=None, storage_accounts=None, tags=None):
"""
Get an existing AppService resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param str id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] app_service_plan_id: The ID of the App Service Plan within which to create this App Service.
:param pulumi.Input[dict] app_settings: A key-value pair of App Settings.
:param pulumi.Input[dict] auth_settings: A `auth_settings` block as defined below.
:param pulumi.Input[bool] client_affinity_enabled: Should the App Service send session affinity cookies, which route client requests in the same session to the same instance?
:param pulumi.Input[bool] client_cert_enabled: Does the App Service require client certificates for incoming requests? Defaults to `false`.
:param pulumi.Input[list] connection_strings: One or more `connection_string` blocks as defined below.
:param pulumi.Input[str] default_site_hostname: The Default Hostname associated with the App Service - such as `mysite.azurewebsites.net`
:param pulumi.Input[bool] enabled: Is the App Service Enabled?
:param pulumi.Input[bool] https_only: Can the App Service only be accessed via HTTPS? Defaults to `false`.
:param pulumi.Input[dict] identity: A Managed Service Identity block as defined below.
:param pulumi.Input[str] location: Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created.
:param pulumi.Input[dict] logs: A `logs` block as defined below.
:param pulumi.Input[str] name: Specifies the name of the App Service. Changing this forces a new resource to be created.
:param pulumi.Input[str] outbound_ip_addresses: A comma separated list of outbound IP addresses - such as `52.23.25.3,52.143.43.12`
:param pulumi.Input[str] possible_outbound_ip_addresses: A comma separated list of outbound IP addresses - such as `52.23.25.3,52.143.43.12,52.143.43.17` - not all of which are necessarily in use. Superset of `outbound_ip_addresses`.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the App Service.
:param pulumi.Input[dict] site_config: A `site_config` block as defined below.
:param pulumi.Input[dict] site_credential: A `site_credential` block as defined below, which contains the site-level credentials used to publish to this App Service.
:param pulumi.Input[dict] source_control: A `source_control` block as defined below, which contains the Source Control information when `scm_type` is set to `LocalGit`.
:param pulumi.Input[list] storage_accounts: One or more `storage_account` blocks as defined below.
:param pulumi.Input[dict] tags: A mapping of tags to assign to the resource.
The **auth_settings** object supports the following:
* `activeDirectory` (`pulumi.Input[dict]`)
* `allowedAudiences` (`pulumi.Input[list]`)
* `client_id` (`pulumi.Input[str]`)
* `client_secret` (`pulumi.Input[str]`)
* `additionalLoginParams` (`pulumi.Input[dict]`)
* `allowedExternalRedirectUrls` (`pulumi.Input[list]`)
* `defaultProvider` (`pulumi.Input[str]`)
* `enabled` (`pulumi.Input[bool]`) - Is the App Service Enabled?
* `facebook` (`pulumi.Input[dict]`)
* `app_id` (`pulumi.Input[str]`)
* `appSecret` (`pulumi.Input[str]`)
* `oauthScopes` (`pulumi.Input[list]`)
* `google` (`pulumi.Input[dict]`)
* `client_id` (`pulumi.Input[str]`)
* `client_secret` (`pulumi.Input[str]`)
* `oauthScopes` (`pulumi.Input[list]`)
* `issuer` (`pulumi.Input[str]`)
* `microsoft` (`pulumi.Input[dict]`)
* `client_id` (`pulumi.Input[str]`)
* `client_secret` (`pulumi.Input[str]`)
* `oauthScopes` (`pulumi.Input[list]`)
* `runtimeVersion` (`pulumi.Input[str]`)
* `tokenRefreshExtensionHours` (`pulumi.Input[float]`)
* `tokenStoreEnabled` (`pulumi.Input[bool]`)
* `twitter` (`pulumi.Input[dict]`)
* `consumerKey` (`pulumi.Input[str]`)
* `consumerSecret` (`pulumi.Input[str]`)
* `unauthenticatedClientAction` (`pulumi.Input[str]`)
The **backup** object supports the following:
* `enabled` (`pulumi.Input[bool]`) - Is the App Service Enabled?
* `name` (`pulumi.Input[str]`) - Specifies the name of the App Service. Changing this forces a new resource to be created.
* `schedule` (`pulumi.Input[dict]`)
* `frequencyInterval` (`pulumi.Input[float]`)
* `frequencyUnit` (`pulumi.Input[str]`)
* `keepAtLeastOneBackup` (`pulumi.Input[bool]`)
* `retentionPeriodInDays` (`pulumi.Input[float]`)
* `startTime` (`pulumi.Input[str]`)
* `storageAccountUrl` (`pulumi.Input[str]`)
The **connection_strings** object supports the following:
* `name` (`pulumi.Input[str]`) - Specifies the name of the App Service. Changing this forces a new resource to be created.
* `type` (`pulumi.Input[str]`)
* `value` (`pulumi.Input[str]`)
The **identity** object supports the following:
* `identityIds` (`pulumi.Input[list]`)
* `principalId` (`pulumi.Input[str]`) - The Principal ID for the Service Principal associated with the Managed Service Identity of this App Service.
* `tenantId` (`pulumi.Input[str]`) - The Tenant ID for the Service Principal associated with the Managed Service Identity of this App Service.
* `type` (`pulumi.Input[str]`)
The **logs** object supports the following:
* `applicationLogs` (`pulumi.Input[dict]`)
* `azureBlobStorage` (`pulumi.Input[dict]`)
* `level` (`pulumi.Input[str]`)
* `retentionInDays` (`pulumi.Input[float]`)
* `sasUrl` (`pulumi.Input[str]`)
* `httpLogs` (`pulumi.Input[dict]`)
* `azureBlobStorage` (`pulumi.Input[dict]`)
* `retentionInDays` (`pulumi.Input[float]`)
* `sasUrl` (`pulumi.Input[str]`)
* `fileSystem` (`pulumi.Input[dict]`)
* `retentionInDays` (`pulumi.Input[float]`)
* `retentionInMb` (`pulumi.Input[float]`)
The **site_config** object supports the following:
* `alwaysOn` (`pulumi.Input[bool]`)
* `appCommandLine` (`pulumi.Input[str]`)
* `cors` (`pulumi.Input[dict]`)
* `allowedOrigins` (`pulumi.Input[list]`)
* `supportCredentials` (`pulumi.Input[bool]`)
* `defaultDocuments` (`pulumi.Input[list]`)
* `dotnetFrameworkVersion` (`pulumi.Input[str]`)
* `ftpsState` (`pulumi.Input[str]`)
* `http2Enabled` (`pulumi.Input[bool]`)
* `ipRestrictions` (`pulumi.Input[list]`)
* `ipAddress` (`pulumi.Input[str]`)
* `subnetMask` (`pulumi.Input[str]`)
* `virtualNetworkSubnetId` (`pulumi.Input[str]`)
* `javaContainer` (`pulumi.Input[str]`)
* `javaContainerVersion` (`pulumi.Input[str]`)
* `javaVersion` (`pulumi.Input[str]`)
* `linuxFxVersion` (`pulumi.Input[str]`)
* `localMysqlEnabled` (`pulumi.Input[bool]`)
* `managedPipelineMode` (`pulumi.Input[str]`)
* `minTlsVersion` (`pulumi.Input[str]`)
* `phpVersion` (`pulumi.Input[str]`)
* `pythonVersion` (`pulumi.Input[str]`)
* `remoteDebuggingEnabled` (`pulumi.Input[bool]`)
* `remoteDebuggingVersion` (`pulumi.Input[str]`)
* `scmType` (`pulumi.Input[str]`)
* `use32BitWorkerProcess` (`pulumi.Input[bool]`)
* `virtualNetworkName` (`pulumi.Input[str]`)
* `websocketsEnabled` (`pulumi.Input[bool]`)
* `windowsFxVersion` (`pulumi.Input[str]`)
The **site_credential** object supports the following:
* `password` (`pulumi.Input[str]`) - The password associated with the username, which can be used to publish to this App Service.
* `username` (`pulumi.Input[str]`) - The username which can be used to publish to this App Service
The **source_control** object supports the following:
* `branch` (`pulumi.Input[str]`) - Branch name of the Git repository for this App Service.
* `repoUrl` (`pulumi.Input[str]`) - URL of the Git repository for this App Service.
The **storage_accounts** object supports the following:
* `accessKey` (`pulumi.Input[str]`)
* `accountName` (`pulumi.Input[str]`)
* `mountPath` (`pulumi.Input[str]`)
* `name` (`pulumi.Input[str]`) - Specifies the name of the App Service. Changing this forces a new resource to be created.
* `shareName` (`pulumi.Input[str]`)
* `type` (`pulumi.Input[str]`)
> This content is derived from https://github.com/terraform-providers/terraform-provider-azurerm/blob/master/website/docs/r/app_service.html.markdown.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = dict()
__props__["app_service_plan_id"] = app_service_plan_id
__props__["app_settings"] = app_settings
__props__["auth_settings"] = auth_settings
__props__["backup"] = backup
__props__["client_affinity_enabled"] = client_affinity_enabled
__props__["client_cert_enabled"] = client_cert_enabled
__props__["connection_strings"] = connection_strings
__props__["default_site_hostname"] = default_site_hostname
__props__["enabled"] = enabled
__props__["https_only"] = https_only
__props__["identity"] = identity
__props__["location"] = location
__props__["logs"] = logs
__props__["name"] = name
__props__["outbound_ip_addresses"] = outbound_ip_addresses
__props__["possible_outbound_ip_addresses"] = possible_outbound_ip_addresses
__props__["resource_group_name"] = resource_group_name
__props__["site_config"] = site_config
__props__["site_credential"] = site_credential
__props__["source_control"] = source_control
__props__["storage_accounts"] = storage_accounts
__props__["tags"] = tags
return AppService(resource_name, opts=opts, __props__=__props__)
def translate_output_property(self, prop):
return tables._CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
def translate_input_property(self, prop):
return tables._SNAKE_TO_CAMEL_CASE_TABLE.get(prop) or prop
| 47.206625 | 498 | 0.613251 | 3,215 | 29,929 | 5.542457 | 0.099222 | 0.137662 | 0.091924 | 0.01706 | 0.800494 | 0.78203 | 0.768 | 0.754812 | 0.739379 | 0.727426 | 0 | 0.00416 | 0.261051 | 29,929 | 633 | 499 | 47.281201 | 0.801546 | 0.520331 | 0 | 0.018519 | 1 | 0 | 0.162051 | 0.044241 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037037 | false | 0.009259 | 0.055556 | 0.018519 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5db1913f711563c50a37a4af857df87079f56b03 | 7,533 | py | Python | userbot/plugins/tiktok.py | indianSammy07/Wolfuserbots | 9c56dde1f81cec9eb4dd85a369f3a1f8b99f0763 | [
"MIT"
] | 1 | 2021-07-06T19:12:56.000Z | 2021-07-06T19:12:56.000Z | userbot/plugins/tiktok.py | indianSammy07/Wolfuserbots | 9c56dde1f81cec9eb4dd85a369f3a1f8b99f0763 | [
"MIT"
] | null | null | null | userbot/plugins/tiktok.py | indianSammy07/Wolfuserbots | 9c56dde1f81cec9eb4dd85a369f3a1f8b99f0763 | [
"MIT"
] | 1 | 2021-07-06T19:12:57.000Z | 2021-07-06T19:12:57.000Z | """ tiktok downloaded plugin creted by @mrconfused and @sandy1709
idea by @IMperialxx
Dont edit credits """
import datetime
import asyncio
from telethon import events
from telethon.errors.rpcerrorlist import YouBlockedUserError, UserAlreadyParticipantError
from telethon.tl.functions.account import UpdateNotifySettingsRequest
from telethon.tl.functions.messages import ImportChatInviteRequest
from userbot.utils import admin_cmd, sudo_cmd
from userbot import CMD_HELP
@borg.on(admin_cmd("tti ?(.*)"))
async def _(event):
if event.fwd_from:
return
d_link = event.pattern_match.group(1)
if ".com" not in d_link:
await event.edit("` I need a link to download something pro.`**(._.)**")
return
else:
await event.edit("doownloading your video")
bot = "@HK_tiktok_BOT"
async with borg.conversation("@HK_tiktok_BOT") as conv:
try:
await conv.send_message(d_link)
cat1 = await conv.get_response()
details = await conv.get_response()
if details.text.startswith("Sorry"):
await borg.send_message(event.chat_id, "Sorry, something went wrong.")
return
cat2 = await conv.get_response()
cat3 = await conv.get_response()
await borg.send_file(event.chat_id, details, caption=details.text)
await event.delete()
except YouBlockedUserError:
await event.edit("**Error:** `unblock` @HK_tiktok_BOT `and retry!`")
@borg.on(admin_cmd("ttv ?(.*)"))
async def _(event):
if event.fwd_from:
return
d_link = event.pattern_match.group(1)
if ".com" not in d_link:
await event.edit("` I need a link to download something pro.`**(._.)**")
return
else:
await event.edit("doownloading your video")
bot = "@HK_tiktok_BOT"
async with borg.conversation("@HK_tiktok_BOT") as conv:
try:
await conv.send_message(d_link)
cat1 = await conv.get_response()
details = await conv.get_response()
if details.text.startswith("Sorry"):
await borg.send_message(event.chat_id, "Sorry, something went wrong.")
return
cat2 = await conv.get_response()
cat3 = await conv.get_response()
await borg.send_file(event.chat_id, cat3)
await event.delete()
except YouBlockedUserError:
await event.edit("**Error:** `unblock` @HK_tiktok_BOT `and retry!`")
@borg.on(admin_cmd("wttv ?(.*)"))
async def _(event):
if event.fwd_from:
return
d_link = event.pattern_match.group(1)
if ".com" not in d_link:
await event.edit("` I need a link to download something pro.`**(._.)**")
return
else:
await event.edit("doownloading your video")
bot = "@HK_tiktok_BOT"
async with borg.conversation("@HK_tiktok_BOT") as conv:
try:
await conv.send_message(d_link)
cat1 = await conv.get_response()
details = await conv.get_response()
if details.text.startswith("Sorry"):
await borg.send_message(event.chat_id, "Sorry, something went wrong.")
return
cat2 = await conv.get_response()
cat3 = await conv.get_response()
await borg.send_file(event.chat_id, cat2)
await event.delete()
except YouBlockedUserError:
await event.edit("**Error:** `unblock` @HK_tiktok_BOT `and retry!`")
@borg.on(sudo_cmd(pattern = "tti ?(.*)", allow_sudo=True))
async def _(event):
if event.fwd_from:
return
d_link = event.pattern_match.group(1)
await event.delete()
if ".com" not in d_link:
await event.reply("` I need a link to download something pro.`**(._.)**")
return
else:
sandy = await event.reply("downloading your video")
bot = "@HK_tiktok_BOT"
async with borg.conversation("@HK_tiktok_BOT") as conv:
try:
await conv.send_message(d_link)
cat1 = await conv.get_response()
details = await conv.get_response()
if details.text.startswith("Sorry"):
await borg.send_message(event.chat_id, "Sorry, something went wrong.")
return
cat2 = await conv.get_response()
cat3 = await conv.get_response()
await borg.send_file(event.chat_id, details, caption=details.text)
await sandy.delete()
except YouBlockedUserError:
await event.edit("**Error:** `unblock` @HK_tiktok_BOT `and retry!`")
@borg.on(sudo_cmd(pattern = "ttv ?(.*)", allow_sudo=True))
async def _(event):
if event.fwd_from:
return
d_link = event.pattern_match.group(1)
await event.delete()
if ".com" not in d_link:
await event.reply("` I need a link to download something pro.`**(._.)**")
return
else:
sandy = await event.reply("doownloading your video")
bot = "@HK_tiktok_BOT"
async with borg.conversation("@HK_tiktok_BOT") as conv:
try:
await conv.send_message(d_link)
cat1 = await conv.get_response()
details = await conv.get_response()
if details.text.startswith("Sorry"):
await borg.send_message(event.chat_id, "Sorry, something went wrong.")
return
cat2 = await conv.get_response()
cat3 = await conv.get_response()
await borg.send_file(event.chat_id, cat3)
await sandy.delete()
except YouBlockedUserError:
await event.edit("**Error:** `unblock` @HK_tiktok_BOT `and retry!`")
@borg.on(sudo_cmd(pattern="wttv ?(.*)", allow_sudo=True))
async def _(event):
if event.fwd_from:
return
d_link = event.pattern_match.group(1)
await event.delete()
if ".com" not in d_link:
await event.reply("`I need a link to download something, pro.` **(._.)**")
return
else:
sandy = await event.reply("Downloading your video...")
bot = "@HK_tiktok_BOT"
async with borg.conversation("@HK_tiktok_BOT") as conv:
try:
await conv.send_message(d_link)
cat1 = await conv.get_response()
details = await conv.get_response()
if details.text.startswith("Sorry"):
await borg.send_message(event.chat_id, "Sorry, something went wrong!")
return
cat2 = await conv.get_response()
cat3 = await conv.get_response()
await borg.send_file(event.chat_id, cat2)
await sandy.delete()
except YouBlockedUserError:
await event.edit("**Error:** `unblock` @HK_tiktok_BOT `and retry!`")
CMD_HELP.update({"tiktok": "`.tti` <link>\
\nUSAGE: Shows you the information of the given TikTok video link.\
\n\n`.ttv` <link>\
\nUSAGE: Sends you the TikTok video of the given link without a watermark.\
\n\n`.wttv` <link>\
\nUSAGE: Sends you the TikTok video of the given link with a watermark.\
"
})
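The three handlers above all gate on `".com" in d_link`, which accepts almost any text that happens to contain ".com". A stricter, standalone check could be sketched as below; `is_tiktok_link` is a hypothetical helper (not part of the plugin) built only on the standard-library `urllib.parse`:

```python
from urllib.parse import urlparse

def is_tiktok_link(text: str) -> bool:
    """Return True only for http(s) URLs whose host is a tiktok.com domain."""
    parsed = urlparse(text.strip())
    if parsed.scheme not in ("http", "https"):
        return False
    host = (parsed.hostname or "").lower()
    # Accept tiktok.com itself and any subdomain such as www. or vm.
    return host == "tiktok.com" or host.endswith(".tiktok.com")
```

A handler could then replace the `".com" not in d_link` test with `not is_tiktok_link(d_link)` and reject non-TikTok input before opening the conversation.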
5d0034ec6c8ccec078dedfb7f60712ebdc4b7f7f | 175 | py | Python | astropy/modeling/setup_package.py | xiaomi1122/astropy | 8876e902f5efa02a3fc27d82fe15c16001d4df5e | ["BSD-3-Clause"] | null | null | null | astropy/modeling/setup_package.py | xiaomi1122/astropy | 8876e902f5efa02a3fc27d82fe15c16001d4df5e | ["BSD-3-Clause"] | null | null | null | astropy/modeling/setup_package.py | xiaomi1122/astropy | 8876e902f5efa02a3fc27d82fe15c16001d4df5e | ["BSD-3-Clause"] | null | null | null |
# Licensed under a 3-clause BSD style license - see LICENSE.rst
def get_package_data():
return {'astropy.modeling.tests': ['data/*.fits', '../../wcs/tests/maps/*.hdr']}
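The entries returned by `get_package_data` are shell-style glob patterns resolved relative to the named package. A quick standalone illustration of how such patterns match filenames (using only `fnmatch`, independent of astropy's build machinery):

```python
from fnmatch import fnmatch

# The same patterns returned by get_package_data above.
patterns = ['data/*.fits', '../../wcs/tests/maps/*.hdr']

# '*.fits' matches FITS files under data/ but not other extensions.
print(fnmatch('data/test.fits', patterns[0]))
print(fnmatch('data/test.txt', patterns[0]))
```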
5d256bac4937e2fa3d98b7bf6692e898ad08d39e | 9365 | py | Python | test/jit/test_scriptmod_ann.py | xiaohanhuang/pytorch | a31aea8eaa99a5ff72b5d002c206cd68d5467a5e | ["Intel"] | 183 | 2018-04-06T21:10:36.000Z | 2022-03-30T15:05:24.000Z | test/jit/test_scriptmod_ann.py | xiaohanhuang/pytorch | a31aea8eaa99a5ff72b5d002c206cd68d5467a5e | ["Intel"] | 818 | 2020-02-07T02:36:44.000Z | 2022-03-31T23:49:44.000Z | test/jit/test_scriptmod_ann.py | xiaohanhuang/pytorch | a31aea8eaa99a5ff72b5d002c206cd68d5467a5e | ["Intel"] | 58 | 2018-06-05T16:40:18.000Z | 2022-03-16T15:37:29.000Z |
# Owner(s): ["oncall: jit"]
import os
import sys
import warnings
import torch
from typing import List, Dict, Optional
# Make the helper files in test/ importable
pytorch_test_dir = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
sys.path.append(pytorch_test_dir)
from torch.testing._internal.jit_utils import JitTestCase
if __name__ == '__main__':
raise RuntimeError("This test file is not meant to be run directly, use:\n\n"
"\tpython test/test_jit.py TESTNAME\n\n"
"instead.")
class TestScriptModuleInstanceAttributeTypeAnnotation(JitTestCase):
# NB: There are no tests for `Tuple` or `NamedTuple` here. In fact,
# reassigning a non-empty Tuple to an attribute previously typed
# as containing an empty Tuple SHOULD fail. See note in `_check.py`
def test_annotated_falsy_base_type(self):
class M(torch.nn.Module):
def __init__(self):
super().__init__()
self.x: int = 0
def forward(self, x: int):
self.x = x
return 1
with warnings.catch_warnings(record=True) as w:
self.checkModule(M(), (1,))
assert len(w) == 0
def test_annotated_nonempty_container(self):
class M(torch.nn.Module):
def __init__(self):
super().__init__()
self.x: List[int] = [1, 2, 3]
def forward(self, x: List[int]):
self.x = x
return 1
with warnings.catch_warnings(record=True) as w:
self.checkModule(M(), ([1, 2, 3],))
assert len(w) == 0
def test_annotated_empty_tensor(self):
class M(torch.nn.Module):
def __init__(self):
super(M, self).__init__()
self.x: torch.Tensor = torch.empty(0)
def forward(self, x: torch.Tensor):
self.x = x
return self.x
with warnings.catch_warnings(record=True) as w:
self.checkModule(M(), (torch.rand(2, 3),))
assert len(w) == 0
def test_annotated_with_jit_attribute(self):
class M(torch.nn.Module):
def __init__(self):
super(M, self).__init__()
self.x = torch.jit.Attribute([], List[int])
def forward(self, x: List[int]):
self.x = x
return self.x
with warnings.catch_warnings(record=True) as w:
self.checkModule(M(), ([1, 2, 3],))
assert len(w) == 0
def test_annotated_class_level_annotation_only(self):
class M(torch.nn.Module):
x: List[int]
def __init__(self):
super().__init__()
self.x = []
def forward(self, y: List[int]):
self.x = y
return self.x
with warnings.catch_warnings(record=True) as w:
self.checkModule(M(), ([1, 2, 3],))
assert len(w) == 0
def test_annotated_class_level_annotation_and_init_annotation(self):
class M(torch.nn.Module):
x: List[int]
def __init__(self):
super().__init__()
self.x: List[int] = []
def forward(self, y: List[int]):
self.x = y
return self.x
with warnings.catch_warnings(record=True) as w:
self.checkModule(M(), ([1, 2, 3],))
assert len(w) == 0
def test_annotated_class_level_jit_annotation(self):
class M(torch.nn.Module):
x: List[int]
def __init__(self):
super().__init__()
self.x: List[int] = torch.jit.annotate(List[int], [])
def forward(self, y: List[int]):
self.x = y
return self.x
with warnings.catch_warnings(record=True) as w:
self.checkModule(M(), ([1, 2, 3],))
assert len(w) == 0
def test_annotated_empty_list(self):
class M(torch.nn.Module):
def __init__(self):
super().__init__()
self.x: List[int] = []
def forward(self, x: List[int]):
self.x = x
return 1
with self.assertRaisesRegexWithHighlight(RuntimeError,
"Tried to set nonexistent attribute",
"self.x = x"):
with self.assertWarnsRegex(UserWarning, "doesn't support "
"instance-level annotations on "
"empty non-base types"):
torch.jit.script(M())
def test_annotated_empty_dict(self):
class M(torch.nn.Module):
def __init__(self):
super().__init__()
self.x: Dict[str, int] = {}
def forward(self, x: Dict[str, int]):
self.x = x
return 1
with self.assertRaisesRegexWithHighlight(RuntimeError,
"Tried to set nonexistent attribute",
"self.x = x"):
with self.assertWarnsRegex(UserWarning, "doesn't support "
"instance-level annotations on "
"empty non-base types"):
torch.jit.script(M())
def test_annotated_empty_optional(self):
class M(torch.nn.Module):
def __init__(self):
super().__init__()
self.x: Optional[str] = None
def forward(self, x: Optional[str]):
self.x = x
return 1
with self.assertRaisesRegexWithHighlight(RuntimeError,
"Wrong type for attribute assignment",
"self.x = x"):
with self.assertWarnsRegex(UserWarning, "doesn't support "
"instance-level annotations on "
"empty non-base types"):
torch.jit.script(M())
def test_annotated_with_jit_empty_list(self):
class M(torch.nn.Module):
def __init__(self):
super().__init__()
self.x = torch.jit.annotate(List[int], [])
def forward(self, x: List[int]):
self.x = x
return 1
with self.assertRaisesRegexWithHighlight(RuntimeError,
"Tried to set nonexistent attribute",
"self.x = x"):
with self.assertWarnsRegex(UserWarning, "doesn't support "
"instance-level annotations on "
"empty non-base types"):
torch.jit.script(M())
def test_annotated_with_jit_empty_dict(self):
class M(torch.nn.Module):
def __init__(self):
super().__init__()
self.x = torch.jit.annotate(Dict[str, int], {})
def forward(self, x: Dict[str, int]):
self.x = x
return 1
with self.assertRaisesRegexWithHighlight(RuntimeError,
"Tried to set nonexistent attribute",
"self.x = x"):
with self.assertWarnsRegex(UserWarning, "doesn't support "
"instance-level annotations on "
"empty non-base types"):
torch.jit.script(M())
def test_annotated_with_jit_empty_optional(self):
class M(torch.nn.Module):
def __init__(self):
super().__init__()
self.x = torch.jit.annotate(Optional[str], None)
def forward(self, x: Optional[str]):
self.x = x
return 1
with self.assertRaisesRegexWithHighlight(RuntimeError,
"Wrong type for attribute assignment",
"self.x = x"):
with self.assertWarnsRegex(UserWarning, "doesn't support "
"instance-level annotations on "
"empty non-base types"):
torch.jit.script(M())
def test_annotated_with_torch_jit_import(self):
from torch import jit
class M(torch.nn.Module):
def __init__(self):
super().__init__()
self.x = jit.annotate(Optional[str], None)
def forward(self, x: Optional[str]):
self.x = x
return 1
with self.assertRaisesRegexWithHighlight(RuntimeError,
"Wrong type for attribute assignment",
"self.x = x"):
with self.assertWarnsRegex(UserWarning, "doesn't support "
"instance-level annotations on "
"empty non-base types"):
torch.jit.script(M())
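The tests above hinge on where an attribute annotation is recorded. Even outside TorchScript, plain CPython only keeps class-level annotations in the class's `__annotations__`; an annotated assignment like `self.x: List[int] = []` inside `__init__` leaves no runtime trace on the class. A torch-free sketch of that distinction (hypothetical classes `M1`/`M2`, standard library only):

```python
from typing import List

class M1:
    # Class-level annotation: recorded in M1.__annotations__ at runtime.
    x: List[int]

    def __init__(self) -> None:
        self.x = []

class M2:
    def __init__(self) -> None:
        # Instance-level annotation: annotates this assignment only and
        # is not reflected in M2.__annotations__.
        self.x: List[int] = []

print("x" in getattr(M1, "__annotations__", {}))  # class-level: visible
print("x" in getattr(M2, "__annotations__", {}))  # instance-level: invisible
```

This is why the passing tests above rely on class-level annotations, `torch.jit.Attribute`, or non-empty initial values, while purely instance-level annotations on empty non-base types trigger the warning and scripting failure.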
5d3f2e953958ae70d6caeea500d85580cce77486 | 53888 | py | Python | msgraph-cli-extensions/beta/bookings_beta/azext_bookings_beta/generated/custom.py | thewahome/msgraph-cli | 33127d9efa23a0e5f5303c93242fbdbb73348671 | ["MIT"] | null | null | null | msgraph-cli-extensions/beta/bookings_beta/azext_bookings_beta/generated/custom.py | thewahome/msgraph-cli | 33127d9efa23a0e5f5303c93242fbdbb73348671 | ["MIT"] | null | null | null | msgraph-cli-extensions/beta/bookings_beta/azext_bookings_beta/generated/custom.py | thewahome/msgraph-cli | 33127d9efa23a0e5f5303c93242fbdbb73348671 | ["MIT"] | null | null | null |
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: disable=too-many-lines
def bookings_booking_business_booking_business_create_booking_business(client,
id_=None,
display_name=None,
address=None,
business_hours=None,
business_type=None,
default_currency_iso=None,
email=None,
is_published=None,
phone=None,
public_url=None,
scheduling_policy=None,
web_site_url=None,
appointments=None,
calendar_view=None,
customers=None,
services=None,
staff_members=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['address'] = address
body['business_hours'] = business_hours
body['business_type'] = business_type
body['default_currency_iso'] = default_currency_iso
body['email'] = email
body['is_published'] = is_published
body['phone'] = phone
body['public_url'] = public_url
body['scheduling_policy'] = scheduling_policy
body['web_site_url'] = web_site_url
body['appointments'] = appointments
body['calendar_view'] = calendar_view
body['customers'] = customers
body['services'] = services
body['staff_members'] = staff_members
return client.create_booking_business(body=body)
def bookings_booking_business_booking_business_delete_booking_business(client,
booking_business_id,
if_match=None):
return client.delete_booking_business(booking_business_id=booking_business_id,
if_match=if_match)
def bookings_booking_business_booking_business_list_booking_business(client,
orderby=None,
select=None,
expand=None):
return client.list_booking_business(orderby=orderby,
select=select,
expand=expand)
def bookings_booking_business_booking_business_show_booking_business(client,
booking_business_id,
select=None,
expand=None):
return client.get_booking_business(booking_business_id=booking_business_id,
select=select,
expand=expand)
def bookings_booking_business_booking_business_update_booking_business(client,
booking_business_id,
id_=None,
display_name=None,
address=None,
business_hours=None,
business_type=None,
default_currency_iso=None,
email=None,
is_published=None,
phone=None,
public_url=None,
scheduling_policy=None,
web_site_url=None,
appointments=None,
calendar_view=None,
customers=None,
services=None,
staff_members=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['address'] = address
body['business_hours'] = business_hours
body['business_type'] = business_type
body['default_currency_iso'] = default_currency_iso
body['email'] = email
body['is_published'] = is_published
body['phone'] = phone
body['public_url'] = public_url
body['scheduling_policy'] = scheduling_policy
body['web_site_url'] = web_site_url
body['appointments'] = appointments
body['calendar_view'] = calendar_view
body['customers'] = customers
body['services'] = services
body['staff_members'] = staff_members
return client.update_booking_business(booking_business_id=booking_business_id,
body=body)
def bookings_booking_business_create_appointment(client,
booking_business_id,
id_=None,
additional_information=None,
customer_email_address=None,
customer_id=None,
customer_name=None,
customer_notes=None,
customer_phone=None,
duration=None,
end=None,
invoice_amount=None,
invoice_date=None,
invoice_id=None,
invoice_status=None,
invoice_url=None,
is_location_online=None,
online_meeting_url=None,
opt_out_of_customer_email=None,
post_buffer=None,
pre_buffer=None,
price=None,
price_type=None,
reminders=None,
self_service_appointment_id=None,
service_id=None,
service_name=None,
service_notes=None,
staff_member_ids=None,
start=None,
address=None,
coordinates=None,
display_name=None,
location_email_address=None,
location_type=None,
location_uri=None,
unique_id=None,
unique_id_type=None,
microsoft_graph_physical_address=None,
microsoft_graph_outlook_geo_coordinates=None,
microsoft_graph_location_display_name=None,
microsoft_graph_location_email_address_location_email_address=None,
microsoft_graph_location_type=None,
microsoft_graph_location_uri=None,
microsoft_graph_location_unique_id=None,
microsoft_graph_location_unique_id_type_unique_id_type=None):
body = {}
body['id'] = id_
body['additional_information'] = additional_information
body['customer_email_address'] = customer_email_address
body['customer_id'] = customer_id
body['customer_name'] = customer_name
body['customer_notes'] = customer_notes
body['customer_phone'] = customer_phone
body['duration'] = duration
body['end'] = end
body['invoice_amount'] = invoice_amount
body['invoice_date'] = invoice_date
body['invoice_id'] = invoice_id
body['invoice_status'] = invoice_status
body['invoice_url'] = invoice_url
body['is_location_online'] = is_location_online
body['online_meeting_url'] = online_meeting_url
body['opt_out_of_customer_email'] = opt_out_of_customer_email
body['post_buffer'] = post_buffer
body['pre_buffer'] = pre_buffer
body['price'] = price
body['price_type'] = price_type
body['reminders'] = reminders
body['self_service_appointment_id'] = self_service_appointment_id
body['service_id'] = service_id
body['service_name'] = service_name
body['service_notes'] = service_notes
body['staff_member_ids'] = staff_member_ids
body['start'] = start
body['service_location'] = {}
body['service_location']['address'] = address
body['service_location']['coordinates'] = coordinates
body['service_location']['display_name'] = display_name
body['service_location']['location_email_address'] = location_email_address
body['service_location']['location_type'] = location_type
body['service_location']['location_uri'] = location_uri
body['service_location']['unique_id'] = unique_id
body['service_location']['unique_id_type'] = unique_id_type
body['customer_location'] = {}
body['customer_location']['address'] = microsoft_graph_physical_address
body['customer_location']['coordinates'] = microsoft_graph_outlook_geo_coordinates
body['customer_location']['display_name'] = microsoft_graph_location_display_name
body['customer_location']['location_email_address'] = microsoft_graph_location_email_address_location_email_address
body['customer_location']['location_type'] = microsoft_graph_location_type
body['customer_location']['location_uri'] = microsoft_graph_location_uri
body['customer_location']['unique_id'] = microsoft_graph_location_unique_id
body['customer_location']['unique_id_type'] = microsoft_graph_location_unique_id_type_unique_id_type
return client.create_appointments(booking_business_id=booking_business_id,
body=body)
def bookings_booking_business_create_calendar_view(client,
booking_business_id,
id_=None,
additional_information=None,
customer_email_address=None,
customer_id=None,
customer_name=None,
customer_notes=None,
customer_phone=None,
duration=None,
end=None,
invoice_amount=None,
invoice_date=None,
invoice_id=None,
invoice_status=None,
invoice_url=None,
is_location_online=None,
online_meeting_url=None,
opt_out_of_customer_email=None,
post_buffer=None,
pre_buffer=None,
price=None,
price_type=None,
reminders=None,
self_service_appointment_id=None,
service_id=None,
service_name=None,
service_notes=None,
staff_member_ids=None,
start=None,
address=None,
coordinates=None,
display_name=None,
location_email_address=None,
location_type=None,
location_uri=None,
unique_id=None,
unique_id_type=None,
microsoft_graph_physical_address=None,
microsoft_graph_outlook_geo_coordinates=None,
microsoft_graph_location_display_name=None,
microsoft_graph_location_email_address_location_email_address=None,
microsoft_graph_location_type=None,
microsoft_graph_location_uri=None,
microsoft_graph_location_unique_id=None,
microsoft_graph_location_unique_id_type_unique_id_type=None):
body = {}
body['id'] = id_
body['additional_information'] = additional_information
body['customer_email_address'] = customer_email_address
body['customer_id'] = customer_id
body['customer_name'] = customer_name
body['customer_notes'] = customer_notes
body['customer_phone'] = customer_phone
body['duration'] = duration
body['end'] = end
body['invoice_amount'] = invoice_amount
body['invoice_date'] = invoice_date
body['invoice_id'] = invoice_id
body['invoice_status'] = invoice_status
body['invoice_url'] = invoice_url
body['is_location_online'] = is_location_online
body['online_meeting_url'] = online_meeting_url
body['opt_out_of_customer_email'] = opt_out_of_customer_email
body['post_buffer'] = post_buffer
body['pre_buffer'] = pre_buffer
body['price'] = price
body['price_type'] = price_type
body['reminders'] = reminders
body['self_service_appointment_id'] = self_service_appointment_id
body['service_id'] = service_id
body['service_name'] = service_name
body['service_notes'] = service_notes
body['staff_member_ids'] = staff_member_ids
body['start'] = start
body['service_location'] = {}
body['service_location']['address'] = address
body['service_location']['coordinates'] = coordinates
body['service_location']['display_name'] = display_name
body['service_location']['location_email_address'] = location_email_address
body['service_location']['location_type'] = location_type
body['service_location']['location_uri'] = location_uri
body['service_location']['unique_id'] = unique_id
body['service_location']['unique_id_type'] = unique_id_type
body['customer_location'] = {}
body['customer_location']['address'] = microsoft_graph_physical_address
body['customer_location']['coordinates'] = microsoft_graph_outlook_geo_coordinates
body['customer_location']['display_name'] = microsoft_graph_location_display_name
body['customer_location']['location_email_address'] = microsoft_graph_location_email_address_location_email_address
body['customer_location']['location_type'] = microsoft_graph_location_type
body['customer_location']['location_uri'] = microsoft_graph_location_uri
body['customer_location']['unique_id'] = microsoft_graph_location_unique_id
body['customer_location']['unique_id_type'] = microsoft_graph_location_unique_id_type_unique_id_type
return client.create_calendar_view(booking_business_id=booking_business_id,
body=body)
def bookings_booking_business_create_customer(client,
booking_business_id,
id_=None,
display_name=None,
email_address=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['email_address'] = email_address
return client.create_customers(booking_business_id=booking_business_id,
body=body)
def bookings_booking_business_create_service(client,
booking_business_id,
id_=None,
display_name=None,
additional_information=None,
default_duration=None,
default_price=None,
default_price_type=None,
default_reminders=None,
description=None,
is_hidden_from_customers=None,
is_location_online=None,
notes=None,
post_buffer=None,
pre_buffer=None,
scheduling_policy=None,
staff_member_ids=None,
address=None,
coordinates=None,
microsoft_graph_location_display_name=None,
location_email_address=None,
location_type=None,
location_uri=None,
unique_id=None,
unique_id_type=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['additional_information'] = additional_information
body['default_duration'] = default_duration
body['default_price'] = default_price
body['default_price_type'] = default_price_type
body['default_reminders'] = default_reminders
body['description'] = description
body['is_hidden_from_customers'] = is_hidden_from_customers
body['is_location_online'] = is_location_online
body['notes'] = notes
body['post_buffer'] = post_buffer
body['pre_buffer'] = pre_buffer
body['scheduling_policy'] = scheduling_policy
body['staff_member_ids'] = staff_member_ids
body['default_location'] = {}
body['default_location']['address'] = address
body['default_location']['coordinates'] = coordinates
body['default_location']['display_name'] = microsoft_graph_location_display_name
body['default_location']['location_email_address'] = location_email_address
body['default_location']['location_type'] = location_type
body['default_location']['location_uri'] = location_uri
body['default_location']['unique_id'] = unique_id
body['default_location']['unique_id_type'] = unique_id_type
return client.create_services(booking_business_id=booking_business_id,
body=body)
def bookings_booking_business_create_staff_member(client,
booking_business_id,
id_=None,
display_name=None,
email_address=None,
availability_is_affected_by_personal_calendar=None,
color_index=None,
role=None,
use_business_hours=None,
working_hours=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['email_address'] = email_address
body['availability_is_affected_by_personal_calendar'] = availability_is_affected_by_personal_calendar
body['color_index'] = color_index
body['role'] = role
body['use_business_hours'] = use_business_hours
body['working_hours'] = working_hours
return client.create_staff_members(booking_business_id=booking_business_id,
body=body)
def bookings_booking_business_delete_appointment(client,
booking_business_id,
booking_appointment_id,
if_match=None):
return client.delete_appointments(booking_business_id=booking_business_id,
booking_appointment_id=booking_appointment_id,
if_match=if_match)
def bookings_booking_business_delete_calendar_view(client,
booking_business_id,
booking_appointment_id,
if_match=None):
return client.delete_calendar_view(booking_business_id=booking_business_id,
booking_appointment_id=booking_appointment_id,
if_match=if_match)
def bookings_booking_business_delete_customer(client,
booking_business_id,
booking_customer_id,
if_match=None):
return client.delete_customers(booking_business_id=booking_business_id,
booking_customer_id=booking_customer_id,
if_match=if_match)
def bookings_booking_business_delete_service(client,
booking_business_id,
booking_service_id,
if_match=None):
return client.delete_services(booking_business_id=booking_business_id,
booking_service_id=booking_service_id,
if_match=if_match)
def bookings_booking_business_delete_staff_member(client,
booking_business_id,
booking_staff_member_id,
if_match=None):
return client.delete_staff_members(booking_business_id=booking_business_id,
booking_staff_member_id=booking_staff_member_id,
if_match=if_match)
def bookings_booking_business_list_appointment(client,
booking_business_id,
orderby=None,
select=None,
expand=None):
return client.list_appointments(booking_business_id=booking_business_id,
orderby=orderby,
select=select,
expand=expand)
def bookings_booking_business_list_calendar_view(client,
booking_business_id,
start,
end,
orderby=None,
select=None,
expand=None):
return client.list_calendar_view(booking_business_id=booking_business_id,
start=start,
end=end,
orderby=orderby,
select=select,
expand=expand)
def bookings_booking_business_list_customer(client,
booking_business_id,
orderby=None,
select=None,
expand=None):
return client.list_customers(booking_business_id=booking_business_id,
orderby=orderby,
select=select,
expand=expand)
def bookings_booking_business_list_service(client,
booking_business_id,
orderby=None,
select=None,
expand=None):
return client.list_services(booking_business_id=booking_business_id,
orderby=orderby,
select=select,
expand=expand)
def bookings_booking_business_list_staff_member(client,
booking_business_id,
orderby=None,
select=None,
expand=None):
return client.list_staff_members(booking_business_id=booking_business_id,
orderby=orderby,
select=select,
expand=expand)
def bookings_booking_business_publish(client,
booking_business_id):
return client.publish(booking_business_id=booking_business_id)
def bookings_booking_business_show_appointment(client,
booking_business_id,
booking_appointment_id,
select=None,
expand=None):
return client.get_appointments(booking_business_id=booking_business_id,
booking_appointment_id=booking_appointment_id,
select=select,
expand=expand)
def bookings_booking_business_show_calendar_view(client,
booking_business_id,
booking_appointment_id,
start,
end,
select=None,
expand=None):
return client.get_calendar_view(booking_business_id=booking_business_id,
booking_appointment_id=booking_appointment_id,
start=start,
end=end,
select=select,
expand=expand)
def bookings_booking_business_show_customer(client,
booking_business_id,
booking_customer_id,
select=None,
expand=None):
return client.get_customers(booking_business_id=booking_business_id,
booking_customer_id=booking_customer_id,
select=select,
expand=expand)
def bookings_booking_business_show_service(client,
booking_business_id,
booking_service_id,
select=None,
expand=None):
return client.get_services(booking_business_id=booking_business_id,
booking_service_id=booking_service_id,
select=select,
expand=expand)
def bookings_booking_business_show_staff_member(client,
booking_business_id,
booking_staff_member_id,
select=None,
expand=None):
return client.get_staff_members(booking_business_id=booking_business_id,
booking_staff_member_id=booking_staff_member_id,
select=select,
expand=expand)
def bookings_booking_business_unpublish(client,
booking_business_id):
return client.unpublish(booking_business_id=booking_business_id)
def bookings_booking_business_update_appointment(client,
booking_business_id,
booking_appointment_id,
id_=None,
additional_information=None,
customer_email_address=None,
customer_id=None,
customer_name=None,
customer_notes=None,
customer_phone=None,
duration=None,
end=None,
invoice_amount=None,
invoice_date=None,
invoice_id=None,
invoice_status=None,
invoice_url=None,
is_location_online=None,
online_meeting_url=None,
opt_out_of_customer_email=None,
post_buffer=None,
pre_buffer=None,
price=None,
price_type=None,
reminders=None,
self_service_appointment_id=None,
service_id=None,
service_name=None,
service_notes=None,
staff_member_ids=None,
start=None,
address=None,
coordinates=None,
display_name=None,
location_email_address=None,
location_type=None,
location_uri=None,
unique_id=None,
unique_id_type=None,
microsoft_graph_physical_address=None,
microsoft_graph_outlook_geo_coordinates=None,
microsoft_graph_location_display_name=None,
microsoft_graph_location_email_address_location_email_address=None,
microsoft_graph_location_type=None,
microsoft_graph_location_uri=None,
microsoft_graph_location_unique_id=None,
microsoft_graph_location_unique_id_type_unique_id_type=None):
body = {}
body['id'] = id_
body['additional_information'] = additional_information
body['customer_email_address'] = customer_email_address
body['customer_id'] = customer_id
body['customer_name'] = customer_name
body['customer_notes'] = customer_notes
body['customer_phone'] = customer_phone
body['duration'] = duration
body['end'] = end
body['invoice_amount'] = invoice_amount
body['invoice_date'] = invoice_date
body['invoice_id'] = invoice_id
body['invoice_status'] = invoice_status
body['invoice_url'] = invoice_url
body['is_location_online'] = is_location_online
body['online_meeting_url'] = online_meeting_url
body['opt_out_of_customer_email'] = opt_out_of_customer_email
body['post_buffer'] = post_buffer
body['pre_buffer'] = pre_buffer
body['price'] = price
body['price_type'] = price_type
body['reminders'] = reminders
body['self_service_appointment_id'] = self_service_appointment_id
body['service_id'] = service_id
body['service_name'] = service_name
body['service_notes'] = service_notes
body['staff_member_ids'] = staff_member_ids
body['start'] = start
body['service_location'] = {}
body['service_location']['address'] = address
body['service_location']['coordinates'] = coordinates
body['service_location']['display_name'] = display_name
body['service_location']['location_email_address'] = location_email_address
body['service_location']['location_type'] = location_type
body['service_location']['location_uri'] = location_uri
body['service_location']['unique_id'] = unique_id
body['service_location']['unique_id_type'] = unique_id_type
body['customer_location'] = {}
body['customer_location']['address'] = microsoft_graph_physical_address
body['customer_location']['coordinates'] = microsoft_graph_outlook_geo_coordinates
body['customer_location']['display_name'] = microsoft_graph_location_display_name
body['customer_location']['location_email_address'] = microsoft_graph_location_email_address_location_email_address
body['customer_location']['location_type'] = microsoft_graph_location_type
body['customer_location']['location_uri'] = microsoft_graph_location_uri
body['customer_location']['unique_id'] = microsoft_graph_location_unique_id
body['customer_location']['unique_id_type'] = microsoft_graph_location_unique_id_type_unique_id_type
return client.update_appointments(booking_business_id=booking_business_id,
booking_appointment_id=booking_appointment_id,
body=body)
def bookings_booking_business_update_calendar_view(client,
booking_business_id,
booking_appointment_id,
id_=None,
additional_information=None,
customer_email_address=None,
customer_id=None,
customer_name=None,
customer_notes=None,
customer_phone=None,
duration=None,
end=None,
invoice_amount=None,
invoice_date=None,
invoice_id=None,
invoice_status=None,
invoice_url=None,
is_location_online=None,
online_meeting_url=None,
opt_out_of_customer_email=None,
post_buffer=None,
pre_buffer=None,
price=None,
price_type=None,
reminders=None,
self_service_appointment_id=None,
service_id=None,
service_name=None,
service_notes=None,
staff_member_ids=None,
start=None,
address=None,
coordinates=None,
display_name=None,
location_email_address=None,
location_type=None,
location_uri=None,
unique_id=None,
unique_id_type=None,
microsoft_graph_physical_address=None,
microsoft_graph_outlook_geo_coordinates=None,
microsoft_graph_location_display_name=None,
microsoft_graph_location_email_address_location_email_address=None,
microsoft_graph_location_type=None,
microsoft_graph_location_uri=None,
microsoft_graph_location_unique_id=None,
microsoft_graph_location_unique_id_type_unique_id_type=None):
body = {}
body['id'] = id_
body['additional_information'] = additional_information
body['customer_email_address'] = customer_email_address
body['customer_id'] = customer_id
body['customer_name'] = customer_name
body['customer_notes'] = customer_notes
body['customer_phone'] = customer_phone
body['duration'] = duration
body['end'] = end
body['invoice_amount'] = invoice_amount
body['invoice_date'] = invoice_date
body['invoice_id'] = invoice_id
body['invoice_status'] = invoice_status
body['invoice_url'] = invoice_url
body['is_location_online'] = is_location_online
body['online_meeting_url'] = online_meeting_url
body['opt_out_of_customer_email'] = opt_out_of_customer_email
body['post_buffer'] = post_buffer
body['pre_buffer'] = pre_buffer
body['price'] = price
body['price_type'] = price_type
body['reminders'] = reminders
body['self_service_appointment_id'] = self_service_appointment_id
body['service_id'] = service_id
body['service_name'] = service_name
body['service_notes'] = service_notes
body['staff_member_ids'] = staff_member_ids
body['start'] = start
body['service_location'] = {}
body['service_location']['address'] = address
body['service_location']['coordinates'] = coordinates
body['service_location']['display_name'] = display_name
body['service_location']['location_email_address'] = location_email_address
body['service_location']['location_type'] = location_type
body['service_location']['location_uri'] = location_uri
body['service_location']['unique_id'] = unique_id
body['service_location']['unique_id_type'] = unique_id_type
body['customer_location'] = {}
body['customer_location']['address'] = microsoft_graph_physical_address
body['customer_location']['coordinates'] = microsoft_graph_outlook_geo_coordinates
body['customer_location']['display_name'] = microsoft_graph_location_display_name
body['customer_location']['location_email_address'] = microsoft_graph_location_email_address_location_email_address
body['customer_location']['location_type'] = microsoft_graph_location_type
body['customer_location']['location_uri'] = microsoft_graph_location_uri
body['customer_location']['unique_id'] = microsoft_graph_location_unique_id
body['customer_location']['unique_id_type'] = microsoft_graph_location_unique_id_type_unique_id_type
return client.update_calendar_view(booking_business_id=booking_business_id,
booking_appointment_id=booking_appointment_id,
body=body)
def bookings_booking_business_update_customer(client,
booking_business_id,
booking_customer_id,
id_=None,
display_name=None,
email_address=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['email_address'] = email_address
return client.update_customers(booking_business_id=booking_business_id,
booking_customer_id=booking_customer_id,
body=body)
def bookings_booking_business_update_service(client,
booking_business_id,
booking_service_id,
id_=None,
display_name=None,
additional_information=None,
default_duration=None,
default_price=None,
default_price_type=None,
default_reminders=None,
description=None,
is_hidden_from_customers=None,
is_location_online=None,
notes=None,
post_buffer=None,
pre_buffer=None,
scheduling_policy=None,
staff_member_ids=None,
address=None,
coordinates=None,
microsoft_graph_location_display_name=None,
location_email_address=None,
location_type=None,
location_uri=None,
unique_id=None,
unique_id_type=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['additional_information'] = additional_information
body['default_duration'] = default_duration
body['default_price'] = default_price
body['default_price_type'] = default_price_type
body['default_reminders'] = default_reminders
body['description'] = description
body['is_hidden_from_customers'] = is_hidden_from_customers
body['is_location_online'] = is_location_online
body['notes'] = notes
body['post_buffer'] = post_buffer
body['pre_buffer'] = pre_buffer
body['scheduling_policy'] = scheduling_policy
body['staff_member_ids'] = staff_member_ids
body['default_location'] = {}
body['default_location']['address'] = address
body['default_location']['coordinates'] = coordinates
body['default_location']['display_name'] = microsoft_graph_location_display_name
body['default_location']['location_email_address'] = location_email_address
body['default_location']['location_type'] = location_type
body['default_location']['location_uri'] = location_uri
body['default_location']['unique_id'] = unique_id
body['default_location']['unique_id_type'] = unique_id_type
return client.update_services(booking_business_id=booking_business_id,
booking_service_id=booking_service_id,
body=body)
def bookings_booking_business_update_staff_member(client,
booking_business_id,
booking_staff_member_id,
id_=None,
display_name=None,
email_address=None,
availability_is_affected_by_personal_calendar=None,
color_index=None,
role=None,
use_business_hours=None,
working_hours=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['email_address'] = email_address
body['availability_is_affected_by_personal_calendar'] = availability_is_affected_by_personal_calendar
body['color_index'] = color_index
body['role'] = role
body['use_business_hours'] = use_business_hours
body['working_hours'] = working_hours
return client.update_staff_members(booking_business_id=booking_business_id,
booking_staff_member_id=booking_staff_member_id,
body=body)
def bookings_booking_business_appointment_cancel(client,
booking_business_id,
booking_appointment_id,
cancellation_message=None):
body = {}
body['cancellation_message'] = cancellation_message
return client.cancel(booking_business_id=booking_business_id,
booking_appointment_id=booking_appointment_id,
body=body)
def bookings_booking_business_calendar_view_cancel(client,
booking_business_id,
booking_appointment_id,
cancellation_message=None):
body = {}
body['cancellation_message'] = cancellation_message
return client.cancel(booking_business_id=booking_business_id,
booking_appointment_id=booking_appointment_id,
body=body)
def bookings_booking_currency_booking_currency_create_booking_currency(client,
id_=None,
symbol=None):
body = {}
body['id'] = id_
body['symbol'] = symbol
return client.create_booking_currency(body=body)
def bookings_booking_currency_booking_currency_delete_booking_currency(client,
booking_currency_id,
if_match=None):
return client.delete_booking_currency(booking_currency_id=booking_currency_id,
if_match=if_match)
def bookings_booking_currency_booking_currency_list_booking_currency(client,
orderby=None,
select=None,
expand=None):
return client.list_booking_currency(orderby=orderby,
select=select,
expand=expand)
def bookings_booking_currency_booking_currency_show_booking_currency(client,
booking_currency_id,
select=None,
expand=None):
return client.get_booking_currency(booking_currency_id=booking_currency_id,
select=select,
expand=expand)
def bookings_booking_currency_booking_currency_update_booking_currency(client,
booking_currency_id,
id_=None,
symbol=None):
body = {}
body['id'] = id_
body['symbol'] = symbol
return client.update_booking_currency(booking_currency_id=booking_currency_id,
body=body)
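# Note: the update helpers above place every key in the request body, including
# keys whose value was left as None. A minimal sketch (hypothetical helper, not
# part of the generated code) of dropping unset values so a PATCH only touches
# the fields the caller actually supplied:
def _drop_unset(body):
    """Recursively remove None values from a request body dict."""
    if isinstance(body, dict):
        return {k: _drop_unset(v) for k, v in body.items() if v is not None}
    return body
# e.g. _drop_unset({'price': None, 'service_location': {'address': None, 'display_name': 'HQ'}})
# returns {'service_location': {'display_name': 'HQ'}}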
# website_event_attendee_fields/tests/__init__.py (factorlibre/website-addons, MIT)
from . import test_backend
from . import test_security
from . import test_workflow
# osmchadjango/changeset/tests/test_changeset_views.py (jbronn/osmcha-django, BSD-2-Clause)
import json
from django.contrib.gis.geos import Polygon
from django.urls import reverse
from django.test import TestCase, override_settings
from social_django.models import UserSocialAuth
from rest_framework.test import APITestCase
from ...users.models import User
from ...supervise.models import BlacklistedUser
from ..models import SuspicionReasons, Tag, Changeset
from ..views import ChangesetListAPIView, PaginatedCSVRenderer
from .modelfactories import (
ChangesetFactory, SuspectChangesetFactory, GoodChangesetFactory,
HarmfulChangesetFactory, TagFactory, UserWhitelistFactory,
MappingTeamFactory
)
class TestChangesetListView(APITestCase):
def setUp(self):
SuspectChangesetFactory.create_batch(26)
ChangesetFactory.create_batch(26)
# list endpoints will not list Changesets with user=""
ChangesetFactory(user="")
self.user = User.objects.create_user(
username='test',
password='password',
email='a@a.com',
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='123123',
)
self.url = reverse('changeset:list')
def test_unauthenticated_changeset_list_response(self):
response = self.client.get(self.url)
self.assertEqual(response.status_code, 401)
def test_authenticated_changeset_list_response(self):
self.client.login(username=self.user.username, password='password')
response = self.client.get(self.url)
self.assertIn('user', response.data['features'][0]['properties'].keys())
self.assertIn('uid', response.data['features'][0]['properties'].keys())
self.assertIn(
'check_user',
response.data['features'][0]['properties'].keys()
)
def test_pagination(self):
self.client.login(username=self.user.username, password='password')
response = self.client.get(self.url, {'page': 2})
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['features']), 2)
self.assertEqual(response.data['count'], 52)
# test page_size parameter
response = self.client.get(self.url, {'page_size': 60})
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['features']), 52)
def test_user_filters(self):
"""Test filters in the changeset list view.
"""
self.client.login(username=self.user.username, password='password')
response = self.client.get(self.url, {'users': 'another_user'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
response = self.client.get(self.url, {'users': 'test'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 52)
response = self.client.get(self.url, {'checked_by': 'another_user'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
response = self.client.get(self.url, {'uids': '98978,43323'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
response = self.client.get(self.url, {'uids': '123123'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 52)
def test_area_lt_filter(self):
"""Test in_bbox in combination with area_lt filter field."""
ChangesetFactory(
bbox=Polygon([(0, 0), (0, 3), (3, 3), (3, 0), (0, 0)])
)
self.client.login(username=self.user.username, password='password')
response = self.client.get(self.url, {'in_bbox': '0,0,1,1', 'area_lt': 10})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 1)
response = self.client.get(self.url, {'in_bbox': '0,0,1,1', 'area_lt': 8})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
response = self.client.get(self.url, {'in_bbox': '0,0,2,2', 'area_lt': 3})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 1)
response = self.client.get(self.url, {'in_bbox': '0,0,2,2', 'area_lt': 2})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
def test_hide_whitelist_filter(self):
UserWhitelistFactory(user=self.user, whitelist_user='test')
# As all changesets in the DB are from a whitelisted user,
# the features count will be zero
self.client.login(username=self.user.username, password='password')
response = self.client.get(self.url, {'hide_whitelist': 'true'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
def test_blacklisted_filter(self):
"""Test getting changesets filtered by blacklisted users"""
ChangesetFactory(uid=444, user='the_user')
BlacklistedUser.objects.create(
uid='123123',
username='test',
added_by=self.user
)
self.client.login(username=self.user.username, password='password')
response = self.client.get(self.url, {'blacklist': 'false'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 53)
response = self.client.get(self.url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 53)
response = self.client.get(self.url, {'blacklist': 'true'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 52)
BlacklistedUser.objects.create(
uid='444',
username='the_user',
added_by=self.user
)
response = self.client.get(self.url, {'blacklist': 'true'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 53)
def test_mapping_team_filter(self):
ChangesetFactory(uid=444, user='the_user')
self.team = MappingTeamFactory(
name="TestCompany",
users=json.dumps([{
"username": "test",
"doj": "2017-02-13T00:00:00Z",
"uid": "123123",
"dol": ""
}])
)
self.team_2 = MappingTeamFactory(
name="MapCompany",
trusted=False,
users=json.dumps([{
"username": "the_user",
"doj": "2017-02-13T00:00:00Z",
"uid": "444",
"dol": ""
}])
)
self.client.login(username=self.user.username, password='password')
# mapping_teams tests
response = self.client.get(self.url, {'mapping_teams': self.team.name})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 52)
response = self.client.get(self.url, {'mapping_teams': self.team_2.name})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 1)
response = self.client.get(
self.url,
{'mapping_teams': 'TestCompany,MapCompany'}
)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 53)
response = self.client.get(
self.url,
{'mapping_teams': 'TestCompany, MapCompany'}
)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 53)
# exclude_teams tests
response = self.client.get(self.url, {'exclude_teams': self.team.name})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 1)
response = self.client.get(self.url, {'exclude_teams': self.team_2.name})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 52)
response = self.client.get(
self.url,
{'exclude_teams': 'TestCompany,MapCompany'}
)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
response = self.client.get(
self.url,
{'exclude_teams': 'TestCompany, MapCompany'}
)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
# exclude trusted teams
response = self.client.get(
self.url,
{'exclude_trusted_teams': 'true'}
)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 1)
def test_metadata_filters(self):
"""Test filters by metadata field in the changeset list view.
"""
self.client.login(username=self.user.username, password='password')
response = self.client.get(self.url, {'metadata': 'changesets_count__min=100'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
response = self.client.get(self.url, {'metadata': 'changesets_count__min=99', 'is_suspect': True})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 26)
def test_comments_count_filters(self):
"""Test filters by comments_count field in the changeset list view.
"""
self.client.login(username=self.user.username, password='password')
response = self.client.get(self.url, {'comments_count__gte': 2})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
response = self.client.get(self.url, {'comments_count__gte': 1})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 52)
response = self.client.get(self.url, {'comments_count__lte': 1})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 52)
response = self.client.get(self.url, {'comments_count__lte': 0})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 0)
def test_csv_renderer(self):
self.assertIn(
PaginatedCSVRenderer,
ChangesetListAPIView().renderer_classes
)
self.client.login(username=self.user.username, password='password')
response = self.client.get(self.url, {'format': 'csv', 'page_size': 60})
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['features']), 52)
response = self.client.get(
self.url,
{'is_suspect': 'true', 'format': 'csv'}
)
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['features']), 26)
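# Pagination arithmetic behind the test_pagination assertions above: 52
# changesets leave 2 features on page 2, which implies a page size of 50
# (the 50 is inferred from those assertions, not read from project settings).
_full_pages, _last_page_items = divmod(52, 50)
assert (_full_pages, _last_page_items) == (1, 2)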
class TestChangesetFilteredViews(APITestCase):
def setUp(self):
ChangesetFactory()
SuspectChangesetFactory()
HarmfulChangesetFactory()
GoodChangesetFactory()
self.user = User.objects.create_user(
username='test',
password='password',
email='a@a.com',
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='123123',
)
self.client.login(username=self.user.username, password='password')
def test_suspect_changesets_view(self):
url = reverse('changeset:suspect-list')
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 3)
def test_no_suspect_changesets_view(self):
url = reverse('changeset:no-suspect-list')
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 1)
def test_harmful_changesets_view(self):
url = reverse('changeset:harmful-list')
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 1)
def test_no_harmful_changesets_view(self):
url = reverse('changeset:no-harmful-list')
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 1)
def test_checked_changesets_view(self):
url = reverse('changeset:checked-list')
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 2)
self.assertTrue(
response.data['features'][0]['properties']['check_user'].startswith(
'user '
)
)
def test_unchecked_changesets_view(self):
url = reverse('changeset:unchecked-list')
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data['count'], 2)
class TestChangesetListViewOrdering(APITestCase):
def setUp(self):
self.user = User.objects.create_user(
username='test',
password='password',
email='a@a.com',
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='123123',
)
self.client.login(username=self.user.username, password='password')
HarmfulChangesetFactory.create_batch(
24, form_create=20, modify=2, delete=40, comments_count=3
)
GoodChangesetFactory.create_batch(24, form_create=1000, modify=20)
self.url = reverse('changeset:list')
def test_ordering(self):
# default ordering is by descending id
response = self.client.get(self.url)
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.all()]
)
# ascending id
response = self.client.get(self.url, {'order_by': 'id'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('id')]
)
# ascending date ordering
response = self.client.get(self.url, {'order_by': 'date'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('date')]
)
# descending date ordering
response = self.client.get(self.url, {'order_by': '-date'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('-date')]
)
# ascending check_date
response = self.client.get(self.url, {'order_by': 'check_date'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('check_date')]
)
# descending check_date ordering
response = self.client.get(self.url, {'order_by': '-check_date'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('-check_date')]
)
# ascending create ordering
response = self.client.get(self.url, {'order_by': 'create'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('create')]
)
# descending create ordering
response = self.client.get(self.url, {'order_by': '-create'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('-create')]
)
# ascending modify ordering
response = self.client.get(self.url, {'order_by': 'modify'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('modify')]
)
# descending modify ordering
response = self.client.get(self.url, {'order_by': '-modify'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('-modify')]
)
# ascending delete ordering
response = self.client.get(self.url, {'order_by': 'delete'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('delete')]
)
# descending delete ordering
response = self.client.get(self.url, {'order_by': '-delete'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('-delete')]
)
# ascending comments_count ordering
response = self.client.get(self.url, {'order_by': 'comments_count'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('comments_count')]
)
# descending comments_count ordering
response = self.client.get(self.url, {'order_by': '-comments_count'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.order_by('-comments_count')]
)
def test_invalid_ordering_field(self):
        # an invalid order_by value falls back to the default descending id ordering
response = self.client.get(self.url, {'order_by': 'user'})
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[i.id for i in Changeset.objects.all()]
)
def test_number_reasons_ordering(self):
changeset_1, changeset_2 = Changeset.objects.all()[:2]
self.reason_1 = SuspicionReasons.objects.create(name='possible import')
self.reason_1.changesets.add(changeset_1)
self.reason_2 = SuspicionReasons.objects.create(name='suspect word')
self.reason_2.changesets.add(changeset_1, changeset_2)
response = self.client.get(
self.url,
{'order_by': '-number_reasons', 'page_size': 2}
)
self.assertEqual(
[i['id'] for i in response.data.get('features')],
[changeset_1.id, changeset_2.id]
)
response = self.client.get(self.url, {'order_by': 'number_reasons'})
self.assertEqual(
[i['id'] for i in response.data.get('features')[-2:]],
[changeset_2.id, changeset_1.id]
)
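# A pure-Python illustration of the number_reasons ordering asserted above:
# under '-number_reasons', changesets with more suspicion reasons come first.
# (Plain data stand-ins; the view itself orders an annotated queryset.)
_changesets = [
    {'id': 1, 'reasons': ['possible import', 'suspect word']},
    {'id': 2, 'reasons': ['suspect word']},
    {'id': 3, 'reasons': []},
]
_descending = sorted(_changesets, key=lambda c: len(c['reasons']), reverse=True)
assert [c['id'] for c in _descending] == [1, 2, 3]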
class TestChangesetDetailView(APITestCase):
def setUp(self):
self.reason_1 = SuspicionReasons.objects.create(name='possible import')
self.reason_2 = SuspicionReasons.objects.create(
name='Big edit in my city',
is_visible=False
)
self.changeset = HarmfulChangesetFactory(
id=31982803,
new_features=[{
"osm_id": 87765444,
"url": "node-87765444",
"version": 44,
"reasons": [self.reason_1.id, self.reason_2.id],
"name": "Test"
}]
)
self.reason_1.changesets.add(self.changeset)
self.reason_2.changesets.add(self.changeset)
self.tag = Tag.objects.create(name='Vandalism')
self.tag.changesets.add(self.changeset)
def test_unauthenticated_changeset_detail_response(self):
response = self.client.get(
reverse('changeset:detail', args=[self.changeset.id])
)
self.assertEqual(response.status_code, 401)
def test_authenticated_changeset_detail_response(self):
self.user = User.objects.create_user(
username='test',
password='password',
email='a@a.com',
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='123123',
)
self.client.login(username=self.user.username, password='password')
response = self.client.get(
reverse('changeset:detail', args=[self.changeset.id])
)
self.assertEqual(response.status_code, 200)
self.assertEqual(self.changeset.uid, response.data['properties']['uid'])
self.assertEqual(self.changeset.user, response.data['properties']['user'])
self.assertEqual(response.data['properties']['comments_count'], 1)
self.assertEqual(
self.changeset.check_user.name,
response.data['properties']['check_user']
)
def test_changeset_detail_response_with_staff_user(self):
self.user = User.objects.create_user(
username='test',
password='password',
email='a@a.com',
is_staff=True
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='123123',
)
self.client.login(username=self.user.username, password='password')
response = self.client.get(
reverse('changeset:detail', args=[self.changeset.id])
)
self.assertEqual(response.status_code, 200)
self.assertEqual(
len(response.data['properties']['features']),
1
)
self.assertIn(
self.reason_2.id,
response.data['properties']['features'][0]['reasons']
)
class TestReasonsAndTagFieldsInChangesetViews(APITestCase):
def setUp(self):
self.admin_user = User.objects.create_user(
username='admin',
password='password',
email='a@a.com',
is_staff=True
)
UserSocialAuth.objects.create(
user=self.admin_user,
provider='openstreetmap',
uid='123123',
)
self.user = User.objects.create_user(
username='normal_user',
password='password',
email='a@a.com',
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='234312',
)
self.reason_1 = SuspicionReasons.objects.create(name='possible import')
self.reason_2 = SuspicionReasons.objects.create(name='suspect word')
self.reason_3 = SuspicionReasons.objects.create(
name='Big edit in my city',
is_visible=False
)
self.changeset = HarmfulChangesetFactory(id=31982803)
self.reason_1.changesets.add(self.changeset)
self.reason_2.changesets.add(self.changeset)
self.reason_3.changesets.add(self.changeset)
self.tag_1 = Tag.objects.create(name='Vandalism')
self.tag_2 = Tag.objects.create(
name='Vandalism in my city',
is_visible=False
)
self.tag_1.changesets.add(self.changeset)
self.tag_2.changesets.add(self.changeset)
def test_detail_view_by_normal_user(self):
self.client.login(username=self.user.username, password='password')
response = self.client.get(reverse('changeset:detail', args=[self.changeset.id]))
self.assertEqual(response.status_code, 200)
self.assertEqual(len(response.data['properties']['reasons']), 2)
self.assertEqual(len(response.data['properties']['tags']), 1)
self.assertIn(
{'id': self.reason_1.id, 'name': 'possible import'},
response.data['properties']['reasons']
)
self.assertIn(
{'id': self.reason_2.id, 'name': 'suspect word'},
response.data['properties']['reasons']
)
self.assertIn(
{'id': self.tag_1.id, 'name': 'Vandalism'},
response.data['properties']['tags']
)
def test_detail_view_by_admin(self):
self.client.login(username=self.admin_user.username, password='password')
response = self.client.get(reverse('changeset:detail', args=[self.changeset.id]))
self.assertEqual(response.status_code, 200)
self.assertIn(
{'id': self.reason_3.id, 'name': 'Big edit in my city'},
response.data['properties']['reasons']
)
self.assertEqual(len(response.data['properties']['reasons']), 3)
self.assertEqual(len(response.data['properties']['tags']), 2)
self.assertIn(
{'id': self.tag_2.id, 'name': 'Vandalism in my city'},
response.data['properties']['tags']
)
self.assertEqual(response.data.get('id'), 31982803)
self.assertIn('geometry', response.data.keys())
self.assertIn('properties', response.data.keys())
self.assertEqual(self.changeset.uid, response.data['properties']['uid'])
self.assertEqual(
self.changeset.editor,
response.data['properties']['editor']
)
self.assertEqual(self.changeset.user, response.data['properties']['user'])
self.assertEqual(
self.changeset.imagery_used,
response.data['properties']['imagery_used']
)
self.assertEqual(
self.changeset.source,
response.data['properties']['source']
)
self.assertEqual(
self.changeset.comment,
response.data['properties']['comment']
)
self.assertEqual(
self.changeset.create,
response.data['properties']['create']
)
self.assertEqual(
self.changeset.modify,
response.data['properties']['modify']
)
self.assertEqual(
self.changeset.delete,
response.data['properties']['delete']
)
self.assertEqual(
self.changeset.check_user.name,
response.data['properties']['check_user']
)
self.assertTrue(response.data['properties']['is_suspect'])
self.assertTrue(response.data['properties']['checked'])
self.assertTrue(response.data['properties']['harmful'])
self.assertIn('date', response.data['properties'].keys())
self.assertIn('check_date', response.data['properties'].keys())
self.assertEqual(len(response.data['properties']['features']), 0)
def test_list_view_by_normal_user(self):
self.client.login(username=self.user.username, password='password')
response = self.client.get(reverse('changeset:list'))
self.assertEqual(response.status_code, 200)
reasons = response.data['features'][0]['properties']['reasons']
tags = response.data['features'][0]['properties']['tags']
self.assertEqual(len(reasons), 2)
self.assertEqual(len(tags), 1)
self.assertIn(
{'id': self.reason_1.id, 'name': 'possible import'},
reasons
)
self.assertIn({'id': self.reason_2.id, 'name': 'suspect word'}, reasons)
self.assertIn({'id': self.tag_1.id, 'name': 'Vandalism'}, tags)
def test_list_view_by_admin(self):
self.client.login(username=self.admin_user.username, password='password')
response = self.client.get(reverse('changeset:list'))
self.assertEqual(response.status_code, 200)
reasons = response.data['features'][0]['properties']['reasons']
tags = response.data['features'][0]['properties']['tags']
self.assertEqual(len(reasons), 3)
self.assertEqual(len(tags), 2)
self.assertIn(
{'id': self.reason_3.id, 'name': 'Big edit in my city'},
reasons
)
self.assertIn(
{'id': self.tag_2.id, 'name': 'Vandalism in my city'},
tags
)
class TestCheckChangesetViews(APITestCase):
def setUp(self):
self.reason_1 = SuspicionReasons.objects.create(name='possible import')
self.reason_2 = SuspicionReasons.objects.create(name='suspect_word')
self.changeset = SuspectChangesetFactory(
id=31982803, user='test', uid='123123'
)
self.changeset_2 = SuspectChangesetFactory(
id=31982804, user='test2', uid='999999', editor='iD',
)
self.reason_1.changesets.add(self.changeset)
self.reason_2.changesets.add(self.changeset)
self.user = User.objects.create_user(
username='test',
password='password',
email='a@a.com'
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='123123',
extra_data={
'id': '123123',
'access_token': {
'oauth_token': 'aaaa',
'oauth_token_secret': 'bbbb'
}
}
)
self.tag_1 = TagFactory(name='Illegal import')
self.tag_2 = TagFactory(name='Vandalism')
def test_set_harmful_changeset_unlogged(self):
"""Anonymous users can't mark a changeset as harmful."""
response = self.client.put(
reverse('changeset:set-harmful', args=[self.changeset.pk])
)
self.assertEqual(response.status_code, 401)
self.changeset.refresh_from_db()
self.assertIsNone(self.changeset.harmful)
self.assertFalse(self.changeset.checked)
self.assertIsNone(self.changeset.check_user)
self.assertIsNone(self.changeset.check_date)
def test_set_good_changeset_unlogged(self):
"""Anonymous users can't mark a changeset as good."""
response = self.client.put(
reverse('changeset:set-good', args=[self.changeset.pk])
)
self.assertEqual(response.status_code, 401)
self.changeset.refresh_from_db()
self.assertIsNone(self.changeset.harmful)
self.assertFalse(self.changeset.checked)
self.assertIsNone(self.changeset.check_user)
self.assertIsNone(self.changeset.check_date)
def test_set_harmful_changeset_not_allowed(self):
"""User can't mark his own changeset as harmful."""
self.client.login(username=self.user.username, password='password')
response = self.client.put(
reverse('changeset:set-harmful', args=[self.changeset.pk])
)
self.assertEqual(response.status_code, 403)
self.changeset.refresh_from_db()
self.assertIsNone(self.changeset.harmful)
self.assertFalse(self.changeset.checked)
self.assertIsNone(self.changeset.check_user)
self.assertIsNone(self.changeset.check_date)
def test_set_good_changeset_not_allowed(self):
"""User can't mark his own changeset as good."""
self.client.login(username=self.user.username, password='password')
response = self.client.put(
reverse('changeset:set-good', args=[self.changeset.pk])
)
self.assertEqual(response.status_code, 403)
self.changeset.refresh_from_db()
self.assertIsNone(self.changeset.harmful)
self.assertFalse(self.changeset.checked)
self.assertIsNone(self.changeset.check_user)
self.assertIsNone(self.changeset.check_date)
def test_set_harmful_changeset_get(self):
"""GET is not an allowed method in the set_harmful URL."""
self.client.login(username=self.user.username, password='password')
response = self.client.get(
reverse('changeset:set-harmful', args=[self.changeset_2.pk]),
)
self.assertEqual(response.status_code, 405)
self.changeset_2.refresh_from_db()
self.assertIsNone(self.changeset_2.harmful)
self.assertFalse(self.changeset_2.checked)
self.assertIsNone(self.changeset_2.check_user)
self.assertIsNone(self.changeset_2.check_date)
def test_set_harmful_changeset_put(self):
"""User can set a changeset of another user as harmful with a PUT request.
We can also set the tags of the changeset sending it as data.
"""
self.client.login(username=self.user.username, password='password')
data = {'tags': [self.tag_1.id, self.tag_2.id]}
response = self.client.put(
reverse('changeset:set-harmful', args=[self.changeset_2.pk]),
data
)
self.assertEqual(response.status_code, 200)
self.changeset_2.refresh_from_db()
self.assertTrue(self.changeset_2.harmful)
self.assertTrue(self.changeset_2.checked)
self.assertEqual(self.changeset_2.check_user, self.user)
self.assertIsNotNone(self.changeset_2.check_date)
self.assertEqual(self.changeset_2.tags.count(), 2)
self.assertIn(
self.tag_1,
self.changeset_2.tags.all()
)
self.assertIn(
self.tag_2,
self.changeset_2.tags.all()
)
def test_set_harmful_changeset_with_invalid_tag_id(self):
"""Return a 400 error if a user try to add a invalid tag id to a changeset.
"""
self.client.login(username=self.user.username, password='password')
data = {'tags': [self.tag_1.id, 87765, 898986]}
response = self.client.put(
reverse('changeset:set-harmful', args=[self.changeset_2.pk]),
data
)
self.assertEqual(response.status_code, 400)
self.changeset_2.refresh_from_db()
self.assertIsNone(self.changeset_2.harmful)
self.assertFalse(self.changeset_2.checked)
self.assertIsNone(self.changeset_2.check_user)
self.assertIsNone(self.changeset_2.check_date)
self.assertEqual(self.changeset_2.tags.count(), 0)
def test_set_harmful_changeset_put_without_data(self):
"""Test marking a changeset as harmful without sending data (so the
changeset will not receive tags).
"""
self.client.login(username=self.user.username, password='password')
response = self.client.put(
reverse('changeset:set-harmful', args=[self.changeset_2.pk])
)
self.assertEqual(response.status_code, 200)
self.changeset_2.refresh_from_db()
self.assertTrue(self.changeset_2.harmful)
self.assertTrue(self.changeset_2.checked)
self.assertEqual(self.changeset_2.check_user, self.user)
self.assertIsNotNone(self.changeset_2.check_date)
self.assertEqual(self.changeset_2.tags.count(), 0)
def test_set_good_changeset_get(self):
"""GET is not an allowed method in the set_good URL."""
self.client.login(username=self.user.username, password='password')
response = self.client.get(
reverse('changeset:set-good', args=[self.changeset_2.pk]),
)
self.assertEqual(response.status_code, 405)
self.changeset_2.refresh_from_db()
self.assertIsNone(self.changeset_2.harmful)
self.assertFalse(self.changeset_2.checked)
self.assertIsNone(self.changeset_2.check_user)
self.assertIsNone(self.changeset_2.check_date)
def test_set_good_changeset_put(self):
"""User can set a changeset of another user as good with a PUT request.
We can also set the tags of the changeset sending it as data.
"""
self.client.login(username=self.user.username, password='password')
data = {'tags': [self.tag_1.id, self.tag_2.id]}
response = self.client.put(
reverse('changeset:set-good', args=[self.changeset_2.pk]),
data
)
self.assertEqual(response.status_code, 200)
self.changeset_2.refresh_from_db()
self.assertFalse(self.changeset_2.harmful)
self.assertTrue(self.changeset_2.checked)
self.assertEqual(self.changeset_2.check_user, self.user)
self.assertIsNotNone(self.changeset_2.check_date)
self.assertEqual(self.changeset_2.tags.count(), 2)
self.assertIn(
self.tag_1,
self.changeset_2.tags.all()
)
self.assertIn(
self.tag_2,
self.changeset_2.tags.all()
)
def test_set_good_changeset_with_invalid_tag_id(self):
"""Return a 400 error if a user try to add a invalid tag id to a changeset.
"""
self.client.login(username=self.user.username, password='password')
data = {'tags': [self.tag_1.id, 87765, 898986]}
response = self.client.put(
reverse('changeset:set-good', args=[self.changeset_2.pk]),
data
)
self.assertEqual(response.status_code, 400)
self.changeset_2.refresh_from_db()
self.assertIsNone(self.changeset_2.harmful)
self.assertFalse(self.changeset_2.checked)
self.assertIsNone(self.changeset_2.check_user)
self.assertIsNone(self.changeset_2.check_date)
self.assertEqual(self.changeset_2.tags.count(), 0)
def test_set_good_changeset_put_without_data(self):
"""Test marking a changeset as good without sending data (so the
changeset will not receive tags).
"""
self.client.login(username=self.user.username, password='password')
response = self.client.put(
reverse('changeset:set-good', args=[self.changeset_2.pk]),
)
self.assertEqual(response.status_code, 200)
self.changeset_2.refresh_from_db()
self.assertFalse(self.changeset_2.harmful)
self.assertTrue(self.changeset_2.checked)
self.assertEqual(self.changeset_2.check_user, self.user)
self.assertIsNotNone(self.changeset_2.check_date)
def test_404(self):
self.client.login(username=self.user.username, password='password')
response = self.client.put(
reverse('changeset:set-good', args=[4988787832]),
)
self.assertEqual(response.status_code, 404)
response = self.client.put(
reverse('changeset:set-harmful', args=[4988787832]),
)
self.assertEqual(response.status_code, 404)
def test_try_to_check_changeset_already_checked(self):
"""A PUT request to set_harmful or set_good urls of a checked changeset
will not change anything on it.
"""
changeset = HarmfulChangesetFactory(uid=333)
self.client.login(username=self.user.username, password='password')
response = self.client.put(
reverse('changeset:set-good', args=[changeset.pk]),
)
self.assertEqual(response.status_code, 403)
changeset.refresh_from_db()
self.assertNotEqual(changeset.check_user, self.user)
data = {'tags': [self.tag_1.id, self.tag_2.id]}
response = self.client.put(
reverse('changeset:set-harmful', args=[changeset.pk]),
data,
)
self.assertEqual(response.status_code, 403)
changeset.refresh_from_db()
self.assertNotEqual(changeset.check_user, self.user)
class TestUncheckChangesetView(APITestCase):
def setUp(self):
self.user = User.objects.create_user(
username='test_2',
password='password',
email='a@a.com'
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='123123',
extra_data={
'id': '123123',
'access_token': {
'oauth_token': 'aaaa',
'oauth_token_secret': 'bbbb'
}
}
)
self.suspect_changeset = SuspectChangesetFactory()
self.good_changeset = GoodChangesetFactory(check_user=self.user)
self.harmful_changeset = HarmfulChangesetFactory(check_user=self.user)
self.harmful_changeset_2 = HarmfulChangesetFactory()
self.tag = TagFactory(name='Vandalism')
self.tag.changesets.set([
self.good_changeset,
self.harmful_changeset,
self.harmful_changeset_2
])
def test_unauthenticated_response(self):
response = self.client.put(
reverse('changeset:uncheck', args=[self.harmful_changeset.pk]),
)
self.assertEqual(response.status_code, 401)
self.harmful_changeset.refresh_from_db()
self.assertTrue(self.harmful_changeset.harmful)
self.assertTrue(self.harmful_changeset.checked)
self.assertEqual(self.harmful_changeset.check_user, self.user)
self.assertIsNotNone(self.harmful_changeset.check_date)
self.assertEqual(self.harmful_changeset.tags.count(), 1)
self.assertIn(self.tag, self.harmful_changeset.tags.all())
def test_uncheck_harmful_changeset(self):
self.client.login(username=self.user.username, password='password')
response = self.client.put(
reverse('changeset:uncheck', args=[self.harmful_changeset.pk]),
)
self.assertEqual(response.status_code, 200)
self.harmful_changeset.refresh_from_db()
self.assertIsNone(self.harmful_changeset.harmful)
self.assertFalse(self.harmful_changeset.checked)
self.assertIsNone(self.harmful_changeset.check_user)
self.assertIsNone(self.harmful_changeset.check_date)
self.assertEqual(self.harmful_changeset.tags.count(), 1)
def test_uncheck_good_changeset(self):
self.client.login(username=self.user.username, password='password')
response = self.client.put(
reverse('changeset:uncheck', args=[self.good_changeset.pk]),
)
self.assertEqual(response.status_code, 200)
self.good_changeset.refresh_from_db()
self.assertIsNone(self.good_changeset.harmful)
self.assertFalse(self.good_changeset.checked)
self.assertIsNone(self.good_changeset.check_user)
self.assertIsNone(self.good_changeset.check_date)
self.assertEqual(self.good_changeset.tags.count(), 1)
def test_common_user_uncheck_permission(self):
"""Common user can only uncheck changesets that he checked."""
self.client.login(username=self.user.username, password='password')
response = self.client.put(
reverse('changeset:uncheck', args=[self.harmful_changeset_2.pk]),
)
self.assertEqual(response.status_code, 403)
self.harmful_changeset_2.refresh_from_db()
self.assertTrue(self.harmful_changeset_2.harmful)
self.assertTrue(self.harmful_changeset_2.checked)
self.assertIsNotNone(self.harmful_changeset_2.check_user)
self.assertIsNotNone(self.harmful_changeset_2.check_date)
def test_try_to_uncheck_unchecked_changeset(self):
"""It's not possible to uncheck an unchecked changeset!"""
self.client.login(username=self.user.username, password='password')
response = self.client.put(
reverse('changeset:uncheck', args=[self.suspect_changeset.pk]),
)
self.assertEqual(response.status_code, 403)
def test_staff_user_uncheck_any_changeset(self):
"""A staff user can uncheck changesets checked by any user."""
staff_user = User.objects.create_user(
username='staff_test',
password='password',
email='s@a.com',
is_staff=True
)
UserSocialAuth.objects.create(
user=staff_user,
provider='openstreetmap',
uid='87873',
)
self.client.login(username=staff_user.username, password='password')
response = self.client.put(
reverse('changeset:uncheck', args=[self.good_changeset.pk]),
)
self.assertEqual(response.status_code, 200)
response = self.client.put(
reverse('changeset:uncheck', args=[self.harmful_changeset.pk]),
)
self.assertEqual(response.status_code, 200)
response = self.client.put(
reverse('changeset:uncheck', args=[self.harmful_changeset_2.pk]),
)
self.assertEqual(response.status_code, 200)
self.assertEqual(Changeset.objects.filter(checked=True).count(), 0)
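The permission rules the uncheck tests above exercise can be summarized as a small standalone predicate. This is a sketch of the observed behavior only, not the project's actual view code; the function name and signature are hypothetical:

```python
def can_uncheck(checked, check_user, request_user, is_staff=False):
    """Summarize the uncheck permissions asserted by the tests above."""
    # An unchecked changeset can never be unchecked (403 in the tests above).
    if not checked:
        return False
    # Staff may uncheck anything; other users only what they checked themselves.
    return is_staff or check_user == request_user

# Mirrors the assertions above: the user who checked may revert, another
# common user may not, a staff user always may, unchecked is never allowed.
assert can_uncheck(True, "test_2", "test_2")
assert not can_uncheck(True, "someone_else", "test_2")
assert can_uncheck(True, "someone_else", "staff_test", is_staff=True)
assert not can_uncheck(False, None, "test_2")
```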
class TestAddTagToChangeset(APITestCase):
def setUp(self):
self.user = User.objects.create_user(
username='user',
email='c@a.com',
password='password',
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='999',
)
self.changeset_user = User.objects.create_user(
username='test',
email='b@a.com',
password='password',
)
UserSocialAuth.objects.create(
user=self.changeset_user,
provider='openstreetmap',
uid='123123',
)
self.changeset = ChangesetFactory()
self.checked_changeset = HarmfulChangesetFactory(check_user=self.user)
self.tag = TagFactory(name='Not verified')
def test_unauthenticated_can_not_add_tag(self):
response = self.client.post(
reverse('changeset:tags', args=[self.changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 401)
self.assertEqual(self.changeset.tags.count(), 0)
def test_can_not_add_invalid_tag_id(self):
"""When the tag id does not exist, it will return a 404 response."""
self.client.login(username=self.user.username, password='password')
response = self.client.post(
reverse('changeset:tags', args=[self.changeset.id, 44534])
)
self.assertEqual(response.status_code, 404)
self.assertEqual(self.changeset.tags.count(), 0)
def test_add_tag(self):
"""A user that is not the creator of the changeset can add tags to an
unchecked changeset.
"""
self.client.login(username=self.user.username, password='password')
response = self.client.post(
reverse('changeset:tags', args=[self.changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 200)
self.assertEqual(self.changeset.tags.count(), 1)
self.assertIn(self.tag, self.changeset.tags.all())
# test add the same tag again
response = self.client.post(
reverse('changeset:tags', args=[self.changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 200)
self.assertEqual(self.changeset.tags.count(), 1)
def test_add_tag_by_changeset_owner(self):
"""The user that created the changeset can not add tags to it."""
self.client.login(username=self.changeset_user.username, password='password')
response = self.client.post(
reverse('changeset:tags', args=[self.changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 403)
self.assertEqual(self.changeset.tags.count(), 0)
def test_add_tag_to_checked_changeset(self):
"""The user that checked the changeset can add tags to it."""
self.client.login(username=self.user.username, password='password')
response = self.client.post(
reverse('changeset:tags', args=[self.checked_changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 200)
self.assertEqual(self.checked_changeset.tags.count(), 1)
self.assertIn(self.tag, self.checked_changeset.tags.all())
def test_other_user_can_not_add_tag_to_checked_changeset(self):
"""A non staff user can not add tags to a changeset that other user have
checked.
"""
other_user = User.objects.create_user(
username='other_user',
email='b@a.com',
password='password',
)
UserSocialAuth.objects.create(
user=other_user,
provider='openstreetmap',
uid='28763',
)
self.client.login(username=other_user.username, password='password')
response = self.client.post(
reverse('changeset:tags', args=[self.checked_changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 403)
self.assertEqual(self.checked_changeset.tags.count(), 0)
def test_staff_user_add_tag_to_checked_changeset(self):
"""A staff user can add tags to a changeset."""
staff_user = User.objects.create_user(
username='admin',
email='b@a.com',
password='password',
is_staff=True
)
UserSocialAuth.objects.create(
user=staff_user,
provider='openstreetmap',
uid='28763',
)
self.client.login(username=staff_user.username, password='password')
response = self.client.post(
reverse('changeset:tags', args=[self.checked_changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 200)
self.assertEqual(self.checked_changeset.tags.count(), 1)
self.assertIn(self.tag, self.checked_changeset.tags.all())
class TestRemoveTagToChangeset(APITestCase):
def setUp(self):
self.user = User.objects.create_user(
username='user',
email='c@a.com',
password='password',
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='999',
)
self.changeset_user = User.objects.create_user(
username='test',
email='b@a.com',
password='password',
)
UserSocialAuth.objects.create(
user=self.changeset_user,
provider='openstreetmap',
uid='123123',
)
self.changeset = ChangesetFactory()
self.checked_changeset = HarmfulChangesetFactory(check_user=self.user)
self.tag = TagFactory(name='Not verified')
self.changeset.tags.add(self.tag)
self.checked_changeset.tags.add(self.tag)
def test_unauthenticated_can_not_remove_tag(self):
response = self.client.delete(
reverse('changeset:tags', args=[self.changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 401)
self.assertEqual(self.changeset.tags.count(), 1)
def test_can_not_remove_invalid_tag_id(self):
"""When the tag id does not exist it will return a 404 response."""
self.client.login(username=self.user.username, password='password')
response = self.client.delete(
reverse('changeset:tags', args=[self.changeset.id, 44534])
)
self.assertEqual(response.status_code, 404)
def test_remove_tag(self):
"""A user that is not the creator of the changeset can remote tags to an
unchecked changeset.
"""
self.client.login(username=self.user.username, password='password')
response = self.client.delete(
reverse('changeset:tags', args=[self.changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 200)
self.assertEqual(self.changeset.tags.count(), 0)
def test_remove_tag_by_changeset_owner(self):
"""The user that created the changeset can not remove its tags."""
self.client.login(username=self.changeset_user.username, password='password')
response = self.client.delete(
reverse('changeset:tags', args=[self.changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 403)
self.assertEqual(self.changeset.tags.count(), 1)
def test_remove_tag_of_checked_changeset(self):
"""The user that checked the changeset can remove its tags."""
self.client.login(username=self.user.username, password='password')
response = self.client.delete(
reverse('changeset:tags', args=[self.checked_changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 200)
self.assertEqual(self.checked_changeset.tags.count(), 0)
def test_other_user_can_not_remove_tag_to_checked_changeset(self):
"""A non staff user can not remove tags of a changeset that other user
have checked.
"""
other_user = User.objects.create_user(
username='other_user',
email='b@a.com',
password='password',
)
UserSocialAuth.objects.create(
user=other_user,
provider='openstreetmap',
uid='28763',
)
self.client.login(username=other_user.username, password='password')
response = self.client.delete(
reverse('changeset:tags', args=[self.checked_changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 403)
self.assertEqual(self.checked_changeset.tags.count(), 1)
def test_staff_user_remove_tag_to_checked_changeset(self):
"""A staff user can remove tags to a changeset."""
staff_user = User.objects.create_user(
username='admin',
email='b@a.com',
password='password',
is_staff=True
)
UserSocialAuth.objects.create(
user=staff_user,
provider='openstreetmap',
uid='28763',
)
self.client.login(username=staff_user.username, password='password')
response = self.client.delete(
reverse('changeset:tags', args=[self.checked_changeset.id, self.tag.id])
)
self.assertEqual(response.status_code, 200)
self.assertEqual(self.checked_changeset.tags.count(), 0)
class TestThrottling(APITestCase):
def setUp(self):
self.changesets = SuspectChangesetFactory.create_batch(
5, user='test2', uid='999999', editor='iD',
)
self.user = User.objects.create_user(
username='test',
password='password',
email='a@a.com'
)
UserSocialAuth.objects.create(
user=self.user,
provider='openstreetmap',
uid='123123',
)
def test_set_harmful_throttling(self):
"""User can only check 3 changesets each minute."""
self.client.login(username=self.user.username, password='password')
for changeset in self.changesets:
response = self.client.put(
reverse('changeset:set-harmful', args=[changeset.pk]),
)
self.assertEqual(response.status_code, 429)
self.assertEqual(Changeset.objects.filter(checked=True).count(), 3)
def test_set_good_throttling(self):
self.client.login(username=self.user.username, password='password')
for changeset in self.changesets:
response = self.client.put(
reverse('changeset:set-good', args=[changeset.pk]),
)
self.assertEqual(response.status_code, 429)
self.assertEqual(Changeset.objects.filter(checked=True).count(), 3)
def test_mixed_throttling(self):
"""Test if both set_harmful and set_good views are throttled together."""
self.client.login(username=self.user.username, password='password')
three_changesets = self.changesets[:3]
for changeset in three_changesets:
response = self.client.put(
reverse('changeset:set-good', args=[changeset.pk]),
)
self.assertEqual(response.status_code, 200)
response = self.client.put(
reverse('changeset:set-harmful', args=[self.changesets[3].pk]),
)
self.assertEqual(response.status_code, 429)
self.assertEqual(Changeset.objects.filter(checked=True).count(), 3)
def test_set_good_by_staff_user(self):
"""Staff users have not limit of checked changesets by minute."""
user = User.objects.create_user(
username='test_staff',
password='password',
email='a@a.com',
is_staff=True
)
UserSocialAuth.objects.create(
user=user,
provider='openstreetmap',
uid='8987',
)
self.client.login(username=user.username, password='password')
for changeset in self.changesets:
response = self.client.put(
reverse('changeset:set-good', args=[changeset.pk]),
)
self.assertEqual(response.status_code, 200)
self.assertEqual(Changeset.objects.filter(checked=True).count(), 5)
def test_set_harmful_by_staff_user(self):
"""Staff users have not limit of checked changesets by minute."""
user = User.objects.create_user(
username='test_staff',
password='password',
email='a@a.com',
is_staff=True
)
UserSocialAuth.objects.create(
user=user,
provider='openstreetmap',
uid='8987',
)
self.client.login(username=user.username, password='password')
for changeset in self.changesets:
response = self.client.put(
reverse('changeset:set-harmful', args=[changeset.pk]),
)
self.assertEqual(response.status_code, 200)
self.assertEqual(Changeset.objects.filter(checked=True).count(), 5)
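The throttling behavior asserted above (three checks per minute for common users, no limit for staff) can be sketched as a small sliding-window counter. This is an illustrative stand-in, not the DRF throttle class the API actually uses; the class and method names are hypothetical:

```python
import time
from collections import defaultdict, deque

class SimpleThrottle:
    """Toy sliding-window throttle matching the behavior the tests above
    assert: non-staff users may check 3 changesets per minute, staff users
    are unlimited."""
    def __init__(self, limit=3, window=60.0):
        self.limit = limit
        self.window = window
        self.history = defaultdict(deque)

    def allow(self, username, is_staff=False, now=None):
        if is_staff:
            return True  # staff requests are never throttled
        now = time.monotonic() if now is None else now
        q = self.history[username]
        # Drop timestamps that fell out of the window.
        while q and now - q[0] >= self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False  # over the per-minute limit -> 429 in the API
        q.append(now)
        return True

t = SimpleThrottle()
assert all(t.allow('test', now=float(i)) for i in range(3))
assert not t.allow('test', now=3.0)              # 4th request within a minute
assert t.allow('test_staff', is_staff=True, now=3.0)
assert t.allow('test', now=70.0)                 # window has rolled over
```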
# tests/compiler/compose/compose_all/test_compose_all_agg.py (CCD-HRI/congregation)
from congregation.dag import Dag
from congregation.dag.nodes.internal import *
from congregation.comp import compile_dag
from tests.utils import create_cols, compare_to_expected
import pytest
"""
Tests for correct propagation of the following relation-level
and column-level attributes after the Pushdown, PushUp, InsertCloseOps,
InsertOpenOps, InsertReadOps, and InsertStoreOps phases of the compiler
have been run:
- DAG node order
- node.requires_mpc() attribute
- relation-level stored_with sets
- column-level plaintext sets
- column-level trust_with sets
"""
@pytest.mark.parametrize("party_data, expected", [
(
[
{
"col_names": ["a", "b"],
"stored_with": {1, 2},
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
}
],
{
"node_order": [Create, AggregateMean, Open, Read, Divide, Collect],
"requires_mpc": [True, True, True, False, False],
"ownership_data":[
{
"stored_with": [{1, 2}],
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
},
{
"stored_with": [{1, 2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
}
]
}
)
])
def test_agg_mean(party_data, expected):
cols_in_one = create_cols(party_data[0])
rel_one = create("in1", cols_in_one, party_data[0]["stored_with"])
agg = aggregate(rel_one, "agg", party_data[0]["col_names"][:1], party_data[0]["col_names"][1], "mean")
div = divide(agg, "div", party_data[0]["col_names"][1], [10])
collect(div, {1, 2})
d = Dag({rel_one})
compile_dag(d)
compare_to_expected(d, expected)
@pytest.mark.parametrize("party_data, expected", [
(
[
{
"col_names": ["a", "b"],
"stored_with": {1},
"plaintext_sets": [{1}, {1}],
"trust_with_sets": [{1}, {1}]
},
{
"col_names": ["c", "d"],
"stored_with": {2},
"plaintext_sets": [{2}, {2}],
"trust_with_sets": [{2}, {2}]
}
],
{
"node_order": [
Create,
Create,
AggregateSumSquaresAndCount,
Store,
Close,
AggregateSumSquaresAndCount,
Store,
Close,
Concat,
AggregateStdDev,
Open,
Read,
AggregateStdDevLocalSqrt,
Multiply,
Collect
],
"requires_mpc": [
False, False, False,
False, True, False,
False, True, True,
True, True, False,
False, False
],
"ownership_data":[
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}],
"trust_with_sets": [{1}, {1}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}],
"trust_with_sets": [{2}, {2}]
},
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1}, {1}, {1}, {1}]
},
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1}, {1}, {1}, {1}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1}, {1}, {1}, {1}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [set(), set(), set(), set()],
"trust_with_sets": [set(), set(), set(), set()]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
}
]
}
),
(
[
{
"col_names": ["a", "b"],
"stored_with": {1},
"plaintext_sets": [{1}, {1}],
"trust_with_sets": [{1, 2}, {1}]
},
{
"col_names": ["c", "d"],
"stored_with": {2},
"plaintext_sets": [{2}, {2}],
"trust_with_sets": [{2}, {2}]
}
],
{
"node_order": [
Create,
Create,
AggregateSumSquaresAndCount,
Store,
Close,
AggregateSumSquaresAndCount,
Store,
Close,
Concat,
AggregateStdDev,
Open,
Read,
AggregateStdDevLocalSqrt,
Multiply,
Collect
],
"requires_mpc": [
False, False, False,
False, True, False,
False, True, True,
True, True, False,
False, False
],
"ownership_data":[
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}],
"trust_with_sets": [{1, 2}, {1}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}],
"trust_with_sets": [{2}, {2}]
},
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1, 2}, {1}, {1}, {1, 2}]
},
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1, 2}, {1}, {1}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1, 2}, {1}, {1}, {1, 2}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [set(), set(), set(), set()],
"trust_with_sets": [{2}, set(), set(), {2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
}
]
}
),
(
[
{
"col_names": ["a", "b"],
"stored_with": {1, 2},
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
},
{
"col_names": ["c", "d"],
"stored_with": {1, 2},
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
}
],
{
"node_order": [
Create,
Create,
Concat,
AggregateStdDev,
Open,
Read,
AggregateStdDevLocalSqrt,
Multiply,
Collect
],
"requires_mpc": [True, True, True, True, True, False, False, False, False],
"ownership_data":[
{
"stored_with": [{1, 2}],
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
},
{
"stored_with": [{1, 2}],
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
},
{
"stored_with": [{1, 2}],
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
}
]
}
)
])
def test_agg_std_dev(party_data, expected):
cols_in_one = create_cols(party_data[0])
cols_in_two = create_cols(party_data[1])
rel_one = create("in1", cols_in_one, party_data[0]["stored_with"])
rel_two = create("in2", cols_in_two, party_data[1]["stored_with"])
cc = concat([rel_one, rel_two], "concat", party_data[0]["col_names"])
std_dev = aggregate(cc, "std_dev", [party_data[0]["col_names"][0]], party_data[0]["col_names"][1], "std_dev")
mult = multiply(std_dev, "mult", party_data[0]["col_names"][0], [party_data[0]["col_names"][1], 7])
collect(mult, {1, 2})
d = Dag({rel_one, rel_two})
compile_dag(d)
compare_to_expected(d, expected)
@pytest.mark.parametrize("party_data, expected", [
(
[
{
"col_names": ["a", "b"],
"stored_with": {1},
"plaintext_sets": [{1}, {1}],
"trust_with_sets": [{1}, {1}]
},
{
"col_names": ["c", "d"],
"stored_with": {2},
"plaintext_sets": [{2}, {2}],
"trust_with_sets": [{2}, {2}]
}
],
{
"node_order": [
Create,
Create,
AggregateSumSquaresAndCount,
Store,
Close,
AggregateSumSquaresAndCount,
Store,
Close,
Concat,
AggregateVariance,
Open,
Read,
AggregateVarianceLocalDiff,
Multiply,
Collect
],
"requires_mpc": [
False, False, False,
False, True, False,
False, True, True,
True, True, False,
False, False
],
"ownership_data":[
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}],
"trust_with_sets": [{1}, {1}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}],
"trust_with_sets": [{2}, {2}]
},
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1}, {1}, {1}, {1}]
},
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1}, {1}, {1}, {1}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1}, {1}, {1}, {1}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [set(), set(), set(), set()],
"trust_with_sets": [set(), set(), set(), set()]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
}
]
}
),
(
[
{
"col_names": ["a", "b"],
"stored_with": {1},
"plaintext_sets": [{1}, {1}],
"trust_with_sets": [{1, 2}, {1}]
},
{
"col_names": ["c", "d"],
"stored_with": {2},
"plaintext_sets": [{2}, {2}],
"trust_with_sets": [{2}, {2}]
}
],
{
"node_order": [
Create,
Create,
AggregateSumSquaresAndCount,
Store,
Close,
AggregateSumSquaresAndCount,
Store,
Close,
Concat,
AggregateVariance,
Open,
Read,
AggregateVarianceLocalDiff,
Multiply,
Collect
],
"requires_mpc": [
False, False, False,
False, True, False,
False, True, True,
True, True, False,
False, False
],
"ownership_data":[
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}],
"trust_with_sets": [{1, 2}, {1}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}],
"trust_with_sets": [{2}, {2}]
},
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1, 2}, {1}, {1}, {1, 2}]
},
{
"stored_with": [{1}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1, 2}, {1}, {1}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1}, {1}, {1}, {1}],
"trust_with_sets": [{1, 2}, {1}, {1}, {1, 2}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{2}, {2}, {2}, {2}],
"trust_with_sets": [{2}, {2}, {2}, {2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [set(), set(), set(), set()],
"trust_with_sets": [{2}, set(), set(), {2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
}
]
}
),
(
[
{
"col_names": ["a", "b"],
"stored_with": {1, 2},
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
},
{
"col_names": ["c", "d"],
"stored_with": {1, 2},
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
}
],
{
"node_order": [
Create,
Create,
Concat,
AggregateVariance,
Open,
Read,
AggregateVarianceLocalDiff,
Multiply,
Collect
],
"requires_mpc": [True, True, True, True, True, False, False, False, False],
"ownership_data":[
{
"stored_with": [{1, 2}],
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
},
{
"stored_with": [{1, 2}],
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
},
{
"stored_with": [{1, 2}],
"plaintext_sets": [set(), set()],
"trust_with_sets": [set(), set()]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
},
{
"stored_with": [{1}, {2}],
"plaintext_sets": [{1, 2}, {1, 2}],
"trust_with_sets": [{1, 2}, {1, 2}]
}
]
}
)
])
def test_agg_variance(party_data, expected):
cols_in_one = create_cols(party_data[0])
cols_in_two = create_cols(party_data[1])
rel_one = create("in1", cols_in_one, party_data[0]["stored_with"])
rel_two = create("in2", cols_in_two, party_data[1]["stored_with"])
cc = concat([rel_one, rel_two], "concat", party_data[0]["col_names"])
variance = aggregate(cc, "variance", [party_data[0]["col_names"][0]], party_data[0]["col_names"][1], "variance")
mult = multiply(variance, "mult", party_data[0]["col_names"][0], [party_data[0]["col_names"][1], 7])
collect(mult, {1, 2})
d = Dag({rel_one, rel_two})
compile_dag(d)
compare_to_expected(d, expected)
| 36.599469 | 116 | 0.302544 | 2,249 | 27,596 | 3.473988 | 0.044464 | 0.072699 | 0.049149 | 0.060412 | 0.912582 | 0.912582 | 0.89991 | 0.89991 | 0.89991 | 0.887367 | 0 | 0.067627 | 0.524714 | 27,596 | 753 | 117 | 36.648074 | 0.528057 | 0 | 0 | 0.651872 | 0 | 0 | 0.168444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.004161 | false | 0 | 0.008322 | 0 | 0.012483 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
53c5ec1dd080944e488750a295fc655d541805a6 | 76 | py | Python | py_wake/examples/data/iea34_130rwt/__init__.py | aemoser/PyWake | 889a2c10882195af21339e9bcf2ede0db9b58319 | [
"MIT"
] | 30 | 2019-03-18T14:10:27.000Z | 2022-03-13T17:39:04.000Z | py_wake/examples/data/iea34_130rwt/__init__.py | aemoser/PyWake | 889a2c10882195af21339e9bcf2ede0db9b58319 | [
"MIT"
] | 1 | 2020-11-12T06:13:00.000Z | 2020-11-12T06:43:26.000Z | py_wake/examples/data/iea34_130rwt/__init__.py | aemoser/PyWake | 889a2c10882195af21339e9bcf2ede0db9b58319 | [
"MIT"
] | 20 | 2019-01-11T14:45:13.000Z | 2021-12-13T19:55:29.000Z | from ._iea34_130rwt import IEA34_130_1WT_Surrogate, IEA34_130_2WT_Surrogate
| 38 | 75 | 0.907895 | 12 | 76 | 5.083333 | 0.666667 | 0.262295 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.239437 | 0.065789 | 76 | 1 | 76 | 76 | 0.619718 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
071411cf0734221bee19f6a9a17f6866b6a01108 | 128 | py | Python | Python3/Exercises/SumEvenValues/sum_even_values.py | norbertosanchezdichi/TIL | 2e9719ddd288022f53b094a42679e849bdbcc625 | [
"MIT"
] | null | null | null | Python3/Exercises/SumEvenValues/sum_even_values.py | norbertosanchezdichi/TIL | 2e9719ddd288022f53b094a42679e849bdbcc625 | [
"MIT"
] | null | null | null | Python3/Exercises/SumEvenValues/sum_even_values.py | norbertosanchezdichi/TIL | 2e9719ddd288022f53b094a42679e849bdbcc625 | [
"MIT"
] | null | null | null | def sum_even_values(*args):
return sum(arg for arg in args if arg % 2 == 0)
print(sum_even_values(1, 2, 3, 4, 5, 6)) | 32 | 54 | 0.640625 | 26 | 128 | 3 | 0.653846 | 0.179487 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08 | 0.21875 | 128 | 4 | 55 | 32 | 0.7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0 | 0.333333 | 0.666667 | 0.333333 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
0718e2bb8811bfefe7d0c060a8a72710349b5f04 | 21,408 | py | Python | smt/examples/run_examples.py | repriem/smt | 76680130682837b1e8e4a8ce19090bb898abafba | [
"BSD-3-Clause"
] | null | null | null | smt/examples/run_examples.py | repriem/smt | 76680130682837b1e8e4a8ce19090bb898abafba | [
"BSD-3-Clause"
] | 1 | 2020-04-14T16:37:33.000Z | 2020-04-14T16:37:33.000Z | smt/examples/run_examples.py | repriem/smt | 76680130682837b1e8e4a8ce19090bb898abafba | [
"BSD-3-Clause"
] | null | null | null | """
Author: Dr. Mohamed A. Bouhlel <mbouhlel@umich>
Dr. John T. Hwang <hwangjt@umich.edu>
This package is distributed under New BSD license.
"""
import numpy as np
from scipy import linalg
from smt.utils import compute_rms_error
from smt.problems import Sphere, NdimRobotArm
from smt.sampling_methods import LHS
from smt.surrogate_models import LS, QP, KPLS, KRG, KPLSK, GEKPLS
try:
from smt.surrogate_models import IDW, RBF, RMTC, RMTB
compiled_available = True
except:
compiled_available = False
try:
import matplotlib.pyplot as plt
plot_status = True
except:
plot_status = False
########### Initialization of the problem, construction of the training and validation points
ndim = 10
ndoe = int(10 * ndim)
# Define the function
fun = Sphere(ndim=ndim)
# Construction of the DOE
sampling = LHS(xlimits=fun.xlimits, criterion="m")
xt = sampling(ndoe)
# Compute the output
yt = fun(xt)
# Compute the gradient
for i in range(ndim):
yd = fun(xt, kx=i)
yt = np.concatenate((yt, yd), axis=1)
# Construction of the validation points
ntest = 500
sampling = LHS(xlimits=fun.xlimits)
xtest = sampling(ntest)
ytest = fun(xtest)
ydtest = np.zeros((ntest, ndim))
for i in range(ndim):
ydtest[:, i] = fun(xtest, kx=i).T
########### The LS model
# Initialization of the model
t = LS(print_prediction=False)
# Add the DOE
t.set_training_values(xt, yt[:, 0])
# Train the model
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("LS, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
k, l = 0, 0
f, axarr = plt.subplots(4, 3)
axarr[k, l].plot(ytest, ytest, "-.")
axarr[k, l].plot(ytest, y, ".")
l += 1
axarr[3, 2].arrow(0.3, 0.3, 0.2, 0)
axarr[3, 2].arrow(0.3, 0.3, 0.0, 0.4)
axarr[3, 2].text(0.25, 0.4, r"$\hat{y}$")
axarr[3, 2].text(0.35, 0.15, r"$y_{true}$")
axarr[3, 2].axis("off")
# Fine-tune figure; hide x ticks for top plots and y ticks for right plots
plt.setp(axarr[3, 2].get_xticklabels(), visible=False)
plt.setp(axarr[3, 2].get_yticklabels(), visible=False)
plt.suptitle(
"Validation of the LS model (from left to right then from top to bottom): validation of the prediction model and the i-th prediction of the derivative---i=1:10"
)
# Prediction of the derivatives with regards to each direction space
yd_prediction = np.zeros((ntest, ndim))
for i in range(ndim):
yd_prediction[:, i] = t.predict_derivatives(xtest, kx=i).T
print(
"LS, err of the "
+ str(i)
+ "-th derivative: "
+ str(compute_rms_error(t, xtest, ydtest[:, i], kx=i))
)
if plot_status:
axarr[k, l].plot(ydtest[:, i], ydtest[:, i], "-.")
axarr[k, l].plot(ydtest[:, i], yd_prediction[:, i], ".")
if l == 2:
l = 0
k += 1
else:
l += 1
if plot_status:
plt.show()
########### The QP model
t = QP(print_prediction=False)
t.set_training_values(xt, yt[:, 0])
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("QP, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
k, l = 0, 0
f, axarr = plt.subplots(4, 3)
axarr[k, l].plot(ytest, ytest, "-.")
axarr[k, l].plot(ytest, y, ".")
l += 1
axarr[3, 2].arrow(0.3, 0.3, 0.2, 0)
axarr[3, 2].arrow(0.3, 0.3, 0.0, 0.4)
axarr[3, 2].text(0.25, 0.4, r"$\hat{y}$")
axarr[3, 2].text(0.35, 0.15, r"$y_{true}$")
axarr[3, 2].axis("off")
# Fine-tune figure; hide x ticks for top plots and y ticks for right plots
plt.setp(axarr[3, 2].get_xticklabels(), visible=False)
plt.setp(axarr[3, 2].get_yticklabels(), visible=False)
plt.suptitle(
"Validation of the QP model (from left to right then from top to bottom): validation of the prediction model and the i-th prediction of the derivative---i=1:10"
)
# Prediction of the derivatives with regards to each direction space
yd_prediction = np.zeros((ntest, ndim))
for i in range(ndim):
yd_prediction[:, i] = t.predict_derivatives(xtest, kx=i).T
print(
"QP, err of the "
+ str(i)
+ "-th derivative: "
+ str(compute_rms_error(t, xtest, ydtest[:, i], kx=i))
)
if plot_status:
axarr[k, l].plot(ydtest[:, i], ydtest[:, i], "-.")
axarr[k, l].plot(ydtest[:, i], yd_prediction[:, i], ".")
if l == 2:
l = 0
k += 1
else:
l += 1
if plot_status:
plt.show()
########### The Kriging model
# The variable 'theta0' is a list of length ndim.
t = KRG(theta0=[1e-2] * ndim, print_prediction=False)
t.set_training_values(xt, yt[:, 0])
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("Kriging, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
k, l = 0, 0
f, axarr = plt.subplots(4, 3)
axarr[k, l].plot(ytest, ytest, "-.")
axarr[k, l].plot(ytest, y, ".")
l += 1
axarr[3, 2].arrow(0.3, 0.3, 0.2, 0)
axarr[3, 2].arrow(0.3, 0.3, 0.0, 0.4)
axarr[3, 2].text(0.25, 0.4, r"$\hat{y}$")
axarr[3, 2].text(0.35, 0.15, r"$y_{true}$")
axarr[3, 2].axis("off")
# Fine-tune figure; hide x ticks for top plots and y ticks for right plots
plt.setp(axarr[3, 2].get_xticklabels(), visible=False)
plt.setp(axarr[3, 2].get_yticklabels(), visible=False)
plt.suptitle(
"Validation of the Kriging model (from left to right then from top to bottom): validation of the prediction model and the i-th prediction of the derivative---i=1:10"
)
# Prediction of the derivatives with regards to each direction space
yd_prediction = np.zeros((ntest, ndim))
for i in range(ndim):
yd_prediction[:, i] = t.predict_derivatives(xtest, kx=i).T
print(
"Kriging, err of the "
+ str(i)
+ "-th derivative: "
+ str(compute_rms_error(t, xtest, ydtest[:, i], kx=i))
)
if plot_status:
axarr[k, l].plot(ydtest[:, i], ydtest[:, i], "-.")
axarr[k, l].plot(ydtest[:, i], yd_prediction[:, i], ".")
if l == 2:
l = 0
k += 1
else:
l += 1
if plot_status:
plt.show()
########### The KPLS model
# The variable 'name' must be equal to 'KPLS'. 'n_comp' and 'theta0' must be
# an integer in [1,ndim[ and a list of length n_comp, respectively. Here is
# an example using 2 principal components.
t = KPLS(n_comp=2, theta0=[1e-2, 1e-2], print_prediction=False)
t.set_training_values(xt, yt[:, 0])
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("KPLS, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
k, l = 0, 0
f, axarr = plt.subplots(4, 3)
axarr[k, l].plot(ytest, ytest, "-.")
axarr[k, l].plot(ytest, y, ".")
l += 1
axarr[3, 2].arrow(0.3, 0.3, 0.2, 0)
axarr[3, 2].arrow(0.3, 0.3, 0.0, 0.4)
axarr[3, 2].text(0.25, 0.4, r"$\hat{y}$")
axarr[3, 2].text(0.35, 0.15, r"$y_{true}$")
axarr[3, 2].axis("off")
# Fine-tune figure; hide x ticks for top plots and y ticks for right plots
plt.setp(axarr[3, 2].get_xticklabels(), visible=False)
plt.setp(axarr[3, 2].get_yticklabels(), visible=False)
plt.suptitle(
"Validation of the KPLS model (from left to right then from top to bottom): validation of the prediction model and the i-th prediction of the derivative---i=1:10"
)
# Prediction of the derivatives with regards to each direction space
yd_prediction = np.zeros((ntest, ndim))
for i in range(ndim):
yd_prediction[:, i] = t.predict_derivatives(xtest, kx=i).T
print(
"KPLS, err of the "
+ str(i)
+ "-th derivative: "
+ str(compute_rms_error(t, xtest, ydtest[:, i], kx=i))
)
if plot_status:
axarr[k, l].plot(ydtest[:, i], ydtest[:, i], "-.")
axarr[k, l].plot(ydtest[:, i], yd_prediction[:, i], ".")
if l == 2:
l = 0
k += 1
else:
l += 1
if plot_status:
plt.show()
# KPLS + absolute exponential correlation kernel
# The variable 'name' must be equal to 'KPLS'. 'n_comp' and 'theta0' must be
# an integer in [1,ndim[ and a list of length n_comp, respectively. Here is
# an example using 2 principal components.
t = KPLS(n_comp=2, theta0=[1e-2, 1e-2], print_prediction=False, corr="abs_exp")
t.set_training_values(xt, yt[:, 0])
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("KPLS + abs exp, err: " + str(compute_rms_error(t, xtest, ytest)))
########### The KPLSK model
# 'n_comp' and 'theta0' must be an integer in [1,ndim[ and a list of length n_comp, respectively.
t = KPLSK(n_comp=2, theta0=[1e-2, 1e-2], print_prediction=False)
t.set_training_values(xt, yt[:, 0])
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("KPLSK, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
k, l = 0, 0
f, axarr = plt.subplots(4, 3)
axarr[k, l].plot(ytest, ytest, "-.")
axarr[k, l].plot(ytest, y, ".")
l += 1
axarr[3, 2].arrow(0.3, 0.3, 0.2, 0)
axarr[3, 2].arrow(0.3, 0.3, 0.0, 0.4)
axarr[3, 2].text(0.25, 0.4, r"$\hat{y}$")
axarr[3, 2].text(0.35, 0.15, r"$y_{true}$")
axarr[3, 2].axis("off")
# Fine-tune figure; hide x ticks for top plots and y ticks for right plots
plt.setp(axarr[3, 2].get_xticklabels(), visible=False)
plt.setp(axarr[3, 2].get_yticklabels(), visible=False)
plt.suptitle(
"Validation of the KPLSK model (from left to right then from top to bottom): validation of the prediction model and the i-th prediction of the derivative---i=1:10"
)
# Prediction of the derivatives with regards to each direction space
yd_prediction = np.zeros((ntest, ndim))
for i in range(ndim):
yd_prediction[:, i] = t.predict_derivatives(xtest, kx=i).T
print(
"KPLSK, err of the "
+ str(i)
+ "-th derivative: "
+ str(compute_rms_error(t, xtest, ydtest[:, i], kx=i))
)
if plot_status:
axarr[k, l].plot(ydtest[:, i], ydtest[:, i], "-.")
axarr[k, l].plot(ydtest[:, i], yd_prediction[:, i], ".")
if l == 2:
l = 0
k += 1
else:
l += 1
if plot_status:
plt.show()
########### The GEKPLS model using 1 approximating points
# 'n_comp' and 'theta0' must be an integer in [1,ndim[ and a list of length n_comp, respectively.
t = GEKPLS(
n_comp=1,
theta0=[1e-2],
xlimits=fun.xlimits,
delta_x=1e-2,
extra_points=1,
print_prediction=False,
)
t.set_training_values(xt, yt[:, 0])
# Add the gradient information
for i in range(ndim):
t.set_training_derivatives(xt, yt[:, 1 + i].reshape((yt.shape[0], 1)), i)
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("GEKPLS1, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
k, l = 0, 0
f, axarr = plt.subplots(4, 3)
axarr[k, l].plot(ytest, ytest, "-.")
axarr[k, l].plot(ytest, y, ".")
l += 1
axarr[3, 2].arrow(0.3, 0.3, 0.2, 0)
axarr[3, 2].arrow(0.3, 0.3, 0.0, 0.4)
axarr[3, 2].text(0.25, 0.4, r"$\hat{y}$")
axarr[3, 2].text(0.35, 0.15, r"$y_{true}$")
axarr[3, 2].axis("off")
# Fine-tune figure; hide x ticks for top plots and y ticks for right plots
plt.setp(axarr[3, 2].get_xticklabels(), visible=False)
plt.setp(axarr[3, 2].get_yticklabels(), visible=False)
plt.suptitle(
"Validation of the GEKPLS1 model (from left to right then from top to bottom): validation of the prediction model and the i-th prediction of the derivative---i=1:10"
)
# Prediction of the derivatives with regards to each direction space
yd_prediction = np.zeros((ntest, ndim))
for i in range(ndim):
yd_prediction[:, i] = t.predict_derivatives(xtest, kx=i).T
print(
"GEKPLS1, err of the "
+ str(i)
+ "-th derivative: "
+ str(compute_rms_error(t, xtest, ydtest[:, i], kx=i))
)
if plot_status:
axarr[k, l].plot(ydtest[:, i], ydtest[:, i], "-.")
axarr[k, l].plot(ydtest[:, i], yd_prediction[:, i], ".")
if l == 2:
l = 0
k += 1
else:
l += 1
if plot_status:
plt.show()
########### The GEKPLS model using 2 approximating points
# 'n_comp' and 'theta0' must be an integer in [1,ndim[ and a list of length n_comp, respectively.
t = GEKPLS(
n_comp=1,
theta0=[1e-2],
xlimits=fun.xlimits,
delta_x=1e-4,
extra_points=2,
print_prediction=False,
)
t.set_training_values(xt, yt[:, 0])
# Add the gradient information
for i in range(ndim):
t.set_training_derivatives(xt, yt[:, 1 + i].reshape((yt.shape[0], 1)), i)
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("GEKPLS2, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
k, l = 0, 0
f, axarr = plt.subplots(4, 3)
axarr[k, l].plot(ytest, ytest, "-.")
axarr[k, l].plot(ytest, y, ".")
l += 1
axarr[3, 2].arrow(0.3, 0.3, 0.2, 0)
axarr[3, 2].arrow(0.3, 0.3, 0.0, 0.4)
axarr[3, 2].text(0.25, 0.4, r"$\hat{y}$")
axarr[3, 2].text(0.35, 0.15, r"$y_{true}$")
axarr[3, 2].axis("off")
# Fine-tune figure; hide x ticks for top plots and y ticks for right plots
plt.setp(axarr[3, 2].get_xticklabels(), visible=False)
plt.setp(axarr[3, 2].get_yticklabels(), visible=False)
plt.suptitle(
"Validation of the GEKPLS2 model (from left to right then from top to bottom): validation of the prediction model and the i-th prediction of the derivative---i=1:10"
)
# Prediction of the derivatives with regards to each direction space
yd_prediction = np.zeros((ntest, ndim))
for i in range(ndim):
yd_prediction[:, i] = t.predict_derivatives(xtest, kx=i).T
print(
"GEKPLS2, err of the "
+ str(i)
+ "-th derivative: "
+ str(compute_rms_error(t, xtest, ydtest[:, i], kx=i))
)
if plot_status:
axarr[k, l].plot(ydtest[:, i], ydtest[:, i], "-.")
axarr[k, l].plot(ydtest[:, i], yd_prediction[:, i], ".")
if l == 2:
l = 0
k += 1
else:
l += 1
if plot_status:
plt.show()
if compiled_available:
########### The IDW model
t = IDW(print_prediction=False)
t.set_training_values(xt, yt[:, 0])
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("IDW, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
plt.figure()
plt.plot(ytest, ytest, "-.")
plt.plot(ytest, y, ".")
plt.xlabel(r"$y_{true}$")
plt.ylabel(r"$\hat{y}$")
plt.title("Validation of the IDW model")
plt.show()
########### The RBF model
t = RBF(print_prediction=False, poly_degree=0)
t.set_training_values(xt, yt[:, 0])
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("RBF, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
k, l = 0, 0
f, axarr = plt.subplots(4, 3)
axarr[k, l].plot(ytest, ytest, "-.")
axarr[k, l].plot(ytest, y, ".")
l += 1
axarr[3, 2].arrow(0.3, 0.3, 0.2, 0)
axarr[3, 2].arrow(0.3, 0.3, 0.0, 0.4)
axarr[3, 2].text(0.25, 0.4, r"$\hat{y}$")
axarr[3, 2].text(0.35, 0.15, r"$y_{true}$")
axarr[3, 2].axis("off")
# Fine-tune figure; hide x ticks for top plots and y ticks for right plots
plt.setp(axarr[3, 2].get_xticklabels(), visible=False)
plt.setp(axarr[3, 2].get_yticklabels(), visible=False)
plt.suptitle(
"Validation of the RBF model (from left to right then from top to bottom): validation of the prediction model and the i-th prediction of the derivative---i=1:10"
)
# Prediction of the derivatives with regards to each direction space
yd_prediction = np.zeros((ntest, ndim))
for i in range(ndim):
yd_prediction[:, i] = t.predict_derivatives(xtest, kx=i).T
print(
"RBF, err of the "
+ str(i)
+ "-th derivative: "
+ str(compute_rms_error(t, xtest, ydtest[:, i], kx=i))
)
if plot_status:
axarr[k, l].plot(ydtest[:, i], ydtest[:, i], "-.")
axarr[k, l].plot(ydtest[:, i], yd_prediction[:, i], ".")
if l == 2:
l = 0
k += 1
else:
l += 1
if plot_status:
plt.show()
########### The RMTB and RMTC models are suitable for low-dimensional problems
# Initialization of the problem
ndim = 3
ndoe = int(250 * ndim)
# Define the function
fun = NdimRobotArm(ndim=ndim)
# Construction of the DOE
sampling = LHS(xlimits=fun.xlimits)
xt = sampling(ndoe)
# Compute the output
yt = fun(xt)
# Compute the gradient
for i in range(ndim):
yd = fun(xt, kx=i)
yt = np.concatenate((yt, yd), axis=1)
# Construction of the validation points
ntest = 500
sampling = LHS(xlimits=fun.xlimits)
xtest = sampling(ntest)
ytest = fun(xtest)
########### The RMTB model
t = RMTB(
xlimits=fun.xlimits,
min_energy=True,
nonlinear_maxiter=20,
print_prediction=False,
)
t.set_training_values(xt, yt[:, 0])
# Add the gradient information
for i in range(ndim):
t.set_training_derivatives(xt, yt[:, 1 + i].reshape((yt.shape[0], 1)), i)
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("RMTB, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
k, l = 0, 0
f, axarr = plt.subplots(3, 2)
axarr[k, l].plot(ytest, ytest, "-.")
axarr[k, l].plot(ytest, y, ".")
l += 1
axarr[2, 0].arrow(0.3, 0.3, 0.2, 0)
axarr[2, 0].arrow(0.3, 0.3, 0.0, 0.4)
axarr[2, 0].text(0.25, 0.4, r"$\hat{y}$")
axarr[2, 0].text(0.35, 0.15, r"$y_{true}$")
axarr[2, 0].axis("off")
axarr[2, 1].set_visible(False)
axarr[2, 1].axis("off")
# Fine-tune figure; hide x ticks for top plots and y ticks for right plots
plt.setp(axarr[2, 0].get_xticklabels(), visible=False)
plt.setp(axarr[2, 0].get_yticklabels(), visible=False)
plt.suptitle(
"Validation of the RMTB model (from left to right then from top to bottom): validation of the prediction model and the i-th prediction of the derivative---i=1:3"
)
# Prediction of the derivatives with regards to each direction space
yd_prediction = np.zeros((ntest, ndim))
for i in range(ndim):
yd_prediction[:, i] = t.predict_derivatives(xtest, kx=i).T
print(
"RMTB, err of the "
+ str(i)
+ "-th derivative: "
+ str(compute_rms_error(t, xtest, ydtest[:, i], kx=i))
)
if plot_status:
axarr[k, l].plot(ydtest[:, i], ydtest[:, i], "-.")
axarr[k, l].plot(ydtest[:, i], yd_prediction[:, i], ".")
if l == 1:
l = 0
k += 1
else:
l += 1
if plot_status:
plt.show()
########### The RMTC model
t = RMTC(
xlimits=fun.xlimits,
min_energy=True,
nonlinear_maxiter=20,
print_prediction=False,
)
t.set_training_values(xt, yt[:, 0])
# Add the gradient information
for i in range(ndim):
t.set_training_derivatives(xt, yt[:, 1 + i].reshape((yt.shape[0], 1)), i)
t.train()
# Prediction of the validation points
y = t.predict_values(xtest)
print("RMTC, err: " + str(compute_rms_error(t, xtest, ytest)))
if plot_status:
k, l = 0, 0
f, axarr = plt.subplots(3, 2)
axarr[k, l].plot(ytest, ytest, "-.")
axarr[k, l].plot(ytest, y, ".")
l += 1
axarr[2, 0].arrow(0.3, 0.3, 0.2, 0)
axarr[2, 0].arrow(0.3, 0.3, 0.0, 0.4)
axarr[2, 0].text(0.25, 0.4, r"$\hat{y}$")
axarr[2, 0].text(0.35, 0.15, r"$y_{true}$")
axarr[2, 0].axis("off")
axarr[2, 1].set_visible(False)
axarr[2, 1].axis("off")
# Fine-tune figure; hide x ticks for top plots and y ticks for right plots
plt.setp(axarr[2, 0].get_xticklabels(), visible=False)
plt.setp(axarr[2, 0].get_yticklabels(), visible=False)
plt.suptitle(
"Validation of the RMTC model (from left to right then from top to bottom): validation of the prediction model and the i-th prediction of the derivative---i=1:3"
)
# Prediction of the derivatives with regards to each direction space
yd_prediction = np.zeros((ntest, ndim))
for i in range(ndim):
yd_prediction[:, i] = t.predict_derivatives(xtest, kx=i).T
print(
"RMTC, err of the "
+ str(i)
+ "-th derivative: "
+ str(compute_rms_error(t, xtest, ydtest[:, i], kx=i))
)
if plot_status:
axarr[k, l].plot(ydtest[:, i], ydtest[:, i], "-.")
axarr[k, l].plot(ydtest[:, i], yd_prediction[:, i], ".")
if l == 1:
l = 0
k += 1
else:
l += 1
if plot_status:
plt.show()
| 31.116279 | 173 | 0.583333 | 3,380 | 21,408 | 3.623965 | 0.061834 | 0.028982 | 0.032003 | 0.035921 | 0.898441 | 0.88995 | 0.88995 | 0.888073 | 0.88546 | 0.882929 | 0 | 0.03932 | 0.255138 | 21,408 | 687 | 174 | 31.161572 | 0.728835 | 0.166667 | 0 | 0.804 | 0 | 0.02 | 0.138352 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.016 | 0 | 0.016 | 0.068 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
072d99c4af704335386270b5fe5d2da1086a4d12 | 73 | py | Python | tests/legacy_unittest/sample/test_score02/print_func/print_func.py | bayeshack2016/icon-service | 36cab484d2e41548d7f2f74526f127ee3a4423fc | [
"Apache-2.0"
] | 52 | 2018-08-24T02:28:43.000Z | 2021-07-06T04:44:22.000Z | tests/legacy_unittest/sample/test_score02/print_func/print_func.py | bayeshack2016/icon-service | 36cab484d2e41548d7f2f74526f127ee3a4423fc | [
"Apache-2.0"
] | 62 | 2018-09-17T06:59:16.000Z | 2021-12-15T06:02:51.000Z | tests/legacy_unittest/sample/test_score02/print_func/print_func.py | bayeshack2016/icon-service | 36cab484d2e41548d7f2f74526f127ee3a4423fc | [
"Apache-2.0"
] | 35 | 2018-09-14T02:42:10.000Z | 2022-02-05T10:34:46.000Z | from ..test_func import sample_func
def func_test():
sample_func()
| 12.166667 | 35 | 0.726027 | 11 | 73 | 4.454545 | 0.545455 | 0.408163 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178082 | 73 | 5 | 36 | 14.6 | 0.816667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
073c07d68a4bafef563bf3eb76bc9fa7450f871d | 20,383 | py | Python | tests/mqtt/protocol/test_handler.py | edenhaus/amqtt | ecf64a4f82c5d4c10974bce4d3f75f7563d6170b | [
"MIT"
] | null | null | null | tests/mqtt/protocol/test_handler.py | edenhaus/amqtt | ecf64a4f82c5d4c10974bce4d3f75f7563d6170b | [
"MIT"
] | null | null | null | tests/mqtt/protocol/test_handler.py | edenhaus/amqtt | ecf64a4f82c5d4c10974bce4d3f75f7563d6170b | [
"MIT"
] | null | null | null | # Copyright (c) 2015 Nicolas JOUANIN
#
# See the file license.txt for copying permission.
import unittest
import asyncio
import logging
import random
from amqtt.plugins.manager import PluginManager
from amqtt.session import (
Session,
OutgoingApplicationMessage,
IncomingApplicationMessage,
)
from amqtt.mqtt.protocol.handler import ProtocolHandler
from amqtt.adapters import StreamWriterAdapter, StreamReaderAdapter
from amqtt.mqtt.constants import QOS_0, QOS_1, QOS_2
from amqtt.mqtt.publish import PublishPacket
from amqtt.mqtt.puback import PubackPacket
from amqtt.mqtt.pubrec import PubrecPacket
from amqtt.mqtt.pubrel import PubrelPacket
from amqtt.mqtt.pubcomp import PubcompPacket
formatter = (
"[%(asctime)s] %(name)s {%(filename)s:%(lineno)d} %(levelname)s - %(message)s"
)
logging.basicConfig(level=logging.DEBUG, format=formatter)
log = logging.getLogger(__name__)
def rand_packet_id():
    # MQTT packet identifiers must be in the range 1..65535; 0 is not a valid id.
    return random.randint(1, 65535)
def adapt(reader, writer):
return StreamReaderAdapter(reader), StreamWriterAdapter(writer)
class ProtocolHandlerTest(unittest.TestCase):
def setUp(self):
self.loop = asyncio.new_event_loop()
asyncio.set_event_loop(self.loop)
self.plugin_manager = PluginManager("amqtt.test.plugins", context=None)
def tearDown(self):
self.loop.close()
def test_init_handler(self):
Session()
handler = ProtocolHandler(self.plugin_manager)
self.assertIsNone(handler.session)
self.assertIs(handler._loop, self.loop)
self.check_empty_waiters(handler)
def test_start_stop(self):
async def server_mock(reader, writer):
pass
async def test_coro():
try:
s = Session()
reader, writer = await asyncio.open_connection("127.0.0.1", 8888)
reader_adapted, writer_adapted = adapt(reader, writer)
handler = ProtocolHandler(self.plugin_manager)
handler.attach(s, reader_adapted, writer_adapted)
await self.start_handler(handler, s)
await self.stop_handler(handler, s)
future.set_result(True)
except Exception as ae:
future.set_exception(ae)
future = asyncio.Future()
coro = asyncio.start_server(server_mock, "127.0.0.1", 8888)
server = self.loop.run_until_complete(coro)
self.loop.run_until_complete(test_coro())
server.close()
self.loop.run_until_complete(server.wait_closed())
if future.exception():
raise future.exception()
def test_publish_qos0(self):
async def server_mock(reader, writer):
try:
packet = await PublishPacket.from_stream(reader)
self.assertEqual(packet.variable_header.topic_name, "/topic")
self.assertEqual(packet.qos, QOS_0)
self.assertIsNone(packet.packet_id)
except Exception as ae:
future.set_exception(ae)
async def test_coro():
try:
s = Session()
reader, writer = await asyncio.open_connection("127.0.0.1", 8888)
reader_adapted, writer_adapted = adapt(reader, writer)
handler = ProtocolHandler(self.plugin_manager)
handler.attach(s, reader_adapted, writer_adapted)
await self.start_handler(handler, s)
message = await handler.mqtt_publish(
"/topic", b"test_data", QOS_0, False
)
self.assertIsInstance(message, OutgoingApplicationMessage)
self.assertIsNotNone(message.publish_packet)
self.assertIsNone(message.puback_packet)
self.assertIsNone(message.pubrec_packet)
self.assertIsNone(message.pubrel_packet)
self.assertIsNone(message.pubcomp_packet)
await self.stop_handler(handler, s)
future.set_result(True)
except Exception as ae:
future.set_exception(ae)
future = asyncio.Future()
coro = asyncio.start_server(server_mock, "127.0.0.1", 8888)
server = self.loop.run_until_complete(coro)
self.loop.run_until_complete(test_coro())
server.close()
self.loop.run_until_complete(server.wait_closed())
if future.exception():
raise future.exception()
def test_publish_qos1(self):
async def server_mock(reader, writer):
packet = await PublishPacket.from_stream(reader)
try:
self.assertEqual(packet.variable_header.topic_name, "/topic")
self.assertEqual(packet.qos, QOS_1)
self.assertIsNotNone(packet.packet_id)
self.assertIn(packet.packet_id, self.session.inflight_out)
self.assertIn(packet.packet_id, self.handler._puback_waiters)
puback = PubackPacket.build(packet.packet_id)
await puback.to_stream(writer)
except Exception as ae:
future.set_exception(ae)
async def test_coro():
try:
reader, writer = await asyncio.open_connection("127.0.0.1", 8888)
reader_adapted, writer_adapted = adapt(reader, writer)
self.handler = ProtocolHandler(self.plugin_manager)
self.handler.attach(self.session, reader_adapted, writer_adapted)
await self.start_handler(self.handler, self.session)
message = await self.handler.mqtt_publish(
"/topic", b"test_data", QOS_1, False
)
self.assertIsInstance(message, OutgoingApplicationMessage)
self.assertIsNotNone(message.publish_packet)
self.assertIsNotNone(message.puback_packet)
self.assertIsNone(message.pubrec_packet)
self.assertIsNone(message.pubrel_packet)
self.assertIsNone(message.pubcomp_packet)
await self.stop_handler(self.handler, self.session)
if not future.done():
future.set_result(True)
except Exception as ae:
future.set_exception(ae)
self.handler = None
self.session = Session()
future = asyncio.Future()
coro = asyncio.start_server(server_mock, "127.0.0.1", 8888)
server = self.loop.run_until_complete(coro)
self.loop.run_until_complete(test_coro())
server.close()
self.loop.run_until_complete(server.wait_closed())
if future.exception():
raise future.exception()
def test_publish_qos2(self):
async def server_mock(reader, writer):
try:
packet = await PublishPacket.from_stream(reader)
self.assertEqual(packet.topic_name, "/topic")
self.assertEqual(packet.qos, QOS_2)
self.assertIsNotNone(packet.packet_id)
self.assertIn(packet.packet_id, self.session.inflight_out)
self.assertIn(packet.packet_id, self.handler._pubrec_waiters)
pubrec = PubrecPacket.build(packet.packet_id)
await pubrec.to_stream(writer)
await PubrelPacket.from_stream(reader)
self.assertIn(packet.packet_id, self.handler._pubcomp_waiters)
pubcomp = PubcompPacket.build(packet.packet_id)
await pubcomp.to_stream(writer)
except Exception as ae:
future.set_exception(ae)
async def test_coro():
try:
reader, writer = await asyncio.open_connection("127.0.0.1", 8888)
reader_adapted, writer_adapted = adapt(reader, writer)
self.handler = ProtocolHandler(self.plugin_manager)
self.handler.attach(self.session, reader_adapted, writer_adapted)
await self.start_handler(self.handler, self.session)
message = await self.handler.mqtt_publish(
"/topic", b"test_data", QOS_2, False
)
self.assertIsInstance(message, OutgoingApplicationMessage)
self.assertIsNotNone(message.publish_packet)
self.assertIsNone(message.puback_packet)
self.assertIsNotNone(message.pubrec_packet)
self.assertIsNotNone(message.pubrel_packet)
self.assertIsNotNone(message.pubcomp_packet)
await self.stop_handler(self.handler, self.session)
if not future.done():
future.set_result(True)
except Exception as ae:
future.set_exception(ae)
self.handler = None
self.session = Session()
future = asyncio.Future()
coro = asyncio.start_server(server_mock, "127.0.0.1", 8888)
server = self.loop.run_until_complete(coro)
self.loop.run_until_complete(test_coro())
server.close()
self.loop.run_until_complete(server.wait_closed())
if future.exception():
raise future.exception()
def test_receive_qos0(self):
async def server_mock(reader, writer):
packet = PublishPacket.build(
"/topic", b"test_data", rand_packet_id(), False, QOS_0, False
)
await packet.to_stream(writer)
async def test_coro():
try:
reader, writer = await asyncio.open_connection("127.0.0.1", 8888)
reader_adapted, writer_adapted = adapt(reader, writer)
self.handler = ProtocolHandler(self.plugin_manager)
self.handler.attach(self.session, reader_adapted, writer_adapted)
await self.start_handler(self.handler, self.session)
message = await self.handler.mqtt_deliver_next_message()
self.assertIsInstance(message, IncomingApplicationMessage)
self.assertIsNotNone(message.publish_packet)
self.assertIsNone(message.puback_packet)
self.assertIsNone(message.pubrec_packet)
self.assertIsNone(message.pubrel_packet)
self.assertIsNone(message.pubcomp_packet)
await self.stop_handler(self.handler, self.session)
future.set_result(True)
except Exception as ae:
future.set_exception(ae)
self.handler = None
self.session = Session()
future = asyncio.Future()
coro = asyncio.start_server(server_mock, "127.0.0.1", 8888)
server = self.loop.run_until_complete(coro)
self.loop.run_until_complete(test_coro())
server.close()
self.loop.run_until_complete(server.wait_closed())
if future.exception():
raise future.exception()
def test_receive_qos1(self):
async def server_mock(reader, writer):
try:
packet = PublishPacket.build(
"/topic", b"test_data", rand_packet_id(), False, QOS_1, False
)
await packet.to_stream(writer)
puback = await PubackPacket.from_stream(reader)
self.assertIsNotNone(puback)
self.assertEqual(packet.packet_id, puback.packet_id)
except Exception as ae:
future.set_exception(ae)
async def test_coro():
try:
reader, writer = await asyncio.open_connection("127.0.0.1", 8888)
reader_adapted, writer_adapted = adapt(reader, writer)
self.handler = ProtocolHandler(self.plugin_manager)
self.handler.attach(self.session, reader_adapted, writer_adapted)
await self.start_handler(self.handler, self.session)
message = await self.handler.mqtt_deliver_next_message()
self.assertIsInstance(message, IncomingApplicationMessage)
self.assertIsNotNone(message.publish_packet)
self.assertIsNotNone(message.puback_packet)
self.assertIsNone(message.pubrec_packet)
self.assertIsNone(message.pubrel_packet)
self.assertIsNone(message.pubcomp_packet)
await self.stop_handler(self.handler, self.session)
future.set_result(True)
except Exception as ae:
future.set_exception(ae)
self.handler = None
self.session = Session()
future = asyncio.Future()
self.event = asyncio.Event()
coro = asyncio.start_server(server_mock, "127.0.0.1", 8888)
server = self.loop.run_until_complete(coro)
self.loop.run_until_complete(test_coro())
server.close()
self.loop.run_until_complete(server.wait_closed())
if future.exception():
raise future.exception()
def test_receive_qos2(self):
async def server_mock(reader, writer):
try:
packet = PublishPacket.build(
"/topic", b"test_data", rand_packet_id(), False, QOS_2, False
)
await packet.to_stream(writer)
pubrec = await PubrecPacket.from_stream(reader)
self.assertIsNotNone(pubrec)
self.assertEqual(packet.packet_id, pubrec.packet_id)
self.assertIn(packet.packet_id, self.handler._pubrel_waiters)
pubrel = PubrelPacket.build(packet.packet_id)
await pubrel.to_stream(writer)
pubcomp = await PubcompPacket.from_stream(reader)
self.assertIsNotNone(pubcomp)
self.assertEqual(packet.packet_id, pubcomp.packet_id)
except Exception as ae:
future.set_exception(ae)
async def test_coro():
try:
reader, writer = await asyncio.open_connection("127.0.0.1", 8888)
reader_adapted, writer_adapted = adapt(reader, writer)
self.handler = ProtocolHandler(self.plugin_manager)
self.handler.attach(self.session, reader_adapted, writer_adapted)
await self.start_handler(self.handler, self.session)
message = await self.handler.mqtt_deliver_next_message()
self.assertIsInstance(message, IncomingApplicationMessage)
self.assertIsNotNone(message.publish_packet)
self.assertIsNone(message.puback_packet)
self.assertIsNotNone(message.pubrec_packet)
self.assertIsNotNone(message.pubrel_packet)
self.assertIsNotNone(message.pubcomp_packet)
await self.stop_handler(self.handler, self.session)
future.set_result(True)
except Exception as ae:
future.set_exception(ae)
self.handler = None
self.session = Session()
future = asyncio.Future()
coro = asyncio.start_server(server_mock, "127.0.0.1", 8888)
server = self.loop.run_until_complete(coro)
self.loop.run_until_complete(test_coro())
server.close()
self.loop.run_until_complete(server.wait_closed())
if future.exception():
raise future.exception()
async def start_handler(self, handler, session):
self.check_empty_waiters(handler)
self.check_no_message(session)
await handler.start()
assert handler._reader_ready
async def stop_handler(self, handler, session):
await handler.stop()
assert handler._reader_stopped
self.check_empty_waiters(handler)
self.check_no_message(session)
def check_empty_waiters(self, handler):
self.assertFalse(handler._puback_waiters)
self.assertFalse(handler._pubrec_waiters)
self.assertFalse(handler._pubrel_waiters)
self.assertFalse(handler._pubcomp_waiters)
def check_no_message(self, session):
self.assertFalse(session.inflight_out)
self.assertFalse(session.inflight_in)
def test_publish_qos1_retry(self):
async def server_mock(reader, writer):
packet = await PublishPacket.from_stream(reader)
try:
self.assertEqual(packet.topic_name, "/topic")
self.assertEqual(packet.qos, QOS_1)
self.assertIsNotNone(packet.packet_id)
self.assertIn(packet.packet_id, self.session.inflight_out)
self.assertIn(packet.packet_id, self.handler._puback_waiters)
puback = PubackPacket.build(packet.packet_id)
await puback.to_stream(writer)
except Exception as ae:
future.set_exception(ae)
async def test_coro():
try:
reader, writer = await asyncio.open_connection("127.0.0.1", 8888)
reader_adapted, writer_adapted = adapt(reader, writer)
self.handler = ProtocolHandler(self.plugin_manager)
self.handler.attach(self.session, reader_adapted, writer_adapted)
await self.handler.start()
await self.stop_handler(self.handler, self.session)
if not future.done():
future.set_result(True)
except Exception as ae:
future.set_exception(ae)
self.handler = None
self.session = Session()
message = OutgoingApplicationMessage(1, "/topic", QOS_1, b"test_data", False)
message.publish_packet = PublishPacket.build(
"/topic", b"test_data", rand_packet_id(), False, QOS_1, False
)
self.session.inflight_out[1] = message
future = asyncio.Future()
coro = asyncio.start_server(server_mock, "127.0.0.1", 8888)
server = self.loop.run_until_complete(coro)
self.loop.run_until_complete(test_coro())
server.close()
self.loop.run_until_complete(server.wait_closed())
if future.exception():
raise future.exception()
def test_publish_qos2_retry(self):
async def server_mock(reader, writer):
try:
packet = await PublishPacket.from_stream(reader)
self.assertEqual(packet.topic_name, "/topic")
self.assertEqual(packet.qos, QOS_2)
self.assertIsNotNone(packet.packet_id)
self.assertIn(packet.packet_id, self.session.inflight_out)
self.assertIn(packet.packet_id, self.handler._pubrec_waiters)
pubrec = PubrecPacket.build(packet.packet_id)
await pubrec.to_stream(writer)
await PubrelPacket.from_stream(reader)
self.assertIn(packet.packet_id, self.handler._pubcomp_waiters)
pubcomp = PubcompPacket.build(packet.packet_id)
await pubcomp.to_stream(writer)
except Exception as ae:
future.set_exception(ae)
async def test_coro():
try:
reader, writer = await asyncio.open_connection("127.0.0.1", 8888)
reader_adapted, writer_adapted = adapt(reader, writer)
self.handler = ProtocolHandler(self.plugin_manager)
self.handler.attach(self.session, reader_adapted, writer_adapted)
await self.handler.start()
await self.stop_handler(self.handler, self.session)
if not future.done():
future.set_result(True)
except Exception as ae:
future.set_exception(ae)
self.handler = None
self.session = Session()
message = OutgoingApplicationMessage(1, "/topic", QOS_2, b"test_data", False)
message.publish_packet = PublishPacket.build(
"/topic", b"test_data", rand_packet_id(), False, QOS_2, False
)
self.session.inflight_out[1] = message
future = asyncio.Future()
coro = asyncio.start_server(server_mock, "127.0.0.1", 8888)
server = self.loop.run_until_complete(coro)
self.loop.run_until_complete(test_coro())
server.close()
self.loop.run_until_complete(server.wait_closed())
if future.exception():
raise future.exception()
| 43.646681 | 85 | 0.617623 | 2,194 | 20,383 | 5.543756 | 0.070191 | 0.045219 | 0.024418 | 0.035518 | 0.852668 | 0.832607 | 0.820275 | 0.816904 | 0.805969 | 0.805969 | 0 | 0.015278 | 0.293529 | 20,383 | 466 | 86 | 43.740343 | 0.829375 | 0.004072 | 0 | 0.776978 | 0 | 0.002398 | 0.021482 | 0.001232 | 0 | 0 | 0 | 0 | 0.18705 | 1 | 0.038369 | false | 0.002398 | 0.033573 | 0.004796 | 0.079137 | 0.002398 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
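The handler tests above verify one acknowledgement flow per QoS level: nothing beyond the PUBLISH for QoS 0, a PUBACK for QoS 1, and the PUBREC/PUBREL/PUBCOMP exchange for QoS 2 — which is exactly which `message.*_packet` attributes each test asserts are set or `None`. A minimal sketch of those sequences, following the MQTT 3.1.1 specification rather than any amqtt internals (names here are illustrative):

```python
# Sender-side packet flow for each MQTT QoS level, as emulated by the
# server_mock coroutines in the tests above.
QOS_FLOWS = {
    0: ["PUBLISH"],                                 # at most once (fire and forget)
    1: ["PUBLISH", "PUBACK"],                       # at least once
    2: ["PUBLISH", "PUBREC", "PUBREL", "PUBCOMP"],  # exactly once
}

def expected_ack_packets(qos):
    """Acknowledgement packets a publisher should observe for a given QoS."""
    return [p for p in QOS_FLOWS[qos] if p != "PUBLISH"]
```

For QoS 2 this yields `["PUBREC", "PUBREL", "PUBCOMP"]`, matching the three `assertIsNotNone` checks on `pubrec_packet`, `pubrel_packet`, and `pubcomp_packet` in `test_publish_qos2`.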
ab364728a121713904de8a24bd0a05e7a78b8cc1 | 15,231 | py | Python | tests/subaccount/test_history_api.py | nickmflorin/django-proper-architecture-testing | da7c4019697e85f921695144375d2f548f1e98ad | [
"MIT"
] | null | null | null | tests/subaccount/test_history_api.py | nickmflorin/django-proper-architecture-testing | da7c4019697e85f921695144375d2f548f1e98ad | [
"MIT"
] | null | null | null | tests/subaccount/test_history_api.py | nickmflorin/django-proper-architecture-testing | da7c4019697e85f921695144375d2f548f1e98ad | [
"MIT"
] | null | null | null | from copy import deepcopy
import pytest
@pytest.mark.freeze_time('2020-01-01')
def test_get_account_subaccounts_history(api_client, create_budget, user,
create_budget_subaccount, create_budget_account, models):
api_client.force_login(user)
budget = create_budget()
account = create_budget_account(budget=budget)
subaccount = create_budget_subaccount(
parent=account,
name="Original Name",
description="Original Description",
identifier="old_identifier",
budget=budget
)
api_client.force_login(user)
response = api_client.patch("/v1/subaccounts/%s/" % subaccount.pk, data={
"name": "New Name",
"description": "New Description",
"identifier": "new_identifier",
"quantity": 10,
"rate": 1.5
})
response = api_client.get(
"/v1/accounts/%s/subaccounts/history/" % account.pk)
assert response.status_code == 200
assert models.FieldAlterationEvent.objects.count() == 5
assert response.json()['count'] == 5
serialized_events = [
{
"created_at": "2020-01-01 00:00:00",
"new_value": "New Description",
"old_value": "Original Description",
"field": "description",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": "new_identifier",
"old_value": "old_identifier",
"field": "identifier",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": 1.5,
"old_value": subaccount.rate,
"field": "rate",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": "New Name",
"old_value": "Original Name",
"field": "name",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": 10,
"old_value": None,
"field": "quantity",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
}
]
for serialized_event in response.json()['data']:
event_without_id = deepcopy(serialized_event)
del event_without_id['id']
assert event_without_id in serialized_events
@pytest.mark.freeze_time('2020-01-01')
def test_get_subaccount_subaccounts_history(api_client, create_budget, user,
create_budget_subaccount, create_budget_account, models):
api_client.force_login(user)
budget = create_budget()
account = create_budget_account(budget=budget)
parent_subaccount = create_budget_subaccount(
parent=account,
budget=budget,
identifier="subaccount-a"
)
subaccount = create_budget_subaccount(
parent=parent_subaccount,
name="Original Name",
description="Original Description",
identifier="old_identifier",
budget=budget
)
api_client.force_login(user)
response = api_client.patch("/v1/subaccounts/%s/" % subaccount.pk, data={
"name": "New Name",
"description": "New Description",
"identifier": "new_identifier",
"quantity": 10,
"rate": 1.5
})
response = api_client.get(
"/v1/subaccounts/%s/subaccounts/history/"
% parent_subaccount.pk
)
assert response.status_code == 200
assert models.FieldAlterationEvent.objects.count() == 5
assert response.json()['count'] == 5
serialized_events = [
{
"created_at": "2020-01-01 00:00:00",
"new_value": "New Description",
"old_value": "Original Description",
"field": "description",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": "new_identifier",
"old_value": "old_identifier",
"field": "identifier",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": 1.5,
"old_value": subaccount.rate,
"field": "rate",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": "New Name",
"old_value": "Original Name",
"field": "name",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": 10,
"old_value": None,
"field": "quantity",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
}
]
for serialized_event in response.json()['data']:
event_without_id = deepcopy(serialized_event)
del event_without_id['id']
assert event_without_id in serialized_events
@pytest.mark.freeze_time('2020-01-01')
def test_get_subaccount_history(api_client, create_budget, user,
create_budget_account, create_budget_subaccount, models):
api_client.force_login(user)
budget = create_budget()
account = create_budget_account(budget=budget)
subaccount = create_budget_subaccount(
parent=account,
name="Original Name",
description="Original Description",
identifier="old_identifier",
budget=budget
)
api_client.force_login(user)
response = api_client.patch("/v1/subaccounts/%s/" % subaccount.pk, data={
"name": "New Name",
"description": "New Description",
"identifier": "new_identifier",
"quantity": 10,
"rate": 1.5
})
response = api_client.get("/v1/subaccounts/%s/history/" % subaccount.pk)
assert response.status_code == 200
assert models.FieldAlterationEvent.objects.count() == 5
assert response.json()['count'] == 5
serialized_events = [
{
"created_at": "2020-01-01 00:00:00",
"new_value": "New Description",
"old_value": "Original Description",
"field": "description",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": "new_identifier",
"old_value": "old_identifier",
"field": "identifier",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": 1.5,
"old_value": subaccount.rate,
"field": "rate",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": "New Name",
"old_value": "Original Name",
"field": "name",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
},
{
"created_at": "2020-01-01 00:00:00",
"new_value": 10,
"old_value": None,
"field": "quantity",
"type": "field_alteration",
"content_object": {
'id': subaccount.pk,
'identifier': 'new_identifier',
'type': 'subaccount',
'name': 'New Name',
'description': 'New Description',
},
"user": {
"id": user.pk,
"first_name": user.first_name,
"last_name": user.last_name,
"full_name": user.full_name,
"email": user.email,
"profile_image": None,
}
}
]
for serialized_event in response.json()['data']:
event_without_id = deepcopy(serialized_event)
del event_without_id['id']
assert event_without_id in serialized_events
| 34.304054 | 77 | 0.474361 | 1,359 | 15,231 | 5.0883 | 0.056659 | 0.052061 | 0.020824 | 0.057267 | 0.975271 | 0.967028 | 0.960521 | 0.960521 | 0.954158 | 0.954158 | 0 | 0.030218 | 0.393802 | 15,231 | 443 | 78 | 34.38149 | 0.718726 | 0 | 0 | 0.81106 | 0 | 0 | 0.275753 | 0.006697 | 0 | 0 | 0 | 0 | 0.02765 | 1 | 0.006912 | false | 0 | 0.004608 | 0 | 0.011521 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ab4d1064eb80ceb7a9798f4ebfa069dbaf6f8db1 | 127 | py | Python | test/unit/extraction/__init__.py | mishioo/tesliper | e09dcbc0eeb5cc5f7d612ea7f913e4c5dd58a327 | [
"BSD-2-Clause"
] | null | null | null | test/unit/extraction/__init__.py | mishioo/tesliper | e09dcbc0eeb5cc5f7d612ea7f913e4c5dd58a327 | [
"BSD-2-Clause"
] | 4 | 2022-02-24T18:28:39.000Z | 2022-03-23T16:27:59.000Z | test/unit/extraction/__init__.py | mishioo/tesliper | e09dcbc0eeb5cc5f7d612ea7f913e4c5dd58a327 | [
"BSD-2-Clause"
] | null | null | null | from .test_extraction import *
from .test_gaussian_parser import *
from .test_parser_base import *
from .test_soxhlet import *
| 25.4 | 35 | 0.811024 | 18 | 127 | 5.388889 | 0.444444 | 0.329897 | 0.43299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125984 | 127 | 4 | 36 | 31.75 | 0.873874 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ab545e952ada57fb82ee446dfb4ec11319a6bb4b | 19,537 | py | Python | tests/selenium/test_derived_props.py | cobujo/dash-table | 8dad61c2a7c27588504d17b81e9f2c8ed93e1670 | [
"MIT"
] | null | null | null | tests/selenium/test_derived_props.py | cobujo/dash-table | 8dad61c2a7c27588504d17b81e9f2c8ed93e1670 | [
"MIT"
] | null | null | null | tests/selenium/test_derived_props.py | cobujo/dash-table | 8dad61c2a7c27588504d17b81e9f2c8ed93e1670 | [
"MIT"
] | null | null | null | import dash
from dash.dependencies import Input, Output
import dash_html_components as html
from dash_table import DataTable
from selenium.webdriver.common.keys import Keys
import json
import pandas as pd
url = "https://github.com/plotly/datasets/raw/master/" "26k-consumer-complaints.csv"
rawDf = pd.read_csv(url, nrows=100)
rawDf["id"] = rawDf.index + 3000
df = rawDf.to_dict("records")
props = [
"active_cell",
"start_cell",
"end_cell",
"selected_cells",
"selected_rows",
"selected_row_ids",
"derived_viewport_selected_rows",
"derived_viewport_selected_row_ids",
"derived_virtual_selected_rows",
"derived_virtual_selected_row_ids",
"derived_viewport_indices",
"derived_viewport_row_ids",
"derived_virtual_indices",
"derived_virtual_row_ids",
]
def get_app():
app = dash.Dash(__name__)
app.layout = html.Div(
[
DataTable(
id="table",
columns=[{"name": i, "id": i} for i in rawDf.columns],
data=df,
editable=True,
filter_action="native",
fixed_columns={"headers": True},
fixed_rows={"headers": True},
page_action="native",
page_size=10,
row_deletable=True,
row_selectable=True,
sort_action="native",
),
html.Div(id="props_container", children=["Nothing yet"]),
]
)
@app.callback(
Output("props_container", "children"), [Input("table", prop) for prop in props]
)
def show_props(*args):
# return 'Something yet!'
# print('show props')
return html.Table(
[
html.Tr(
[
html.Td(prop),
html.Td(
json.dumps(val) if val is not None else "None", id=prop
),
]
)
for prop, val in zip(props, args)
]
)
return app
def test_tdrp001_select_rows(test):
test.start_server(get_app())
target = test.table("table")
target.row(0).select()
target.row(1).select()
assert test.find_element("#active_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#start_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#end_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_cells").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_rows").get_attribute("innerHTML") == json.dumps(
list(range(2))
)
assert test.find_element("#selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3002)))
assert test.find_element("#derived_viewport_selected_rows").get_attribute(
"innerHTML"
) == json.dumps(list(range(2)))
assert test.find_element("#derived_viewport_selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3002)))
assert test.find_element("#derived_viewport_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(10)))
assert test.find_element("#derived_viewport_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3010)))
assert test.find_element("#derived_virtual_selected_rows").get_attribute(
"innerHTML"
) == json.dumps(list(range(2)))
assert test.find_element("#derived_virtual_selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3002)))
assert test.find_element("#derived_virtual_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(100)))
assert test.find_element("#derived_virtual_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3100)))
assert test.get_log_errors() == []
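The assertions above all encode one invariant of the test fixture: viewport row index `i` maps to row id `3000 + i`. A minimal sketch of that mapping (the helper name and the `3000` offset are assumptions taken from the test data, not part of any Dash API):

```python
import json

# Hypothetical helper mirroring the expectations asserted above: the test
# data is assumed to assign row ids starting at 3000, so selecting the
# first n rows yields indices [0..n) and ids [3000..3000+n).
def expected_selection(n, id_offset=3000):
    rows = list(range(n))
    row_ids = [id_offset + r for r in rows]
    return json.dumps(rows), json.dumps(row_ids)

print(expected_selection(2))  # ('[0, 1]', '[3000, 3001]')
```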
def test_tdrp002_select_cell(test):
test.start_server(get_app())
target = test.table("table")
target.cell(0, 0).click()
active = dict(row=0, column=0, column_id=rawDf.columns[0], row_id=3000)
assert test.find_element("#active_cell").get_attribute("innerHTML") == json.dumps(
active
)
assert test.find_element("#start_cell").get_attribute("innerHTML") == json.dumps(
active
)
assert test.find_element("#end_cell").get_attribute("innerHTML") == json.dumps(
active
)
assert test.find_element("#selected_cells").get_attribute(
"innerHTML"
) == json.dumps([active])
assert test.find_element("#selected_rows").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_row_ids").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#derived_viewport_selected_rows").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_viewport_selected_row_ids").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_viewport_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(10)))
assert test.find_element("#derived_viewport_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3010)))
assert test.find_element("#derived_virtual_selected_rows").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_virtual_selected_row_ids").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_virtual_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(100)))
assert test.find_element("#derived_virtual_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3100)))
assert test.get_log_errors() == []
def test_tdrp003_select_cells(test):
test.start_server(get_app())
target = test.table("table")
target.cell(0, 0).click()
with test.hold(Keys.SHIFT):
test.send_keys(Keys.DOWN + Keys.DOWN + Keys.RIGHT + Keys.RIGHT)
active = dict(row=0, column=0, column_id=rawDf.columns[0], row_id=3000)
selected = []
for row in range(3):
for col in range(3):
selected.append(
dict(
row=row, column=col, column_id=rawDf.columns[col], row_id=row + 3000
)
)
assert test.find_element("#active_cell").get_attribute("innerHTML") == json.dumps(
active
)
assert test.find_element("#start_cell").get_attribute("innerHTML") == json.dumps(
selected[0]
)
assert test.find_element("#end_cell").get_attribute("innerHTML") == json.dumps(
selected[-1]
)
assert test.find_element("#selected_cells").get_attribute(
"innerHTML"
) == json.dumps(selected)
assert test.find_element("#selected_rows").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_row_ids").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#derived_viewport_selected_rows").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_viewport_selected_row_ids").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_viewport_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(10)))
assert test.find_element("#derived_viewport_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3010)))
assert test.find_element("#derived_virtual_selected_rows").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_virtual_selected_row_ids").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_virtual_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(100)))
assert test.find_element("#derived_virtual_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3100)))
# reduce selection
with test.hold(Keys.SHIFT):
test.send_keys(Keys.UP + Keys.LEFT)
selected = []
for row in range(2):
for col in range(2):
selected.append(
dict(
row=row, column=col, column_id=rawDf.columns[col], row_id=row + 3000
)
)
assert test.find_element("#active_cell").get_attribute("innerHTML") == json.dumps(
active
)
assert test.find_element("#start_cell").get_attribute("innerHTML") == json.dumps(
selected[0]
)
assert test.find_element("#end_cell").get_attribute("innerHTML") == json.dumps(
selected[-1]
)
assert test.find_element("#selected_cells").get_attribute(
"innerHTML"
) == json.dumps(selected)
assert test.find_element("#selected_rows").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_row_ids").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#derived_viewport_selected_rows").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_viewport_selected_row_ids").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_viewport_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(10)))
assert test.find_element("#derived_viewport_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3010)))
assert test.find_element("#derived_virtual_selected_rows").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_virtual_selected_row_ids").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_virtual_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(100)))
assert test.find_element("#derived_virtual_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3100)))
assert test.get_log_errors() == []
def test_tdrp004_navigate_selected_cells(test):
test.start_server(get_app())
target = test.table("table")
target.cell(0, 0).click()
with test.hold(Keys.SHIFT):
test.send_keys(Keys.DOWN + Keys.DOWN + Keys.RIGHT + Keys.RIGHT)
selected = []
for row in range(3):
for col in range(3):
selected.append(
dict(
row=row, column=col, column_id=rawDf.columns[col], row_id=row + 3000
)
)
for row in range(3):
for col in range(3):
active = dict(
row=row, column=col, column_id=rawDf.columns[col], row_id=row + 3000
)
assert test.find_element("#active_cell").get_attribute(
"innerHTML"
) == json.dumps(active)
assert test.find_element("#start_cell").get_attribute(
"innerHTML"
) == json.dumps(selected[0])
assert test.find_element("#end_cell").get_attribute(
"innerHTML"
) == json.dumps(selected[-1])
assert test.find_element("#selected_cells").get_attribute(
"innerHTML"
) == json.dumps(selected)
assert test.find_element("#selected_rows").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_row_ids").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_viewport_selected_rows").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element(
"#derived_viewport_selected_row_ids"
).get_attribute("innerHTML") in ["None", json.dumps([])]
assert test.find_element("#derived_viewport_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(10)))
assert test.find_element("#derived_viewport_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3010)))
assert test.find_element("#derived_virtual_selected_rows").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_virtual_selected_row_ids").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_virtual_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(100)))
assert test.find_element("#derived_virtual_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3100)))
test.send_keys(Keys.TAB)
assert test.get_log_errors() == []
def test_tdrp005_filtered_and_sorted_row_select(test):
test.start_server(get_app())
target = test.table("table")
target.row(0).select()
target.row(1).select()
target.row(2).select()
assert test.find_element("#active_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#start_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#end_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_cells").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_rows").get_attribute("innerHTML") == json.dumps(
list(range(3))
)
assert test.find_element("#selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3003)))
assert test.find_element("#derived_viewport_selected_rows").get_attribute(
"innerHTML"
) == json.dumps(list(range(3)))
assert test.find_element("#derived_viewport_selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3003)))
assert test.find_element("#derived_viewport_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(10)))
assert test.find_element("#derived_viewport_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3010)))
assert test.find_element("#derived_virtual_selected_rows").get_attribute(
"innerHTML"
) == json.dumps(list(range(3)))
assert test.find_element("#derived_virtual_selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3003)))
assert test.find_element("#derived_virtual_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(100)))
assert test.find_element("#derived_virtual_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3100)))
target.column(rawDf.columns[0]).filter()
test.send_keys("is even" + Keys.ENTER)
assert test.find_element("#active_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#start_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#end_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_cells").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_rows").get_attribute("innerHTML") == json.dumps(
list(range(3))
)
assert test.find_element("#selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3003)))
assert test.find_element("#derived_viewport_selected_rows").get_attribute(
"innerHTML"
) == json.dumps(list(range(0, 2)))
assert test.find_element("#derived_viewport_selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3003, 2)))
assert test.find_element("#derived_viewport_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(0, 20, 2)))
assert test.find_element("#derived_viewport_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3020, 2)))
assert test.find_element("#derived_virtual_selected_rows").get_attribute(
"innerHTML"
) == json.dumps(list(range(0, 2)))
assert test.find_element("#derived_virtual_selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3003, 2)))
assert test.find_element("#derived_virtual_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(0, 100, 2)))
assert test.find_element("#derived_virtual_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3100, 2)))
target.column(rawDf.columns[0]).sort() # None -> ASC
target.column(rawDf.columns[0]).sort() # ASC -> DESC
assert test.find_element("#active_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#start_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#end_cell").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_cells").get_attribute("innerHTML") in [
"None",
json.dumps([]),
]
assert test.find_element("#selected_rows").get_attribute("innerHTML") == json.dumps(
list(range(3))
)
assert test.find_element("#selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3003)))
assert test.find_element("#derived_viewport_selected_rows").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_viewport_selected_row_ids").get_attribute(
"innerHTML"
) in ["None", json.dumps([])]
assert test.find_element("#derived_viewport_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(80, 100, 2))[::-1])
assert test.find_element("#derived_viewport_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3080, 3100, 2))[::-1])
assert test.find_element("#derived_virtual_selected_rows").get_attribute(
"innerHTML"
) == json.dumps(list(range(48, 50))[::-1])
assert test.find_element("#derived_virtual_selected_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3003, 2)))
assert test.find_element("#derived_virtual_indices").get_attribute(
"innerHTML"
) == json.dumps(list(range(0, 100, 2))[::-1])
assert test.find_element("#derived_virtual_row_ids").get_attribute(
"innerHTML"
) == json.dumps(list(range(3000, 3100, 2))[::-1])
assert test.get_log_errors() == []
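The `derived_virtual_*` and `derived_viewport_*` expectations in the filter-then-sort sequence above reduce to a little index arithmetic. A sketch of that bookkeeping, under the assumed fixture (100 rows, ids 3000..3099, a 10-row viewport):

```python
# Sketch of the index bookkeeping the assertions above encode (assumed
# fixture: 100 rows, ids 3000..3099, 10 rows per page).
virtual = list(range(0, 100, 2))   # "is even" filter keeps every other index
virtual_desc = virtual[::-1]       # descending sort reverses the filtered view
viewport = virtual_desc[:10]       # first page of the sorted, filtered view

print(viewport)  # [98, 96, 94, 92, 90, 88, 86, 84, 82, 80]
```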
| 34.578761 | 88 | 0.616881 | 2,262 | 19,537 | 5.071618 | 0.068523 | 0.101987 | 0.136681 | 0.205021 | 0.89348 | 0.878312 | 0.873257 | 0.870642 | 0.86977 | 0.866457 | 0 | 0.025244 | 0.233557 | 19,537 | 564 | 89 | 34.640071 | 0.740884 | 0.0043 | 0 | 0.717172 | 0 | 0 | 0.211498 | 0.105569 | 0 | 0 | 0 | 0 | 0.236364 | 1 | 0.014141 | false | 0 | 0.014141 | 0.00202 | 0.032323 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
db911cba05b8595930a6408c3f9c74162afba866 | 20,716 | py | Python | google/cloud/kms_v1/services/key_management_service/pagers.py | anukaal/python-kms | e28126d7ab1b2b44ee54c6a4a1ddc4d5c15a57b6 | [
"Apache-2.0"
] | 24 | 2020-07-07T03:17:32.000Z | 2022-03-30T14:48:01.000Z | google/cloud/kms_v1/services/key_management_service/pagers.py | anukaal/python-kms | e28126d7ab1b2b44ee54c6a4a1ddc4d5c15a57b6 | [
"Apache-2.0"
] | 90 | 2020-02-05T22:20:20.000Z | 2022-03-30T22:42:11.000Z | google/cloud/kms_v1/services/key_management_service/pagers.py | anukaal/python-kms | e28126d7ab1b2b44ee54c6a4a1ddc4d5c15a57b6 | [
"Apache-2.0"
] | 31 | 2020-02-08T13:51:41.000Z | 2022-03-22T01:08:04.000Z | # -*- coding: utf-8 -*-
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from typing import (
Any,
AsyncIterator,
Awaitable,
Callable,
Sequence,
Tuple,
Optional,
Iterator,
)
from google.cloud.kms_v1.types import resources
from google.cloud.kms_v1.types import service
class ListKeyRingsPager:
"""A pager for iterating through ``list_key_rings`` requests.
This class thinly wraps an initial
:class:`google.cloud.kms_v1.types.ListKeyRingsResponse` object, and
provides an ``__iter__`` method to iterate through its
``key_rings`` field.
If there are more pages, the ``__iter__`` method will make additional
``ListKeyRings`` requests and continue to iterate
through the ``key_rings`` field on the
corresponding responses.
All the usual :class:`google.cloud.kms_v1.types.ListKeyRingsResponse`
attributes are available on the pager. If multiple requests are made, only
the most recent response is retained, and thus used for attribute lookup.
"""
def __init__(
self,
method: Callable[..., service.ListKeyRingsResponse],
request: service.ListKeyRingsRequest,
response: service.ListKeyRingsResponse,
*,
metadata: Sequence[Tuple[str, str]] = ()
):
"""Instantiate the pager.
Args:
method (Callable): The method that was originally called, and
which instantiated this pager.
request (google.cloud.kms_v1.types.ListKeyRingsRequest):
The initial request object.
response (google.cloud.kms_v1.types.ListKeyRingsResponse):
The initial response object.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
"""
self._method = method
self._request = service.ListKeyRingsRequest(request)
self._response = response
self._metadata = metadata
def __getattr__(self, name: str) -> Any:
return getattr(self._response, name)
@property
def pages(self) -> Iterator[service.ListKeyRingsResponse]:
yield self._response
while self._response.next_page_token:
self._request.page_token = self._response.next_page_token
self._response = self._method(self._request, metadata=self._metadata)
yield self._response
def __iter__(self) -> Iterator[resources.KeyRing]:
for page in self.pages:
yield from page.key_rings
def __repr__(self) -> str:
return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
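The docstring above describes the paging contract: follow `next_page_token` until it is empty, re-issuing the request each time. A minimal sketch of that loop against a stubbed method and responses (`SimpleNamespace` stands in for the real `service.ListKeyRingsRequest`/`Response` protobuf types; the field names are taken from the class above):

```python
from types import SimpleNamespace

# Stub pages: two responses chained by next_page_token (assumption: the
# real ListKeyRingsResponse exposes these same fields).
pages = {
    "": SimpleNamespace(key_rings=["ring-a"], next_page_token="t1"),
    "t1": SimpleNamespace(key_rings=["ring-b"], next_page_token=""),
}

def fake_method(request, metadata=()):
    return pages[request.page_token]

# Replicate the pager's core loop: re-issue the request with each
# next_page_token until the token comes back empty.
request = SimpleNamespace(page_token="")
response = fake_method(request)
items = list(response.key_rings)
while response.next_page_token:
    request.page_token = response.next_page_token
    response = fake_method(request)
    items.extend(response.key_rings)

print(items)  # ['ring-a', 'ring-b']
```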
class ListKeyRingsAsyncPager:
"""A pager for iterating through ``list_key_rings`` requests.
This class thinly wraps an initial
:class:`google.cloud.kms_v1.types.ListKeyRingsResponse` object, and
provides an ``__aiter__`` method to iterate through its
``key_rings`` field.
If there are more pages, the ``__aiter__`` method will make additional
``ListKeyRings`` requests and continue to iterate
through the ``key_rings`` field on the
corresponding responses.
All the usual :class:`google.cloud.kms_v1.types.ListKeyRingsResponse`
attributes are available on the pager. If multiple requests are made, only
the most recent response is retained, and thus used for attribute lookup.
"""
def __init__(
self,
method: Callable[..., Awaitable[service.ListKeyRingsResponse]],
request: service.ListKeyRingsRequest,
response: service.ListKeyRingsResponse,
*,
metadata: Sequence[Tuple[str, str]] = ()
):
"""Instantiates the pager.
Args:
method (Callable): The method that was originally called, and
which instantiated this pager.
request (google.cloud.kms_v1.types.ListKeyRingsRequest):
The initial request object.
response (google.cloud.kms_v1.types.ListKeyRingsResponse):
The initial response object.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
"""
self._method = method
self._request = service.ListKeyRingsRequest(request)
self._response = response
self._metadata = metadata
def __getattr__(self, name: str) -> Any:
return getattr(self._response, name)
@property
async def pages(self) -> AsyncIterator[service.ListKeyRingsResponse]:
yield self._response
while self._response.next_page_token:
self._request.page_token = self._response.next_page_token
self._response = await self._method(self._request, metadata=self._metadata)
yield self._response
def __aiter__(self) -> AsyncIterator[resources.KeyRing]:
async def async_generator():
async for page in self.pages:
for response in page.key_rings:
yield response
return async_generator()
def __repr__(self) -> str:
return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
class ListCryptoKeysPager:
"""A pager for iterating through ``list_crypto_keys`` requests.
This class thinly wraps an initial
:class:`google.cloud.kms_v1.types.ListCryptoKeysResponse` object, and
provides an ``__iter__`` method to iterate through its
``crypto_keys`` field.
If there are more pages, the ``__iter__`` method will make additional
``ListCryptoKeys`` requests and continue to iterate
through the ``crypto_keys`` field on the
corresponding responses.
All the usual :class:`google.cloud.kms_v1.types.ListCryptoKeysResponse`
attributes are available on the pager. If multiple requests are made, only
the most recent response is retained, and thus used for attribute lookup.
"""
def __init__(
self,
method: Callable[..., service.ListCryptoKeysResponse],
request: service.ListCryptoKeysRequest,
response: service.ListCryptoKeysResponse,
*,
metadata: Sequence[Tuple[str, str]] = ()
):
"""Instantiate the pager.
Args:
method (Callable): The method that was originally called, and
which instantiated this pager.
request (google.cloud.kms_v1.types.ListCryptoKeysRequest):
The initial request object.
response (google.cloud.kms_v1.types.ListCryptoKeysResponse):
The initial response object.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
"""
self._method = method
self._request = service.ListCryptoKeysRequest(request)
self._response = response
self._metadata = metadata
def __getattr__(self, name: str) -> Any:
return getattr(self._response, name)
@property
def pages(self) -> Iterator[service.ListCryptoKeysResponse]:
yield self._response
while self._response.next_page_token:
self._request.page_token = self._response.next_page_token
self._response = self._method(self._request, metadata=self._metadata)
yield self._response
def __iter__(self) -> Iterator[resources.CryptoKey]:
for page in self.pages:
yield from page.crypto_keys
def __repr__(self) -> str:
return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
class ListCryptoKeysAsyncPager:
"""A pager for iterating through ``list_crypto_keys`` requests.
This class thinly wraps an initial
:class:`google.cloud.kms_v1.types.ListCryptoKeysResponse` object, and
provides an ``__aiter__`` method to iterate through its
``crypto_keys`` field.
If there are more pages, the ``__aiter__`` method will make additional
``ListCryptoKeys`` requests and continue to iterate
through the ``crypto_keys`` field on the
corresponding responses.
All the usual :class:`google.cloud.kms_v1.types.ListCryptoKeysResponse`
attributes are available on the pager. If multiple requests are made, only
the most recent response is retained, and thus used for attribute lookup.
"""
def __init__(
self,
method: Callable[..., Awaitable[service.ListCryptoKeysResponse]],
request: service.ListCryptoKeysRequest,
response: service.ListCryptoKeysResponse,
*,
metadata: Sequence[Tuple[str, str]] = ()
):
"""Instantiates the pager.
Args:
method (Callable): The method that was originally called, and
which instantiated this pager.
request (google.cloud.kms_v1.types.ListCryptoKeysRequest):
The initial request object.
response (google.cloud.kms_v1.types.ListCryptoKeysResponse):
The initial response object.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
"""
self._method = method
self._request = service.ListCryptoKeysRequest(request)
self._response = response
self._metadata = metadata
def __getattr__(self, name: str) -> Any:
return getattr(self._response, name)
@property
async def pages(self) -> AsyncIterator[service.ListCryptoKeysResponse]:
yield self._response
while self._response.next_page_token:
self._request.page_token = self._response.next_page_token
self._response = await self._method(self._request, metadata=self._metadata)
yield self._response
def __aiter__(self) -> AsyncIterator[resources.CryptoKey]:
async def async_generator():
async for page in self.pages:
for response in page.crypto_keys:
yield response
return async_generator()
def __repr__(self) -> str:
return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
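The `__aiter__` implementation above wraps an async page stream in an inner async generator that flattens the per-page item lists. The same pattern in isolation, against stub data rather than the real KMS client (all names here are illustrative):

```python
import asyncio

# Stub async page stream: two batches of items (stands in for the
# pager's `pages` property).
async def fake_pages():
    for batch in (["key-1", "key-2"], ["key-3"]):
        yield batch

# Same shape as __aiter__ above: an inner async generator that walks
# pages and yields their items one by one.
def flatten(page_stream_factory):
    async def async_generator():
        async for page in page_stream_factory():
            for item in page:
                yield item
    return async_generator()

async def main():
    return [k async for k in flatten(fake_pages)]

print(asyncio.run(main()))  # ['key-1', 'key-2', 'key-3']
```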
class ListCryptoKeyVersionsPager:
"""A pager for iterating through ``list_crypto_key_versions`` requests.
This class thinly wraps an initial
:class:`google.cloud.kms_v1.types.ListCryptoKeyVersionsResponse` object, and
provides an ``__iter__`` method to iterate through its
``crypto_key_versions`` field.
If there are more pages, the ``__iter__`` method will make additional
``ListCryptoKeyVersions`` requests and continue to iterate
through the ``crypto_key_versions`` field on the
corresponding responses.
All the usual :class:`google.cloud.kms_v1.types.ListCryptoKeyVersionsResponse`
attributes are available on the pager. If multiple requests are made, only
the most recent response is retained, and thus used for attribute lookup.
"""
def __init__(
self,
method: Callable[..., service.ListCryptoKeyVersionsResponse],
request: service.ListCryptoKeyVersionsRequest,
response: service.ListCryptoKeyVersionsResponse,
*,
metadata: Sequence[Tuple[str, str]] = ()
):
"""Instantiate the pager.
Args:
method (Callable): The method that was originally called, and
which instantiated this pager.
request (google.cloud.kms_v1.types.ListCryptoKeyVersionsRequest):
The initial request object.
response (google.cloud.kms_v1.types.ListCryptoKeyVersionsResponse):
The initial response object.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
"""
self._method = method
self._request = service.ListCryptoKeyVersionsRequest(request)
self._response = response
self._metadata = metadata
def __getattr__(self, name: str) -> Any:
return getattr(self._response, name)
@property
def pages(self) -> Iterator[service.ListCryptoKeyVersionsResponse]:
yield self._response
while self._response.next_page_token:
self._request.page_token = self._response.next_page_token
self._response = self._method(self._request, metadata=self._metadata)
yield self._response
def __iter__(self) -> Iterator[resources.CryptoKeyVersion]:
for page in self.pages:
yield from page.crypto_key_versions
def __repr__(self) -> str:
return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
class ListCryptoKeyVersionsAsyncPager:
"""A pager for iterating through ``list_crypto_key_versions`` requests.
This class thinly wraps an initial
:class:`google.cloud.kms_v1.types.ListCryptoKeyVersionsResponse` object, and
provides an ``__aiter__`` method to iterate through its
``crypto_key_versions`` field.
If there are more pages, the ``__aiter__`` method will make additional
``ListCryptoKeyVersions`` requests and continue to iterate
through the ``crypto_key_versions`` field on the
corresponding responses.
All the usual :class:`google.cloud.kms_v1.types.ListCryptoKeyVersionsResponse`
attributes are available on the pager. If multiple requests are made, only
the most recent response is retained, and thus used for attribute lookup.
"""
def __init__(
self,
method: Callable[..., Awaitable[service.ListCryptoKeyVersionsResponse]],
request: service.ListCryptoKeyVersionsRequest,
response: service.ListCryptoKeyVersionsResponse,
*,
metadata: Sequence[Tuple[str, str]] = ()
):
"""Instantiates the pager.
Args:
method (Callable): The method that was originally called, and
which instantiated this pager.
request (google.cloud.kms_v1.types.ListCryptoKeyVersionsRequest):
The initial request object.
response (google.cloud.kms_v1.types.ListCryptoKeyVersionsResponse):
The initial response object.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
"""
self._method = method
self._request = service.ListCryptoKeyVersionsRequest(request)
self._response = response
self._metadata = metadata
def __getattr__(self, name: str) -> Any:
return getattr(self._response, name)
@property
async def pages(self) -> AsyncIterator[service.ListCryptoKeyVersionsResponse]:
yield self._response
while self._response.next_page_token:
self._request.page_token = self._response.next_page_token
self._response = await self._method(self._request, metadata=self._metadata)
yield self._response
def __aiter__(self) -> AsyncIterator[resources.CryptoKeyVersion]:
async def async_generator():
async for page in self.pages:
for response in page.crypto_key_versions:
yield response
return async_generator()
def __repr__(self) -> str:
return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
class ListImportJobsPager:
"""A pager for iterating through ``list_import_jobs`` requests.
This class thinly wraps an initial
:class:`google.cloud.kms_v1.types.ListImportJobsResponse` object, and
provides an ``__iter__`` method to iterate through its
``import_jobs`` field.
If there are more pages, the ``__iter__`` method will make additional
``ListImportJobs`` requests and continue to iterate
through the ``import_jobs`` field on the
corresponding responses.
All the usual :class:`google.cloud.kms_v1.types.ListImportJobsResponse`
attributes are available on the pager. If multiple requests are made, only
the most recent response is retained, and thus used for attribute lookup.
"""
def __init__(
self,
method: Callable[..., service.ListImportJobsResponse],
request: service.ListImportJobsRequest,
response: service.ListImportJobsResponse,
*,
metadata: Sequence[Tuple[str, str]] = ()
):
"""Instantiate the pager.
Args:
method (Callable): The method that was originally called, and
which instantiated this pager.
request (google.cloud.kms_v1.types.ListImportJobsRequest):
The initial request object.
response (google.cloud.kms_v1.types.ListImportJobsResponse):
The initial response object.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
"""
self._method = method
self._request = service.ListImportJobsRequest(request)
self._response = response
self._metadata = metadata
def __getattr__(self, name: str) -> Any:
return getattr(self._response, name)
@property
def pages(self) -> Iterator[service.ListImportJobsResponse]:
yield self._response
while self._response.next_page_token:
self._request.page_token = self._response.next_page_token
self._response = self._method(self._request, metadata=self._metadata)
yield self._response
def __iter__(self) -> Iterator[resources.ImportJob]:
for page in self.pages:
yield from page.import_jobs
def __repr__(self) -> str:
return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
class ListImportJobsAsyncPager:
"""A pager for iterating through ``list_import_jobs`` requests.
This class thinly wraps an initial
:class:`google.cloud.kms_v1.types.ListImportJobsResponse` object, and
provides an ``__aiter__`` method to iterate through its
``import_jobs`` field.
If there are more pages, the ``__aiter__`` method will make additional
``ListImportJobs`` requests and continue to iterate
through the ``import_jobs`` field on the
corresponding responses.
All the usual :class:`google.cloud.kms_v1.types.ListImportJobsResponse`
attributes are available on the pager. If multiple requests are made, only
the most recent response is retained, and thus used for attribute lookup.
"""
def __init__(
self,
method: Callable[..., Awaitable[service.ListImportJobsResponse]],
request: service.ListImportJobsRequest,
response: service.ListImportJobsResponse,
*,
metadata: Sequence[Tuple[str, str]] = ()
):
"""Instantiates the pager.
Args:
method (Callable): The method that was originally called, and
which instantiated this pager.
request (google.cloud.kms_v1.types.ListImportJobsRequest):
The initial request object.
response (google.cloud.kms_v1.types.ListImportJobsResponse):
The initial response object.
metadata (Sequence[Tuple[str, str]]): Strings which should be
sent along with the request as metadata.
"""
self._method = method
self._request = service.ListImportJobsRequest(request)
self._response = response
self._metadata = metadata
def __getattr__(self, name: str) -> Any:
return getattr(self._response, name)
@property
async def pages(self) -> AsyncIterator[service.ListImportJobsResponse]:
yield self._response
while self._response.next_page_token:
self._request.page_token = self._response.next_page_token
self._response = await self._method(self._request, metadata=self._metadata)
yield self._response
def __aiter__(self) -> AsyncIterator[resources.ImportJob]:
async def async_generator():
async for page in self.pages:
for response in page.import_jobs:
yield response
return async_generator()
def __repr__(self) -> str:
return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
| 38.292052 | 87 | 0.6676 | 2,280 | 20,716 | 5.84386 | 0.083772 | 0.05764 | 0.035725 | 0.040829 | 0.934179 | 0.934179 | 0.934179 | 0.929526 | 0.924872 | 0.919319 | 0 | 0.003789 | 0.248262 | 20,716 | 540 | 88 | 38.362963 | 0.851795 | 0.459741 | 0 | 0.783333 | 0 | 0 | 0.007955 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15 | false | 0 | 0.079167 | 0.066667 | 0.345833 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# File: loldib/getratings/models/NA/na_evelynn/na_evelynn_jng.py (repo: koliupy/loldib, license: Apache-2.0)

from getratings.models.ratings import Ratings
class NA_Evelynn_Jng_Aatrox(Ratings):
    pass

class NA_Evelynn_Jng_Ahri(Ratings):
    pass

class NA_Evelynn_Jng_Akali(Ratings):
    pass

class NA_Evelynn_Jng_Alistar(Ratings):
    pass

class NA_Evelynn_Jng_Amumu(Ratings):
    pass

class NA_Evelynn_Jng_Anivia(Ratings):
    pass

class NA_Evelynn_Jng_Annie(Ratings):
    pass

class NA_Evelynn_Jng_Ashe(Ratings):
    pass

class NA_Evelynn_Jng_AurelionSol(Ratings):
    pass

class NA_Evelynn_Jng_Azir(Ratings):
    pass

class NA_Evelynn_Jng_Bard(Ratings):
    pass

class NA_Evelynn_Jng_Blitzcrank(Ratings):
    pass

class NA_Evelynn_Jng_Brand(Ratings):
    pass

class NA_Evelynn_Jng_Braum(Ratings):
    pass

class NA_Evelynn_Jng_Caitlyn(Ratings):
    pass

class NA_Evelynn_Jng_Camille(Ratings):
    pass

class NA_Evelynn_Jng_Cassiopeia(Ratings):
    pass

class NA_Evelynn_Jng_Chogath(Ratings):
    pass

class NA_Evelynn_Jng_Corki(Ratings):
    pass

class NA_Evelynn_Jng_Darius(Ratings):
    pass

class NA_Evelynn_Jng_Diana(Ratings):
    pass

class NA_Evelynn_Jng_Draven(Ratings):
    pass

class NA_Evelynn_Jng_DrMundo(Ratings):
    pass

class NA_Evelynn_Jng_Ekko(Ratings):
    pass

class NA_Evelynn_Jng_Elise(Ratings):
    pass

class NA_Evelynn_Jng_Evelynn(Ratings):
    pass

class NA_Evelynn_Jng_Ezreal(Ratings):
    pass

class NA_Evelynn_Jng_Fiddlesticks(Ratings):
    pass

class NA_Evelynn_Jng_Fiora(Ratings):
    pass

class NA_Evelynn_Jng_Fizz(Ratings):
    pass

class NA_Evelynn_Jng_Galio(Ratings):
    pass

class NA_Evelynn_Jng_Gangplank(Ratings):
    pass

class NA_Evelynn_Jng_Garen(Ratings):
    pass

class NA_Evelynn_Jng_Gnar(Ratings):
    pass

class NA_Evelynn_Jng_Gragas(Ratings):
    pass

class NA_Evelynn_Jng_Graves(Ratings):
    pass

class NA_Evelynn_Jng_Hecarim(Ratings):
    pass

class NA_Evelynn_Jng_Heimerdinger(Ratings):
    pass

class NA_Evelynn_Jng_Illaoi(Ratings):
    pass

class NA_Evelynn_Jng_Irelia(Ratings):
    pass

class NA_Evelynn_Jng_Ivern(Ratings):
    pass

class NA_Evelynn_Jng_Janna(Ratings):
    pass

class NA_Evelynn_Jng_JarvanIV(Ratings):
    pass

class NA_Evelynn_Jng_Jax(Ratings):
    pass

class NA_Evelynn_Jng_Jayce(Ratings):
    pass

class NA_Evelynn_Jng_Jhin(Ratings):
    pass

class NA_Evelynn_Jng_Jinx(Ratings):
    pass

class NA_Evelynn_Jng_Kalista(Ratings):
    pass

class NA_Evelynn_Jng_Karma(Ratings):
    pass

class NA_Evelynn_Jng_Karthus(Ratings):
    pass

class NA_Evelynn_Jng_Kassadin(Ratings):
    pass

class NA_Evelynn_Jng_Katarina(Ratings):
    pass

class NA_Evelynn_Jng_Kayle(Ratings):
    pass

class NA_Evelynn_Jng_Kayn(Ratings):
    pass

class NA_Evelynn_Jng_Kennen(Ratings):
    pass

class NA_Evelynn_Jng_Khazix(Ratings):
    pass

class NA_Evelynn_Jng_Kindred(Ratings):
    pass

class NA_Evelynn_Jng_Kled(Ratings):
    pass

class NA_Evelynn_Jng_KogMaw(Ratings):
    pass

class NA_Evelynn_Jng_Leblanc(Ratings):
    pass

class NA_Evelynn_Jng_LeeSin(Ratings):
    pass

class NA_Evelynn_Jng_Leona(Ratings):
    pass

class NA_Evelynn_Jng_Lissandra(Ratings):
    pass

class NA_Evelynn_Jng_Lucian(Ratings):
    pass

class NA_Evelynn_Jng_Lulu(Ratings):
    pass

class NA_Evelynn_Jng_Lux(Ratings):
    pass

class NA_Evelynn_Jng_Malphite(Ratings):
    pass

class NA_Evelynn_Jng_Malzahar(Ratings):
    pass

class NA_Evelynn_Jng_Maokai(Ratings):
    pass

class NA_Evelynn_Jng_MasterYi(Ratings):
    pass

class NA_Evelynn_Jng_MissFortune(Ratings):
    pass

class NA_Evelynn_Jng_MonkeyKing(Ratings):
    pass

class NA_Evelynn_Jng_Mordekaiser(Ratings):
    pass

class NA_Evelynn_Jng_Morgana(Ratings):
    pass

class NA_Evelynn_Jng_Nami(Ratings):
    pass

class NA_Evelynn_Jng_Nasus(Ratings):
    pass

class NA_Evelynn_Jng_Nautilus(Ratings):
    pass

class NA_Evelynn_Jng_Nidalee(Ratings):
    pass

class NA_Evelynn_Jng_Nocturne(Ratings):
    pass

class NA_Evelynn_Jng_Nunu(Ratings):
    pass

class NA_Evelynn_Jng_Olaf(Ratings):
    pass

class NA_Evelynn_Jng_Orianna(Ratings):
    pass

class NA_Evelynn_Jng_Ornn(Ratings):
    pass

class NA_Evelynn_Jng_Pantheon(Ratings):
    pass

class NA_Evelynn_Jng_Poppy(Ratings):
    pass

class NA_Evelynn_Jng_Quinn(Ratings):
    pass

class NA_Evelynn_Jng_Rakan(Ratings):
    pass

class NA_Evelynn_Jng_Rammus(Ratings):
    pass

class NA_Evelynn_Jng_RekSai(Ratings):
    pass

class NA_Evelynn_Jng_Renekton(Ratings):
    pass

class NA_Evelynn_Jng_Rengar(Ratings):
    pass

class NA_Evelynn_Jng_Riven(Ratings):
    pass

class NA_Evelynn_Jng_Rumble(Ratings):
    pass

class NA_Evelynn_Jng_Ryze(Ratings):
    pass

class NA_Evelynn_Jng_Sejuani(Ratings):
    pass

class NA_Evelynn_Jng_Shaco(Ratings):
    pass

class NA_Evelynn_Jng_Shen(Ratings):
    pass

class NA_Evelynn_Jng_Shyvana(Ratings):
    pass

class NA_Evelynn_Jng_Singed(Ratings):
    pass

class NA_Evelynn_Jng_Sion(Ratings):
    pass

class NA_Evelynn_Jng_Sivir(Ratings):
    pass

class NA_Evelynn_Jng_Skarner(Ratings):
    pass

class NA_Evelynn_Jng_Sona(Ratings):
    pass

class NA_Evelynn_Jng_Soraka(Ratings):
    pass

class NA_Evelynn_Jng_Swain(Ratings):
    pass

class NA_Evelynn_Jng_Syndra(Ratings):
    pass

class NA_Evelynn_Jng_TahmKench(Ratings):
    pass

class NA_Evelynn_Jng_Taliyah(Ratings):
    pass

class NA_Evelynn_Jng_Talon(Ratings):
    pass

class NA_Evelynn_Jng_Taric(Ratings):
    pass

class NA_Evelynn_Jng_Teemo(Ratings):
    pass

class NA_Evelynn_Jng_Thresh(Ratings):
    pass

class NA_Evelynn_Jng_Tristana(Ratings):
    pass

class NA_Evelynn_Jng_Trundle(Ratings):
    pass

class NA_Evelynn_Jng_Tryndamere(Ratings):
    pass

class NA_Evelynn_Jng_TwistedFate(Ratings):
    pass

class NA_Evelynn_Jng_Twitch(Ratings):
    pass

class NA_Evelynn_Jng_Udyr(Ratings):
    pass

class NA_Evelynn_Jng_Urgot(Ratings):
    pass

class NA_Evelynn_Jng_Varus(Ratings):
    pass

class NA_Evelynn_Jng_Vayne(Ratings):
    pass

class NA_Evelynn_Jng_Veigar(Ratings):
    pass

class NA_Evelynn_Jng_Velkoz(Ratings):
    pass

class NA_Evelynn_Jng_Vi(Ratings):
    pass

class NA_Evelynn_Jng_Viktor(Ratings):
    pass

class NA_Evelynn_Jng_Vladimir(Ratings):
    pass

class NA_Evelynn_Jng_Volibear(Ratings):
    pass

class NA_Evelynn_Jng_Warwick(Ratings):
    pass

class NA_Evelynn_Jng_Xayah(Ratings):
    pass

class NA_Evelynn_Jng_Xerath(Ratings):
    pass

class NA_Evelynn_Jng_XinZhao(Ratings):
    pass

class NA_Evelynn_Jng_Yasuo(Ratings):
    pass

class NA_Evelynn_Jng_Yorick(Ratings):
    pass

class NA_Evelynn_Jng_Zac(Ratings):
    pass

class NA_Evelynn_Jng_Zed(Ratings):
    pass

class NA_Evelynn_Jng_Ziggs(Ratings):
    pass

class NA_Evelynn_Jng_Zilean(Ratings):
    pass

class NA_Evelynn_Jng_Zyra(Ratings):
    pass
# File: instance/config.py (repo: Rustique-Uwimpaye/News-App, license: MIT)

API_KEY="4010d6ff152744d5853fbb6953696c1f"
# File: generate.py (repo: joaor96/BLADE, license: Apache-2.0)

# -*- coding: utf-8 -*-
"""
Created on Tue Oct 1 10:26:50 2019
@author: joaor
"""
import numpy as np
import pandas as pd

n_instances = 400
n_time_points = 5
def generate_binomial_1(n_instances, n_time_points):
    n_features = 2
    data = np.zeros([n_instances, n_features*n_time_points])
    data[:,0] = np.random.binomial(1, 0.5, n_instances)
    labels = np.zeros([n_instances, 1])
    for i in range(0, n_instances):
        labels[i] = np.random.binomial(1, 0.5, 1)
        # LABEL 0
        if labels[i] == 0:
            if data[i,0] == 0:
                data[i,1] = np.random.binomial(1, 0.1, 1)
            else:
                data[i,1] = np.random.binomial(1, 0.9, 1)
            for t in range(n_time_points-1):
                if data[i,t*n_features] == 0 and data[i,t*n_features+1] == 0:
                    data[i,t*n_features+2] = np.random.binomial(1, 0.1, 1)
                    data[i,t*n_features+3] = np.random.binomial(1, 0.1, 1)
                elif data[i,t*n_features] == 1 and data[i,t*n_features+1] == 1:
                    data[i,t*n_features+2] = np.random.binomial(1, 0.9, 1)
                    data[i,t*n_features+3] = np.random.binomial(1, 0.9, 1)
                else:
                    data[i,t*n_features+2] = np.random.binomial(1, 0.5, 1)
                    data[i,t*n_features+3] = np.random.binomial(1, 0.5, 1)
        # LABEL 1
        elif labels[i] == 1:
            if data[i,0] == 0:
                data[i,1] = np.random.binomial(1, 0.1, 1)
            else:
                data[i,1] = np.random.binomial(1, 0.9, 1)
            for t in range(n_time_points-1):
                if data[i,t*n_features] == 0 and data[i,t*n_features+1] == 0:
                    data[i,t*n_features+2] = np.random.binomial(1, 0.9, 1)
                    data[i,t*n_features+3] = np.random.binomial(1, 0.9, 1)
                elif data[i,t*n_features] == 1 and data[i,t*n_features+1] == 1:
                    data[i,t*n_features+2] = np.random.binomial(1, 0.1, 1)
                    data[i,t*n_features+3] = np.random.binomial(1, 0.1, 1)
                else:
                    data[i,t*n_features+2] = np.random.binomial(1, 0.5, 1)
                    data[i,t*n_features+3] = np.random.binomial(1, 0.5, 1)

    col = []
    for t in range(n_time_points):
        for f in range(n_features):
            col.append("X"+str(f)+"__"+str(t))

    df = pd.DataFrame(data=data,  # values
                      index=list(range(n_instances)),  # 1st column as index
                      columns=col)
    df.index.name = 'subject_id'

    labels_df = pd.DataFrame(data=labels,  # values
                             index=list(range(n_instances)),  # 1st column as index
                             columns=['label'])
    labels_df.index.name = 'subject_id'

    df.to_csv('binomial_1_'+str(n_time_points)+'_parsed.csv', quoting=1)
    labels_df.to_csv('binomial_1_'+str(n_time_points)+'_target.csv', quoting=1)
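# ---------------------------------------------------------------------------
# Illustrative check (not in the original script): the conditional-Bernoulli
# pattern used throughout these generators, in miniature. A child variable
# copies its parent with probability `p_same`, which is what induces the
# label-dependent correlations in the sampled trajectories. `sample_child`
# and its arguments are invented here for illustration only.
import numpy as np


def sample_child(parent, p_same=0.9, rng=None):
    """Return a bit equal to `parent` with probability `p_same`."""
    rng = rng if rng is not None else np.random.default_rng(0)
    return parent if rng.binomial(1, p_same) else 1 - parent
# ---------------------------------------------------------------------------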
def generate_binomial_2(n_instances, n_time_points):
    n_features = 5
    data = np.zeros([n_instances, n_features*n_time_points])
    data[:,0] = np.random.binomial(1, 0.5, n_instances)
    data[:,1] = np.random.binomial(1, 0.5, n_instances)
    labels = np.zeros([n_instances, 1])
    for i in range(0, n_instances):
        labels[i] = np.random.binomial(1, 0.5, 1)
        # LABEL 0
        if labels[i] == 0:
            if data[i,1] == 0:
                data[i,2] = np.random.binomial(1, 0.9, 1)
                data[i,3] = np.random.binomial(1, 0.1, 1)
            else:
                data[i,2] = np.random.binomial(1, 0.1, 1)
                data[i,3] = np.random.binomial(1, 0.9, 1)
            if data[i,2] == 0 and data[i,3] == 1:
                data[i,4] = np.random.binomial(1, 0.1, 1)
            elif data[i,2] == 1 and data[i,3] == 0:
                data[i,4] = np.random.binomial(1, 0.9, 1)
            else:
                data[i,4] = np.random.binomial(1, 0.5, 1)
            for t in range(n_time_points-1):
                if data[i,t*n_features] == 0:
                    data[i,t*n_features+5] = np.random.binomial(1, 0.7, 1)
                else:
                    data[i,t*n_features+5] = np.random.binomial(1, 0.3, 1)
                if data[i,t*n_features+5] == 0:
                    data[i,t*n_features+6] = np.random.binomial(1, 0.1, 1)
                else:
                    data[i,t*n_features+6] = np.random.binomial(1, 0.9, 1)
                if data[i,t*n_features+6] == 0:
                    data[i,t*n_features+7] = np.random.binomial(1, 0.9, 1)
                    data[i,t*n_features+8] = np.random.binomial(1, 0.1, 1)
                else:
                    data[i,t*n_features+7] = np.random.binomial(1, 0.1, 1)
                    data[i,t*n_features+8] = np.random.binomial(1, 0.9, 1)
                if data[i,t*n_features+7] == 0 and data[i,t*n_features+8] == 1:
                    data[i,t*n_features+9] = np.random.binomial(1, 0.1, 1)
                elif data[i,t*n_features+7] == 1 and data[i,t*n_features+8] == 0:
                    data[i,t*n_features+9] = np.random.binomial(1, 0.9, 1)
                else:
                    data[i,t*n_features+9] = np.random.binomial(1, 0.5, 1)
        # LABEL 1
        elif labels[i] == 1:
            if data[i,1] == 0:
                data[i,2] = np.random.binomial(1, 0.1, 1)
                data[i,4] = np.random.binomial(1, 0.9, 1)
            else:
                data[i,2] = np.random.binomial(1, 0.9, 1)
                data[i,4] = np.random.binomial(1, 0.1, 1)
            if data[i,2] == 1 and data[i,4] == 0:
                data[i,3] = np.random.binomial(1, 0.1, 1)
            elif data[i,2] == 0 and data[i,4] == 1:
                data[i,3] = np.random.binomial(1, 0.9, 1)
            else:
                data[i,3] = np.random.binomial(1, 0.5, 1)
            for t in range(n_time_points-1):
                if data[i,t*n_features] == 0:
                    data[i,t*n_features+5] = np.random.binomial(1, 0.3, 1)
                else:
                    data[i,t*n_features+5] = np.random.binomial(1, 0.7, 1)
                if data[i,t*n_features+5] == 0:
                    data[i,t*n_features+6] = np.random.binomial(1, 0.1, 1)
                else:
                    data[i,t*n_features+6] = np.random.binomial(1, 0.9, 1)
                if data[i,t*n_features+6] == 0:
                    data[i,t*n_features+7] = np.random.binomial(1, 0.1, 1)
                    data[i,t*n_features+9] = np.random.binomial(1, 0.9, 1)
                else:
                    data[i,t*n_features+7] = np.random.binomial(1, 0.9, 1)
                    data[i,t*n_features+9] = np.random.binomial(1, 0.1, 1)
                if data[i,t*n_features+7] == 1 and data[i,t*n_features+9] == 0:
                    data[i,t*n_features+8] = np.random.binomial(1, 0.1, 1)
                elif data[i,t*n_features+7] == 0 and data[i,t*n_features+9] == 1:
                    data[i,t*n_features+8] = np.random.binomial(1, 0.9, 1)
                else:
                    data[i,t*n_features+8] = np.random.binomial(1, 0.5, 1)

    col = []
    for t in range(n_time_points):
        for f in range(n_features):
            col.append("X"+str(f)+"__"+str(t))

    df = pd.DataFrame(data=data,  # values
                      index=list(range(n_instances)),  # 1st column as index
                      columns=col)
    df.index.name = 'subject_id'
    for t in range(n_time_points):
        df.drop(columns=["X0__"+str(t)], inplace=True)

    labels_df = pd.DataFrame(data=labels,  # values
                             index=list(range(n_instances)),  # 1st column as index
                             columns=['label'])
    labels_df.index.name = 'subject_id'

    df.to_csv('binomial_2_'+str(n_time_points)+'_parsed.csv', quoting=1)
    labels_df.to_csv('binomial_2_'+str(n_time_points)+'_target.csv', quoting=1)
def generate_binomial_3(n_instances, n_time_points):
    n_features = 5
    data = np.zeros([n_instances, n_features*n_time_points])
    data[:,0] = np.random.binomial(1, 0.5, n_instances)
    data[:,1] = np.random.binomial(1, 0.5, n_instances)
    labels = np.zeros([n_instances, 1])
    for i in range(0, n_instances):
        labels[i] = np.random.binomial(1, 0.5, 1)
        # LABEL 0
        if labels[i] == 0:
            if data[i,0] == 0:
                data[i,2] = np.random.binomial(1, 0.9, 1)
                data[i,3] = np.random.binomial(1, 0.7, 1)
            else:
                data[i,2] = np.random.binomial(1, 0.1, 1)
                data[i,3] = np.random.binomial(1, 0.3, 1)
            if data[i,1] == 0:
                data[i,4] = np.random.binomial(1, 0.9, 1)
            else:
                data[i,4] = np.random.binomial(1, 0.1, 1)
            for t in range(n_time_points-1):
                if data[i,t*n_features] == 0:
                    data[i,t*n_features+5] = np.random.binomial(1, 0.9, 1)
                else:
                    data[i,t*n_features+5] = np.random.binomial(1, 0.1, 1)
                if data[i,t*n_features+5] == 0:
                    data[i,t*n_features+7] = np.random.binomial(1, 0.9, 1)
                    data[i,t*n_features+8] = np.random.binomial(1, 0.7, 1)
                else:
                    data[i,t*n_features+7] = np.random.binomial(1, 0.1, 1)
                    data[i,t*n_features+8] = np.random.binomial(1, 0.3, 1)
                if data[i,t*n_features+6] == 0:
                    data[i,t*n_features+9] = np.random.binomial(1, 0.9, 1)
                else:
                    data[i,t*n_features+9] = np.random.binomial(1, 0.1, 1)
        # LABEL 1
        elif labels[i] == 1:
            if data[i,0] == 0:
                data[i,2] = np.random.binomial(1, 0.1, 1)
                data[i,4] = np.random.binomial(1, 0.7, 1)
            else:
                data[i,2] = np.random.binomial(1, 0.9, 1)
                data[i,4] = np.random.binomial(1, 0.3, 1)
            if data[i,1] == 0:
                data[i,3] = np.random.binomial(1, 0.1, 1)
            else:
                data[i,3] = np.random.binomial(1, 0.9, 1)
            for t in range(n_time_points-1):
                if data[i,t*n_features] == 0:
                    data[i,t*n_features+5] = np.random.binomial(1, 0.9, 1)
                else:
                    data[i,t*n_features+5] = np.random.binomial(1, 0.1, 1)
                if data[i,t*n_features+1] == 0:
                    data[i,t*n_features+6] = np.random.binomial(1, 0.6, 1)
                else:
                    data[i,t*n_features+6] = np.random.binomial(1, 0.4, 1)
                if data[i,t*n_features+5] == 0:
                    data[i,t*n_features+7] = np.random.binomial(1, 0.1, 1)
                    data[i,t*n_features+9] = np.random.binomial(1, 0.7, 1)
                else:
                    data[i,t*n_features+7] = np.random.binomial(1, 0.9, 1)
                    data[i,t*n_features+9] = np.random.binomial(1, 0.3, 1)
                if data[i,t*n_features+6] == 0:
                    data[i,t*n_features+8] = np.random.binomial(1, 0.1, 1)
                else:
                    data[i,t*n_features+8] = np.random.binomial(1, 0.9, 1)

    col = []
    for t in range(n_time_points):
        for f in range(n_features):
            col.append("X"+str(f)+"__"+str(t))

    df = pd.DataFrame(data=data,  # values
                      index=list(range(n_instances)),  # 1st column as index
                      columns=col)
    df.index.name = 'subject_id'
    for t in range(n_time_points):
        df.drop(columns=["X0__"+str(t)], inplace=True)
        df.drop(columns=["X1__"+str(t)], inplace=True)

    labels_df = pd.DataFrame(data=labels,  # values
                             index=list(range(n_instances)),  # 1st column as index
                             columns=['label'])
    labels_df.index.name = 'subject_id'

    df.to_csv('binomial_3_'+str(n_time_points)+'_parsed.csv', quoting=1)
    labels_df.to_csv('binomial_3_'+str(n_time_points)+'_target.csv', quoting=1)
def generate_multinomial_1(n_instances, n_time_points):
    n_features = 3
    values = np.arange(3)
    data = np.zeros([n_instances, n_features*n_time_points])
    uniform = np.ones(len(values))/len(values)
    data[:,0] = np.random.choice(values, p=uniform, size=n_instances)
    labels = np.zeros([n_instances, 1])
    for i in range(0, n_instances):
        labels[i] = np.random.binomial(1, 0.5, 1)
        # LABEL 0
        if labels[i] == 0:
            if data[i,0] == 2:
                data[i,1] = np.random.choice(values, p=[0.9,0.05,0.05])
            elif data[i,0] == 0:
                data[i,1] = np.random.choice(values, p=[0.05,0.05,0.9])
            else:
                data[i,1] = np.random.choice(values, p=[0.05,0.9,0.05])
            if data[i,0] == 2:
                data[i,2] = np.random.choice(values, p=uniform)
            elif data[i,0] == 0:
                data[i,2] = np.random.choice(values, p=uniform)
            else:
                data[i,2] = np.random.choice(values, p=uniform)
            # THIS FOR TIME SLICE
            for t in range(n_time_points-1):
                if data[i,t*n_features] == 2 and data[i,t*n_features+1] == 0:
                    data[i,t*n_features+3] = np.random.choice(values, p=[0.9,0.05,0.05])
                    data[i,t*n_features+4] = np.random.choice(values, p=[0.05,0.05,0.9])
                elif data[i,t*n_features] == 0 and data[i,t*n_features+1] == 2:
                    data[i,t*n_features+3] = np.random.choice(values, p=[0.05,0.9,0.05])
                    data[i,t*n_features+4] = np.random.choice(values, p=[0.05,0.9,0.05])
                elif data[i,t*n_features] == 1 and data[i,t*n_features+1] == 1:
                    data[i,t*n_features+3] = np.random.choice(values, p=[0.05,0.05,0.9])
                    data[i,t*n_features+4] = np.random.choice(values, p=[0.9,0.05,0.05])
                else:
                    data[i,t*n_features+3] = np.random.choice(values, p=uniform)
                    data[i,t*n_features+4] = np.random.choice(values, p=uniform)
                if data[i,t*n_features+3] == 2:
                    data[i,t*n_features+5] = np.random.choice(values, p=uniform)
                elif data[i,t*n_features+3] == 0:
                    data[i,t*n_features+5] = np.random.choice(values, p=uniform)
                else:
                    data[i,t*n_features+5] = np.random.choice(values, p=uniform)
        # LABEL 1
        elif labels[i] == 1:
            if data[i,0] == 2:
                data[i,2] = np.random.choice(values, p=[0.9,0.05,0.05])
            elif data[i,0] == 0:
                data[i,2] = np.random.choice(values, p=[0.05,0.05,0.9])
            else:
                data[i,2] = np.random.choice(values, p=[0.05,0.9,0.05])
            if data[i,0] == 2:
                data[i,1] = np.random.choice(values, p=uniform)
            elif data[i,0] == 0:
                data[i,1] = np.random.choice(values, p=uniform)
            else:
                data[i,1] = np.random.choice(values, p=uniform)
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                if data[i,t*n_features] == 2 and data[i,t*n_features+2] == 0:
                    data[i,t*n_features+3] = np.random.choice(values, p=[0.9,0.05,0.05])
                    data[i,t*n_features+5] = np.random.choice(values, p=[0.05,0.05,0.9])
                elif data[i,t*n_features+0] == 0 and data[i,t*n_features+2] == 2:
                    data[i,t*n_features+3] = np.random.choice(values, p=[0.05,0.9,0.05])
                    data[i,t*n_features+5] = np.random.choice(values, p=[0.05,0.9,0.05])
                elif data[i,t*n_features] == 1 and data[i,t*n_features+2] == 1:
                    data[i,t*n_features+3] = np.random.choice(values, p=[0.05,0.05,0.9])
                    data[i,t*n_features+5] = np.random.choice(values, p=[0.9,0.05,0.05])
                else:
                    data[i,t*n_features+3] = np.random.choice(values, p=uniform)
                    data[i,t*n_features+5] = np.random.choice(values, p=uniform)
                if data[i,t*n_features+3] == 2:
                    data[i,t*n_features+4] = np.random.choice(values, p=uniform)
                elif data[i,t*n_features+4] == 0:
                    data[i,t*n_features+4] = np.random.choice(values, p=uniform)
                else:
                    data[i,t*n_features+4] = np.random.choice(values, p=uniform)

    col = []
    for t in range(n_time_points):
        for f in range(n_features):
            col.append("X"+str(f)+"__"+str(t))

    df = pd.DataFrame(data=data,  # values
                      index=list(range(n_instances)),  # 1st column as index
                      columns=col)
    df.index.name = 'subject_id'

    labels_df = pd.DataFrame(data=labels,  # values
                             index=list(range(n_instances)),  # 1st column as index
                             columns=['label'])
    labels_df.index.name = 'subject_id'

    df.to_csv('multinomial_1_'+str(n_time_points)+'_parsed.csv', quoting=1)
    labels_df.to_csv('multinomial_1_'+str(n_time_points)+'_target.csv', quoting=1)
def generate_multinomial_2(n_instances, n_time_points):
    n_features = 4
    values = np.arange(3)
    data = np.zeros([n_instances, n_features*n_time_points])
    uniform = np.ones(len(values))/len(values)
    labels = np.zeros([n_instances, 1])
    for i in range(0, n_instances):
        labels[i] = np.random.binomial(1, 0.5, 1)
        # LABEL 0
        if labels[i] == 0:
            data[i,0] = np.random.choice(values, p=uniform)
            if data[i,0] == 2:
                data[i,2] = np.random.choice(values, p=[0.9,0.05,0.05])
            elif data[i,0] == 0:
                data[i,2] = np.random.choice(values, p=[0.05,0.05,0.9])
            else:
                data[i,2] = np.random.choice(values, p=uniform)
            data[i,1] = np.random.choice(values, p=uniform)
            data[i,3] = np.random.choice(values, p=uniform)
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                if data[i,t*n_features] == 2 and data[i,t*n_features+2] == 0:
                    data[i,t*n_features+4] = np.random.choice(values, p=[0.9,0.05,0.05])
                    data[i,t*n_features+6] = np.random.choice(values, p=[0.05,0.05,0.9])
                elif data[i,t*n_features] == 0 and data[i,t*n_features+2] == 2:
                    data[i,t*n_features+4] = np.random.choice(values, p=[0.05,0.05,0.9])
                    data[i,t*n_features+6] = np.random.choice(values, p=[0.9,0.05,0.05])
                else:
                    data[i,t*n_features+4] = np.random.choice(values, p=uniform)
                    data[i,t*n_features+6] = np.random.choice(values, p=uniform)
                data[i,t*n_features+5] = np.random.choice(values, p=uniform)
                data[i,t*n_features+7] = np.random.choice(values, p=uniform)
        # LABEL 1
        elif labels[i] == 1:
            data[i,1] = np.random.choice(values, p=uniform)
            if data[i,1] == 2:
                data[i,3] = np.random.choice(values, p=[0.9,0.05,0.05])
            elif data[i,1] == 0:
                data[i,3] = np.random.choice(values, p=[0.05,0.05,0.9])
            else:
                data[i,3] = np.random.choice(values, p=uniform)
            data[i,0] = np.random.choice(values, p=uniform)
            data[i,2] = np.random.choice(values, p=uniform)
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                if data[i,t*n_features+1] == 2 and data[i,t*n_features+3] == 0:
                    data[i,t*n_features+5] = np.random.choice(values, p=[0.9,0.05,0.05])
                    data[i,t*n_features+7] = np.random.choice(values, p=[0.05,0.05,0.9])
                elif data[i,t*n_features+1] == 0 and data[i,t*n_features+3] == 2:
                    data[i,t*n_features+5] = np.random.choice(values, p=[0.05,0.05,0.9])
                    data[i,t*n_features+7] = np.random.choice(values, p=[0.9,0.05,0.05])
                else:
                    data[i,t*n_features+5] = np.random.choice(values, p=uniform)
                    data[i,t*n_features+7] = np.random.choice(values, p=uniform)
                data[i,t*n_features+4] = np.random.choice(values, p=uniform)
                data[i,t*n_features+6] = np.random.choice(values, p=uniform)

    col = []
    for t in range(n_time_points):
        for f in range(n_features):
            col.append("X"+str(f)+"__"+str(t))

    df = pd.DataFrame(data=data,  # values
                      index=list(range(n_instances)),  # 1st column as index
                      columns=col)
    df.index.name = 'subject_id'

    labels_df = pd.DataFrame(data=labels,  # values
                             index=list(range(n_instances)),  # 1st column as index
                             columns=['label'])
    labels_df.index.name = 'subject_id'

    df.to_csv('multinomial_2_'+str(n_time_points)+'_parsed.csv', quoting=1)
    labels_df.to_csv('multinomial_2_'+str(n_time_points)+'_target.csv', quoting=1)
def generate_multiclass(n_instances, n_time_points):
    n_features = 10
    n_values = 4
    values = np.arange(n_values)
    classes = np.arange(6)
    data = np.zeros([n_instances, n_features*n_time_points])
    uniform = np.ones(n_values)/n_values
    uniform_class = np.ones(len(classes))/len(classes)
    for i in range(n_instances):
        for j in range(n_features*n_time_points):
            data[i,j] = np.random.choice(values, p=uniform)
    labels = np.zeros([n_instances, 1])
    for i in range(0, n_instances):
        labels[i] = np.random.choice(classes, p=uniform_class)
        # LABEL 0
        if labels[i] == 0:
            data[i,0] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            data[i,1] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            data[i,2] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
            data[i,3] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                data[i,t*n_features+n_features+0] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
                data[i,t*n_features+n_features+1] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
                data[i,t*n_features+n_features+2] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
                data[i,t*n_features+n_features+3] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
        # LABEL 1
        elif labels[i] == 1:
            data[i,0] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            data[i,1] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            data[i,2] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            data[i,3] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                data[i,t*n_features+n_features+0] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
                data[i,t*n_features+n_features+1] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
                data[i,t*n_features+n_features+2] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
                data[i,t*n_features+n_features+3] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
        # LABEL 2
        elif labels[i] == 2:
            data[i,2] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            data[i,3] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            data[i,4] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            data[i,5] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                data[i,t*n_features+n_features+2] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
                data[i,t*n_features+n_features+3] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
                data[i,t*n_features+n_features+4] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
                data[i,t*n_features+n_features+5] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
        # LABEL 3
        elif labels[i] == 3:
            data[i,2] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            data[i,3] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            data[i,4] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            data[i,5] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                data[i,t*n_features+n_features+2] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
                data[i,t*n_features+n_features+3] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
                data[i,t*n_features+n_features+4] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
                data[i,t*n_features+n_features+5] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
        # LABEL 4
        elif labels[i] == 4:
            data[i,4] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            data[i,5] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            data[i,6] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
            data[i,7] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                data[i,t*n_features+n_features+4] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
                data[i,t*n_features+n_features+5] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
                data[i,t*n_features+n_features+6] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
                data[i,t*n_features+n_features+7] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
        # LABEL 5
        elif labels[i] == 5:
            data[i,4] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            data[i,5] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            data[i,6] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            data[i,7] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                data[i,t*n_features+n_features+4] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
                data[i,t*n_features+n_features+5] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
                data[i,t*n_features+n_features+6] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
                data[i,t*n_features+n_features+7] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
        # LABEL 6
        elif labels[i] == 6:
            data[i,6] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            data[i,7] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            data[i,8] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
            data[i,9] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                data[i,t*n_features+n_features+6] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
                data[i,t*n_features+n_features+7] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
                data[i,t*n_features+n_features+8] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
                data[i,t*n_features+n_features+9] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
        # LABEL 7
        elif labels[i] == 7:
            data[i,7] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            data[i,6] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            data[i,8] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            data[i,9] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                data[i,t*n_features+n_features+6] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
                data[i,t*n_features+n_features+7] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
                data[i,t*n_features+n_features+8] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
                data[i,t*n_features+n_features+9] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
        # LABEL 8
        elif labels[i] == 8:
            data[i,0] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            data[i,1] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            data[i,8] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            data[i,9] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                data[i,t*n_features+n_features+0] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
                data[i,t*n_features+n_features+1] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
                data[i,t*n_features+n_features+8] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
                data[i,t*n_features+n_features+9] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
        # LABEL 9
        elif labels[i] == 9:
            data[i,0] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
            data[i,1] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
            data[i,8] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
            data[i,9] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])
            # THIS FOR TIME SLICE 1
            for t in range(n_time_points-1):
                data[i,t*n_features+n_features+0] = np.random.choice(values, p=[0.05,0.05,0.05,0.85])
                data[i,t*n_features+n_features+1] = np.random.choice(values, p=[0.85,0.05,0.05,0.05])
                data[i,t*n_features+n_features+8] = np.random.choice(values, p=[0.05,0.05,0.85,0.05])
                data[i,t*n_features+n_features+9] = np.random.choice(values, p=[0.05,0.85,0.05,0.05])

    col = []
    for t in range(n_time_points):
        for f in range(n_features):
            col.append("X"+str(f)+"__"+str(t))

    df = pd.DataFrame(data=data,  # values
                      index=list(range(n_instances)),  # 1st column as index
                      columns=col)
    df.index.name = 'subject_id'

    labels_df = pd.DataFrame(data=labels,  # values
                             index=list(range(n_instances)),  # 1st column as index
                             columns=['label'])
    labels_df.index.name = 'subject_id'

    df.to_csv('multiclass_'+str(len(classes))+'_parsed.csv', quoting=1)
    labels_df.to_csv('multiclass_'+str(len(classes))+'_target.csv', quoting=1)
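# ---------------------------------------------------------------------------
# Illustrative check (not in the original script): the flat column layout
# shared by all generators above. Feature f at time slice t is stored in
# column t*n_features + f and named "X{f}__{t}". `column_names` is a helper
# invented here for illustration.
def column_names(n_features, n_time_points):
    """Reproduce the "Xf__t" naming loop used by every generator."""
    return ["X" + str(f) + "__" + str(t)
            for t in range(n_time_points)
            for f in range(n_features)]

# column_names(2, 2) == ['X0__0', 'X1__0', 'X0__1', 'X1__1']
# ---------------------------------------------------------------------------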
def generate_binomial_4(n_instances, n_time_points):
    n_features = 10
    data = np.zeros([n_instances, n_features*n_time_points])
    labels = np.zeros([n_instances, 1])
    for j in range(n_features*n_time_points):
        data[:,j] = np.random.binomial(1, 0.5, n_instances)
    for i in range(0, n_instances):
        labels[i] = np.random.binomial(1, 0.5, 1)
        # LABEL 0
        if labels[i] == 0:
            if data[i,0] == 0:
                data[i,1] = np.random.binomial(1, 0.1, 1)
            else:
                data[i,1] = np.random.binomial(1, 0.9, 1)
            if data[i,2] == 0:
                data[i,3] = np.random.binomial(1, 0.9, 1)
            else:
                data[i,3] = np.random.binomial(1, 0.1, 1)
            for t in range(n_time_points-1):
                if data[i,t*n_features+0] == 0 and data[i,t*n_features+1] == 0:
                    data[i,t*n_features+n_features+0] = np.random.binomial(1, 0.9, 1)
                    data[i,t*n_features+n_features+1] = np.random.binomial(1, 0.9, 1)
                elif data[i,t*n_features+0] == 1 and data[i,t*n_features+1] == 1:
                    data[i,t*n_features+n_features+0] = np.random.binomial(1, 0.1, 1)
                    data[i,t*n_features+n_features+1] = np.random.binomial(1, 0.1, 1)
                else:
                    data[i,t*n_features+n_features+0] = np.random.binomial(1, 0.5, 1)
                    data[i,t*n_features+n_features+1] = np.random.binomial(1, 0.5, 1)
                if data[i,t*n_features+2] == 0 and data[i,t*n_features+3] == 1:
                    data[i,t*n_features+n_features+2] = np.random.binomial(1, 0.9, 1)
                    data[i,t*n_features+n_features+3] = np.random.binomial(1, 0.1, 1)
                elif data[i,t*n_features+2] == 1 and data[i,t*n_features+3] == 0:
                    data[i,t*n_features+n_features+2] = np.random.binomial(1, 0.1, 1)
                    data[i,t*n_features+n_features+3] = np.random.binomial(1, 0.9, 1)
                else:
                    data[i,t*n_features+n_features+2] = np.random.binomial(1, 0.5, 1)
                    data[i,t*n_features+n_features+3] = np.random.binomial(1, 0.5, 1)
        # LABEL 1
        elif labels[i] == 1:
            if data[i,0] == 0:
                data[i,3] = np.random.binomial(1, 0.9, 1)
            else:
                data[i,3] = np.random.binomial(1, 0.1, 1)
            if data[i,1] == 0:
                data[i,2] = np.random.binomial(1, 0.9, 1)
            else:
                data[i,2] = np.random.binomial(1, 0.1, 1)
            for t in range(n_time_points-1):
                if data[i,t*n_features+0] == 0 and data[i,t*n_features+3] == 1:
                    data[i,t*n_features+n_features+0] = np.random.binomial(1, 0.9, 1)
                    data[i,t*n_features+n_features+3] = np.random.binomial(1, 0.1, 1)
                elif data[i,t*n_features+0] == 1 and data[i,t*n_features+3] == 0:
                    data[i,t*n_features+n_features+0] = np.random.binomial(1, 0.1, 1)
                    data[i,t*n_features+n_features+3] = np.random.binomial(1, 0.9, 1)
                else:
                    data[i,t*n_features+n_features+0] = np.random.binomial(1, 0.5, 1)
                    data[i,t*n_features+n_features+3] = np.random.binomial(1, 0.5, 1)
if data[i,t*n_features+1] == 0 and data[i,t*n_features+2] == 1:
data[i,t*n_features+n_features+1] = np.random.binomial(1, 0.9, 1)
data[i,t*n_features+n_features+2] = np.random.binomial(1, 0.1, 1)
elif data[i,t*n_features+1] == 1 and data[i,t*n_features+2] == 0:
data[i,t*n_features+n_features+1] = np.random.binomial(1, 0.1, 1)
data[i,t*n_features+n_features+2] = np.random.binomial(1, 0.9, 1)
else:
data[i,t*n_features+n_features+1] = np.random.binomial(1, 0.5, 1)
data[i,t*n_features+n_features+2] = np.random.binomial(1, 0.5, 1)
col = []
for t in range(n_time_points):
for f in range(n_features):
col.append("X"+str(f)+"__"+str(t))
df = pd.DataFrame(data=data, # values
index=list(range(n_instances)), # 1st column as index
columns=col)
df.index.name = 'subject_id'
labels_df = pd.DataFrame(data=labels, # values
index=list(range(n_instances)), # 1st column as index
columns=['label'])
labels_df.index.name = 'subject_id'
df.to_csv('binomial_joao_parsed.csv',quoting=1)
labels_df.to_csv('binomial_joao_target.csv',quoting=1)
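Both generators flatten a (time, feature) grid into one row per subject: feature f at time t lives at flat column index t*n_features + f and is named "X{f}__{t}". A small sketch of that layout, with illustrative sizes:

```python
import numpy as np
import pandas as pd

n_features, n_time_points, n_instances = 3, 2, 4  # illustrative sizes

data = np.arange(n_instances * n_features * n_time_points).reshape(
    n_instances, n_features * n_time_points
)

# Same naming scheme as the generators: feature f at time t -> "X{f}__{t}".
col = ["X%d__%d" % (f, t) for t in range(n_time_points) for f in range(n_features)]

df = pd.DataFrame(data, index=range(n_instances), columns=col)
df.index.name = 'subject_id'

# Flat index of feature f at time t is t * n_features + f.
t, f = 1, 2
assert df.columns[t * n_features + f] == "X2__1"
```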
# --- File: experiments/__init__.py (DreamerDeo/tensor2struct-public, MIT) ---
from experiments import semi_sup
from experiments import sql2nl
# --- File: matrixprofile/algorithms/top_k_motifs.py (MORE-EU/matrixprofile, Apache-2.0) ---
# -*- coding: utf-8 -*-
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
range = getattr(__builtins__, 'xrange', range)
# end of py2 compatibility boilerplate
import numpy as np
from matrixprofile import core
from matrixprofile.algorithms.mass2 import mass2
def pmp_top_k_motifs(profile, exclusion_zone=None, k=3, max_neighbors=10, radius=3):
"""
Find the top K number of motifs (patterns) given a pan matrix profile. By
default the algorithm will find up to 3 motifs (k) and up to 10 of their
neighbors with a radius of 3 * min_dist.
Parameters
----------
profile : dict
The output from one of the pan matrix profile algorithms.
exclusion_zone : int, Default to algorithm ez
Desired number of values to exclude on both sides of the motif. This
avoids trivial matches. It defaults to half of the computed window
size. Setting the exclusion zone to 0 makes it not apply.
k : int, Default = 3
Desired number of motifs to find.
max_neighbors : int, Default = 10
The maximum number of neighbors to include for a given motif.
radius : int, Default = 3
The radius is used to associate a neighbor by checking if the
neighbor's distance is less than or equal to min_dist * radius.
Returns
-------
profile : dict
The original input obj with the addition of the "motifs" key. The
motifs key consists of the following structure.
A list of dicts containing motif indices and their corresponding
neighbor indices. Note that each index is a (row, col) index
corresponding to the pan matrix profile.
>>> [
>>> {
>>> 'motifs': [first_index, second_index],
>>> 'neighbors': [index, index, index ...max_neighbors]
>>> }
>>> ]
"""
if not core.is_pmp_obj(profile):
raise ValueError('Expecting PMP data structure!')
data = profile.get('data', None)
ts = data.get('ts', None)
data_len = len(ts)
pmp = profile.get('pmp', None)
profile_len = pmp.shape[1]
pmpi = profile.get('pmpi', None)
windows = profile.get('windows', None)
# make sure we are working with Euclidean distances
tmp = None
if core.is_pearson_array(pmp):
tmp = core.pearson_to_euclidean(pmp, windows)
else:
tmp = np.copy(pmp).astype('d')
# replace nan and infs with infinity
tmp[core.nan_inf_indices(tmp)] = np.inf
motifs = []
for _ in range(k):
min_idx = np.unravel_index(np.argmin(tmp), tmp.shape)
min_dist = tmp[min_idx]
# nothing else to find...
if core.is_nan_inf(min_dist):
break
# create the motif pair
min_row_idx = min_idx[0]
min_col_idx = min_idx[1]
# motif pairs are respective to the column of the matching row
first_idx = np.min([min_col_idx, pmpi[min_row_idx][min_col_idx]])
second_idx = np.max([min_col_idx, pmpi[min_row_idx][min_col_idx]])
# compute distance profile for first appearance
window_size = windows[min_row_idx]
query = ts[first_idx:first_idx + window_size]
distance_profile = mass2(ts, query)
# extend the distance profile to be as long as the original
infs = np.full(profile_len - len(distance_profile), np.inf)
distance_profile = np.append(distance_profile, infs)
# exclude already picked motifs and neighbors
mask = core.nan_inf_indices(pmp[min_row_idx])
distance_profile[mask] = np.inf
# determine the exclusion zone if not set
if not exclusion_zone:
exclusion_zone = int(np.floor(window_size / 2))
# apply exclusion zone for motif pair
for j in (first_idx, second_idx):
distance_profile = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
j,
distance_profile
)
tmp2 = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
j,
tmp[min_row_idx]
)
tmp[min_row_idx] = tmp2
# find up to max_neighbors
neighbors = []
for j in range(max_neighbors):
neighbor_idx = np.argmin(distance_profile)
neighbor_dist = np.real(distance_profile[neighbor_idx])
not_in_radius = not ((radius * min_dist) >= neighbor_dist)
# no more neighbors exist based on radius
if core.is_nan_inf(neighbor_dist) or not_in_radius:
break
# add neighbor and apply exclusion zone
neighbors.append((min_row_idx, neighbor_idx))
distance_profile = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
neighbor_idx,
distance_profile
)
tmp2 = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
neighbor_idx,
tmp[min_row_idx]
)
tmp[min_row_idx] = tmp2
# add the motifs and neighbors
# note that they are (row, col) indices
motifs.append({
'motifs': [(min_row_idx, first_idx), (min_row_idx, second_idx)],
'neighbors': neighbors
})
profile['motifs'] = motifs
return profile
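Both motif finders accept a candidate neighbor only while its distance is within radius * min_dist of the motif's own distance (the `not_in_radius` test above). In isolation, the acceptance rule is just:

```python
def within_radius(neighbor_dist, min_dist, radius=3):
    """Mirror of the acceptance test above: a candidate neighbor is kept
    only while its distance is at most radius * min_dist."""
    return (radius * min_dist) >= neighbor_dist

# Illustrative distances (not taken from any real profile).
min_dist = 0.4
candidates = [0.5, 1.1, 1.3]

kept = [d for d in candidates if within_radius(d, min_dist)]
print(kept)  # [0.5, 1.1]
```

With radius=3 and min_dist=0.4, the cutoff sits at 1.2, so 1.3 is rejected and the neighbor loop would break there.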
def mp_top_k_motifs(profile, exclusion_zone=None, k=3, max_neighbors=10, radius=3, use_cmp=False):
"""
Find the top K number of motifs (patterns) given a matrix profile. By
default the algorithm will find up to 3 motifs (k) and up to 10 of their
neighbors with a radius of 3 * min_dist using the regular matrix profile.
Parameters
----------
profile : dict
The output from one of the matrix profile algorithms.
exclusion_zone : int, Default to algorithm ez
Desired number of values to exclude on both sides of the motif. This
avoids trivial matches. It defaults to half of the computed window
size. Setting the exclusion zone to 0 makes it not apply.
k : int, Default = 3
Desired number of motifs to find.
max_neighbors : int, Default = 10
The maximum number of neighbors to include for a given motif.
radius : int, Default = 3
The radius is used to associate a neighbor by checking if the
neighbor's distance is less than or equal to min_dist * radius.
use_cmp : bool, Default = False
Use the Corrected Matrix Profile to compute the motifs.
Returns
-------
profile : dict
The original input obj with the addition of the "motifs" key. The
motifs key consists of the following structure.
A list of dicts containing motif indices and their corresponding
neighbor indices.
>>> [
>>> {
>>> 'motifs': [first_index, second_index],
>>> 'neighbors': [index, index, index ...max_neighbors]
>>> }
>>> ]
"""
if not core.is_mp_obj(profile):
raise ValueError('Expecting MP data structure!')
window_size = profile['w']
data = profile.get('data', None)
if data:
ts = data.get('ts', None)
data_len = len(ts)
motifs = []
mp = np.copy(profile['mp'])
if use_cmp:
mp = np.copy(profile['cmp'])
mpi = profile['pi']
# TODO: this is based on STOMP standards when this motif finding algorithm
# originally came out. Should we default this to 4.0 instead? That seems
# to be the common value now per new research.
if exclusion_zone is None:
exclusion_zone = profile.get('ez', None)
for i in range(k):
min_idx = np.argmin(mp)
min_dist = mp[min_idx]
# we no longer have any motifs to find as all values are nan/inf
if core.is_nan_inf(min_dist):
break
# create a motif pair corresponding to the first appearance and
# second appearance
first_idx = np.min([min_idx, mpi[min_idx]])
second_idx = np.max([min_idx, mpi[min_idx]])
# compute distance profile using mass2 for first appearance
query = ts[first_idx:first_idx + window_size]
distance_profile = mass2(ts, query)
# exclude already picked motifs and neighbors
mask = core.nan_inf_indices(mp)
distance_profile[mask] = np.inf
# apply exclusion zone for motif pair
for j in (first_idx, second_idx):
distance_profile = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
j,
distance_profile
)
mp = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
j,
mp
)
# find up to max_neighbors
neighbors = []
for j in range(max_neighbors):
neighbor_idx = np.argmin(distance_profile)
neighbor_dist = distance_profile[neighbor_idx]
not_in_radius = not ((radius * min_dist) >= neighbor_dist)
# no more neighbors exist based on radius
if core.is_nan_inf(neighbor_dist) or not_in_radius:
break
# add neighbor and apply exclusion zone
neighbors.append(neighbor_idx)
distance_profile = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
neighbor_idx,
distance_profile
)
mp = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
neighbor_idx,
mp
)
# add motifs and neighbors to results
motifs.append({
'motifs': [first_idx, second_idx],
'neighbors': neighbors
})
profile['motifs'] = motifs
return profile
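`core.apply_exclusion_zone` belongs to the matrixprofile package; a simplified standalone sketch of the idea — masking a window around a chosen index with inf so that argmin cannot return trivial neighboring matches — could look like this (the symmetric half-window below is an assumption, not the library's exact signature):

```python
import numpy as np

def apply_exclusion_zone_sketch(dist_profile, idx, ez):
    """Return a copy with distances within ez positions of idx set to inf."""
    out = dist_profile.copy()
    lo = max(0, idx - ez)
    hi = min(len(out), idx + ez + 1)
    out[lo:hi] = np.inf
    return out

dp = np.array([0.9, 0.1, 0.8, 0.2, 0.7])
best = int(np.argmin(dp))                       # index of the best match
dp = apply_exclusion_zone_sketch(dp, best, ez=1)
second = int(np.argmin(dp))                     # next match outside the zone
```

After masking, the second argmin call skips the best match and its immediate neighbors, which is exactly why the motif loop re-applies the zone after each pick.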
def mp_top_k_motifs_md(profile, exclusion_zone=None, k=3, max_neighbors=10, radius=3, use_cmp=False):
# Custom counterpart of mp_top_k_motifs that works on multi-dimensional time series
"""
Find the top K number of motifs (patterns) given a matrix profile. By
default the algorithm will find up to 3 motifs (k) and up to 10 of their
neighbors with a radius of 3 * min_dist using the regular matrix profile.
Parameters
----------
profile : dict
The output from one of the matrix profile algorithms.
exclusion_zone : int, Default to algorithm ez
Desired number of values to exclude on both sides of the motif. This
avoids trivial matches. It defaults to half of the computed window
size. Setting the exclusion zone to 0 makes it not apply.
k : int, Default = 3
Desired number of motifs to find.
max_neighbors : int, Default = 10
The maximum number of neighbors to include for a given motif.
radius : int, Default = 3
The radius is used to associate a neighbor by checking if the
neighbor's distance is less than or equal to min_dist * radius.
use_cmp : bool, Default = False
Use the Corrected Matrix Profile to compute the motifs.
Returns
-------
profile : dict
The original input obj with the addition of the "motifs" key. The
motifs key consists of the following structure.
A list of dicts containing motif indices and their corresponding
neighbor indices.
>>> [
>>> {
>>> 'motifs': [first_index, second_index],
>>> 'neighbors': [index, index, index ...max_neighbors]
>>> }
>>> ]
"""
if not core.is_mp_obj(profile):
raise ValueError('Expecting MP data structure!')
window_size = profile['w']
data = profile.get('data', None)
if data:
ts = data.get('ts', None).T
dims = ts.shape[0]
data_len = ts.shape[1]
dp_len = data_len - window_size + 1
motifs = []
mp = np.copy(profile['mp'])
if use_cmp:
mp = np.copy(profile['cmp'])
mpi = profile['pi']
# TODO: this is based on STOMP standards when this motif finding algorithm
# originally came out. Should we default this to 4.0 instead? That seems
# to be the common value now per new research.
if exclusion_zone is None:
exclusion_zone = profile.get('ez', None)
for i in range(k):
min_idx = np.argmin(mp)
min_dist = mp[min_idx]
# we no longer have any motifs to find as all values are nan/inf
if core.is_nan_inf(min_dist):
break
# create a motif pair corresponding to the first appearance and
# second appearance
first_idx = np.min([min_idx, mpi[min_idx]])
second_idx = np.max([min_idx, mpi[min_idx]])
# compute distance profile using mass2 for first appearance
# create the multi dimensional distance profile
md_distance_profile = np.zeros((dims, dp_len), dtype='complex128')
for i in range(0, dims):
ts_i = ts[i, :]
query_i = ts_i[first_idx:first_idx + window_size]
md_distance_profile[i, :] = mass2(ts_i, query_i)
D = md_distance_profile
D.sort(axis=0, kind="mergesort")
D_prime = np.zeros(dp_len)
for i in range(dims):
D_prime = D_prime + D[i]
D[i, :] = D_prime / (i + 1)
# reassign to keep compatibility with the rest of the code
distance_profile = D[dims - 1, :]
# exclude already picked motifs and neighbors
mask = core.nan_inf_indices(mp)
distance_profile[mask] = np.inf
# apply exclusion zone for motif pair
for j in (first_idx, second_idx):
distance_profile = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
j,
distance_profile
)
mp = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
j,
mp
)
# find up to max_neighbors
neighbors = []
for j in range(max_neighbors):
neighbor_idx = np.argmin(distance_profile)
neighbor_dist = distance_profile[neighbor_idx]
not_in_radius = not ((radius * min_dist) >= neighbor_dist)
# no more neighbors exist based on radius
if core.is_nan_inf(neighbor_dist) or not_in_radius:
break
# add neighbor and apply exclusion zone
neighbors.append(neighbor_idx)
distance_profile = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
neighbor_idx,
distance_profile
)
mp = core.apply_exclusion_zone(
exclusion_zone,
False,
window_size,
data_len,
neighbor_idx,
mp
)
# add motifs and neighbors to results
motifs.append({
'motifs': [first_idx, second_idx],
'neighbors': neighbors
})
profile['motifs'] = motifs
return profile
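The multi-dimensional variant sorts the per-dimension distance profiles column-wise and replaces row i with the running mean of the i+1 smallest distances, so the last row holds the average over all dimensions. The core of that transform, isolated with tiny illustrative values:

```python
import numpy as np

# Illustrative per-dimension distance profiles, shape (dims, profile_len).
D = np.array([[3.0, 1.0, 2.0],
              [1.0, 2.0, 4.0]])

# Column-wise sort, then a cumulative mean down the rows, as in the loop above.
D.sort(axis=0, kind="mergesort")
dims, dp_len = D.shape
D_prime = np.zeros(dp_len)
for i in range(dims):
    D_prime = D_prime + D[i]
    D[i, :] = D_prime / (i + 1)

# Row dims-1 is now the all-dimensions average distance profile.
result = D[dims - 1]
```

For these values the sorted columns are (1,3), (1,2) and (2,4), so the final row averages to (2.0, 1.5, 3.0).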
def top_k_motifs(profile, exclusion_zone=None, k=3, max_neighbors=10, radius=3, use_cmp=False):
"""
Find the top K number of motifs (patterns) given a matrix profile or a pan
matrix profile. By default the algorithm will find up to 3 motifs (k) and
up to 10 of their neighbors with a radius of 3 * min_dist using the
regular matrix profile. If the profile is a Matrix Profile data structure,
you can also use a Corrected Matrix Profile to compute the motifs.
Parameters
----------
profile : dict
The output from one of the matrix profile algorithms.
exclusion_zone : int, Default to algorithm ez
Desired number of values to exclude on both sides of the motif. This
avoids trivial matches. It defaults to half of the computed window
size. Setting the exclusion zone to 0 makes it not apply.
k : int, Default = 3
Desired number of motifs to find.
max_neighbors : int, Default = 10
The maximum number of neighbors to include for a given motif.
radius : int, Default = 3
The radius is used to associate a neighbor by checking if the
neighbor's distance is less than or equal to min_dist * radius.
use_cmp : bool, Default = False
Use the Corrected Matrix Profile to compute the motifs (only for
a Matrix Profile data structure).
Returns
-------
profile : dict
The original input profile with the addition of the "motifs" key. The
motifs key consists of the following structure.
A list of dicts containing motif indices and their corresponding
neighbor indices.
>>> [
>>> {
>>> 'motifs': [first_index, second_index],
>>> 'neighbors': [index, index, index ...max_neighbors]
>>> }
>>> ]
The index is a single value when a MatrixProfile is passed in otherwise
the index contains a row and column index for Pan-MatrixProfile.
"""
if not core.is_mp_or_pmp_obj(profile):
raise ValueError('Expecting MP or PMP data structure!')
cls = profile.get('class', None)
func = None
if cls == 'MatrixProfile':
func = mp_top_k_motifs_md  # custom multi-dimensional variant of mp_top_k_motifs
elif cls == 'PMP':
func = pmp_top_k_motifs
else:
raise ValueError('Unsupported data structure!')
if cls == 'PMP':
return func(
profile,
exclusion_zone=exclusion_zone,
k=k,
max_neighbors=max_neighbors,
radius=radius
)
return func(
profile,
exclusion_zone=exclusion_zone,
k=k,
max_neighbors=max_neighbors,
radius=radius,
use_cmp=use_cmp
)
# --- File: energylenserver/meter/edge_matching.py (manaswis/energylensplus, Apache-2.0) ---
"""
Edge Matching Module
"""
import math
import numpy as np
import pandas as pd
from django_pandas.io import read_frame
from energylenserver.common_imports import *
from energylenserver.models import functions as mod_func
from energylenserver.core import functions as func
from energylenserver import common_offline as off_func
# Enable Logging
logger = logging.getLogger('energylensplus_django')
def match_events_offline(off_event):
"""
Match an OFF event with its corresponding ON (rising-edge) event.
"""
apt_no = off_event.apt_no
off_time = off_event.timestamp
off_mag = math.fabs(off_event.magnitude)
off_location = off_event.location
off_appliance = off_event.appliance
# Extract rising edges occurring before the fall edge
# with similar power
on_events = off_func.get_on_events(apt_no, off_time)
if len(on_events) == 0:
return False
# Filter i: Remove all on events greater than 12 hours from off event
new_on_events = on_events[off_time - on_events.timestamp < 12 * 60 * 60]
id_list = []
mag_diff = []
time_diff = []
location = []
appliance = []
event_mag = []
for idx in new_on_events.index:
on_event = new_on_events.ix[idx]
id_list.append(on_event.id)
event_mag.append(on_event.magnitude)
mag_diff.append(math.fabs(off_mag - on_event.magnitude))
time_diff.append(math.fabs(off_time - on_event.timestamp))
location.append(on_event.location)
appliance.append(on_event.appliance)
df = pd.DataFrame({'id': id_list, 'event_mag': event_mag,
'mag_diff': mag_diff, 'time_diff': time_diff,
'location': location, 'appliance': appliance},
columns=['id', 'event_mag', 'mag_diff', 'time_diff',
'location', 'appliance'])
# logger.debug("On Events DF: \n%s", df)
# Get Metadata
data = mod_func.retrieve_metadata(apt_no)
metadata_df = read_frame(data, verbose=False)
# Range of OFF event
power = off_mag * percent_change
min_mag = off_mag - power
max_mag = off_mag + power
logger.debug("Magnitude::%s per_change: %s", off_mag, power)
logger.debug("Between min=[%s] max=[%s]", min_mag, max_mag)
# Filter 1: Determine if appliance is multi-state
if func.determine_multi_state(metadata_df, off_location, off_appliance):
filtered_df = df[(df.location == off_location) &
(df.appliance == off_appliance)]
filtered_df.sort_values(['id'], inplace=True)
filtered_df.reset_index(drop=True, inplace=True)
logger.debug("Filtered on events of a multi-state appl: \n%s", filtered_df)
if len(filtered_df) == 0:
return False
if len(filtered_df) > 1:
'''
Version 1:
# Filter 4: Based on power consumption
# Taking the ON event's which is the closest to the OFF event's magnitude
min_mag_diff = filtered_df.mag_diff.min()
filtered_df = filtered_df[filtered_df.mag_diff == min_mag_diff]
filtered_df.reset_index(drop=True, inplace=True)
'''
'''
Version 2.1:
# Filter 4: Based on duration of usage
# Take the ON event which closest to the OFF event's timestamp
min_time_diff = filtered_df.time_diff.min()
filtered_df = filtered_df[filtered_df.time_diff == min_time_diff]
filtered_df.reset_index(drop=True, inplace=True)
if len(filtered_df) == 0:
return False
logger.debug("Final set:%s \n", filtered_df)
'''
'''
Version 2.1:
'''
# Multiple ON events are found
# Check if sum of ON events is equivalent to the OFF event mag
on_mag_sum = float(filtered_df.event_mag.sum())
print(on_mag_sum)
if on_mag_sum >= min_mag and on_mag_sum <= max_mag:
# Select first ON event
on_event_record = new_on_events[new_on_events.id == filtered_df.ix[0]['id']]
on_event = on_event_record.ix[on_event_record.index[0]]
return on_event
else:
on_event_mag = float(filtered_df.ix[0]['event_mag'])
if on_event_mag >= min_mag and on_event_mag <= max_mag:
on_event_record = new_on_events[new_on_events.id == filtered_df.ix[0]['id']]
on_event = on_event_record.ix[on_event_record.index[0]]
return on_event
else:
# Extract OFF events between first ON and this OFF event
on_event_record = new_on_events[new_on_events.id == filtered_df.ix[0]['id']]
first_on_event = on_event_record.ix[on_event_record.index[0]]
on_event_time = int(first_on_event.timestamp)
off_event_df = off_func.get_off_events_by_offline(apt_no, on_event_time,
off_time, off_location,
off_appliance)
if len(off_event_df) > 1:
off_mag_sum = float(off_event_df.magnitude.sum())
# Range of ON event
power = off_mag_sum * percent_change
min_mag = off_mag_sum - power
max_mag = off_mag_sum + power
if on_event_mag >= min_mag and on_event_mag <= max_mag:
on_event_record = new_on_events[new_on_events.id == filtered_df.ix[0]['id']]
on_event = on_event_record.ix[on_event_record.index[0]]
return on_event
return False
# Filter 2: Match falling with rising edges where its magnitude is between
# a power threshold window
filtered_df = df[(df.event_mag >= min_mag) & (df.event_mag <= max_mag)]
# logger.debug("Filtered on events based on magnitude range: \n%s", filtered_df)
if len(filtered_df) == 0:
return False
# Filter 3: Matching with the same location and appliance
# if appliance is a presence based appliance
metadata_df['appliance'] = metadata_df.appliance.apply(lambda s: s.split('_')[0])
if off_appliance != "Unknown":
metadata_df = metadata_df[metadata_df.appliance == off_appliance]
logger.debug("Off appliance: %s metadata_df %s", off_appliance, metadata_df)
metadata_df = metadata_df.ix[:, ['appliance', 'presence_based']].drop_duplicates()
metadata_df.reset_index(inplace=True, drop=True)
if not metadata_df.ix[0]['presence_based']:
filtered_df = filtered_df[filtered_df.appliance == off_appliance]
else:
filtered_df = filtered_df[(filtered_df.location == off_location) &
(filtered_df.appliance == off_appliance)]
else:
filtered_df = filtered_df[filtered_df.location == off_location]
filtered_df.reset_index(drop=True, inplace=True)
if len(filtered_df) == 0:
return False
logger.debug("Matched ON DF:\n%s", filtered_df)
# Resolve conflicts by --
if len(filtered_df) > 1:
'''
Version 1:
# Filter 4: Based on power consumption
# Taking the ON event's which is the closest to the OFF event's magnitude
min_mag_diff = filtered_df.mag_diff.min()
filtered_df = filtered_df[filtered_df.mag_diff == min_mag_diff]
filtered_df.reset_index(drop=True, inplace=True)
'''
'''
Version 2:
'''
# Filter 4: Based on duration of usage
# Take the ON event which closest to the OFF event's timestamp
min_time_diff = filtered_df.time_diff.min()
filtered_df = filtered_df[filtered_df.time_diff == min_time_diff]
filtered_df.reset_index(drop=True, inplace=True)
if len(filtered_df) == 0:
return False
logger.debug("Final set:%s \n", filtered_df)
on_event_record = new_on_events[new_on_events.id == filtered_df.ix[0]['id']]
on_event = on_event_record.ix[on_event_record.index[0]]
return on_event
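`percent_change` is a module-level constant pulled in via energylenserver.common_imports; its real value is not shown here. Assuming 0.1 for illustration, the magnitude-window filter used above reduces to:

```python
import pandas as pd

percent_change = 0.1  # assumed value; the real constant lives in common_imports

off_mag = 100.0
power = off_mag * percent_change
min_mag, max_mag = off_mag - power, off_mag + power  # window of 90.0 .. 110.0

# Hypothetical candidate ON events.
df = pd.DataFrame({'id': [1, 2, 3], 'event_mag': [85.0, 95.0, 108.0]})
matched = df[(df.event_mag >= min_mag) & (df.event_mag <= max_mag)]
print(matched.id.tolist())  # [2, 3]
```

Only ON events whose magnitude falls inside the ±10% band around the OFF magnitude survive this filter; the later filters then narrow by location, appliance, and timing.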
def match_events(apt_no, off_event):
"""
Match an OFF event with its corresponding ON (rising-edge) event.
"""
off_time = off_event.event_time
off_mag = math.fabs(off_event.edge.magnitude)
off_location = off_event.location
off_appliance = off_event.appliance
# Extract rising edges occurring before the fall edge
# with similar power
on_events = mod_func.get_on_events(apt_no, off_time)
if on_events.count() == 0:
return False
# Filter i: Remove all on events greater than 12 hours from off event
new_on_events = []
for event in on_events:
on_time = event.event_time
if (off_time - on_time) < 12 * 60 * 60:
new_on_events.append(event)
id_list = []
mag_diff = []
location = []
appliance = []
event_mag = []
for on_event in new_on_events:
id_list.append(on_event.id)
event_mag.append(on_event.edge.magnitude)
mag_diff.append(math.fabs(off_mag - on_event.edge.magnitude))
location.append(on_event.location)
appliance.append(on_event.appliance)
df = pd.DataFrame({'id': id_list, 'event_mag': event_mag, 'mag_diff': mag_diff,
'location': location, 'appliance': appliance},
columns=['id', 'event_mag', 'mag_diff', 'location', 'appliance'])
# logger.debug("On Events DF: \n%s", df)
# Get Metadata
data = mod_func.retrieve_metadata(apt_no)
metadata_df = read_frame(data, verbose=False)
# Range of OFF event
power = off_mag * percent_change
min_mag = off_mag - power
max_mag = off_mag + power
logger.debug("Magnitude::%s per_change: %s", off_mag, power)
logger.debug("Between min=[%s] max=[%s]", min_mag, max_mag)
# Filter 1: Determine if appliance is multi-state
if func.determine_multi_state(metadata_df, off_location, off_appliance):
filtered_df = df[(df.location == off_location) &
(df.appliance == off_appliance)]
filtered_df.reset_index(drop=True, inplace=True)
logger.debug("Filtered on events of a multi-state appl: \n%s", filtered_df)
if len(filtered_df) == 0:
return False
return mod_func.get_on_event_by_id(filtered_df.ix[0]['id'])
# Filter 2: Match falling with rising edges where its magnitude is between
# a power threshold window
filtered_df = df[(df.event_mag >= min_mag) & (df.event_mag <= max_mag)]
# logger.debug("Filtered on events based on magnitude range: \n%s", filtered_df)
if len(filtered_df) == 0:
return False
# Filter 3: Matching with the same location and appliance
# if appliance is a presence based appliance
metadata_df['appliance'] = metadata_df.appliance.apply(lambda s: s.split('_')[0])
if off_appliance != "Unknown":
metadata_df = metadata_df[metadata_df.appliance == off_appliance]
metadata_df = metadata_df.ix[:, ['appliance', 'presence_based']].drop_duplicates()
metadata_df.reset_index(inplace=True, drop=True)
if not metadata_df.ix[0]['presence_based']:
filtered_df = filtered_df[filtered_df.appliance == off_appliance]
else:
filtered_df = filtered_df[(filtered_df.location == off_location) &
(filtered_df.appliance == off_appliance)]
else:
filtered_df = filtered_df[filtered_df.location == off_location]
filtered_df.reset_index(drop=True, inplace=True)
if len(filtered_df) == 0:
return False
logger.debug("Matched ON DF:\n%s", filtered_df)
# Resolve conflicts by --
# Filter 4: Taking the rising edge which is the closest to the off magnitude
min_mag_diff = filtered_df.mag_diff.min()
filtered_df = filtered_df[filtered_df.mag_diff == min_mag_diff]
filtered_df.reset_index(drop=True, inplace=True)
if len(filtered_df) == 0:
return False
logger.debug("Final set:%s \n", filtered_df)
return mod_func.get_on_event_by_id(filtered_df.ix[0]['id'])
| 36.48368 | 100 | 0.628548 | 1,676 | 12,295 | 4.337112 | 0.096062 | 0.114184 | 0.054478 | 0.060531 | 0.826661 | 0.819783 | 0.794745 | 0.783602 | 0.783602 | 0.775623 | 0 | 0.007289 | 0.274746 | 12,295 | 336 | 101 | 36.592262 | 0.807895 | 0.121025 | 0 | 0.662983 | 0 | 0 | 0.065704 | 0.002247 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.044199 | null | null | 0.005525 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# --- File: python/anyascii/_data/_022.py (casept/anyascii, ISC) ---
b='V C d E E 0 ^ D E E E E E E # P U S - -+ + / \\ * * * sqrt cbrt 4rt ~ inf L < < < | | || || ^ v ^ v S S SSS S SS SSS S S S : : : :: _ -: - ~ ~ ~ ~ ~ ~ ~ -~ ~ ~ ~= = = ~ ~ ~ ~ ~= = = = = = = = := =: = = = = = = = = = = = = = = <= >= <= >= < > << >> () = < > < > <~ >~ < > = = = = < > < > < > < > < > < > < > < > < > v v v [ ] [ ] ^ v + - x / * o * = - + - x * + + + + + + + + + + + + + + < > < > < > - - - + T v ^ v < / ^ v ^ v * * * * >< >< >< > < ~= v ^ < > ^ v + # < > <<< >>> = = =< => < > < > [ ] [ ] < > < > < > < > | - / \\ + + + + + + + + + + + + + E'
72f709bba727f2785d0349868978d0cd970ff03f | 1,451 | py | Python | tests/test_clisops_main_var.py | roocs/proto-lib-34e | bbc157651f2242ac08bca58c96bf8cdbc6506c31 | [
"BSD-2-Clause"
] | null | null | null | tests/test_clisops_main_var.py | roocs/proto-lib-34e | bbc157651f2242ac08bca58c96bf8cdbc6506c31 | [
"BSD-2-Clause"
] | 11 | 2020-02-27T17:59:08.000Z | 2020-03-23T10:48:20.000Z | tests/test_clisops_main_var.py | roocs/proto-lib-34e | bbc157651f2242ac08bca58c96bf8cdbc6506c31 | [
"BSD-2-Clause"
] | 1 | 2020-02-21T13:43:18.000Z | 2020-02-21T13:43:18.000Z | from clisops import utils
from .common import CMIP5_ARCHIVE_BASE
import xarray as xr
# setup for tests
def setup_module(module):
    module.CMIP5_ARCHIVE_BASE = 'mini-esgf-data/test_data/badc/cmip5/data'
    module.CMIP5_FPATHS = [
        CMIP5_ARCHIVE_BASE + '/cmip5/output1/INM/inmcm4/rcp45/mon/ocean/Omon/r1i1p1/latest/zostoga/*.nc',
        CMIP5_ARCHIVE_BASE + '/cmip5/output1/MOHC/HadGEM2-ES/rcp85/mon/atmos/Amon/r1i1p1/latest/tas/*.nc',
        CMIP5_ARCHIVE_BASE + '/cmip5/output1/MOHC/HadGEM2-ES/historical/mon/land/Lmon/r1i1p1/latest/rh/*.nc'
    ]


def test_get_main_var_1():
    ds = xr.open_mfdataset(CMIP5_FPATHS[0])
    var_id = utils.get_main_variable(ds)
    assert var_id == 'zostoga'


def test_get_main_var_2():
    ds = xr.open_mfdataset(CMIP5_FPATHS[1])
    var_id = utils.get_main_variable(ds)
    assert var_id == 'tas'


def test_get_main_var_3():
    ds = xr.open_mfdataset(CMIP5_FPATHS[2])
    var_id = utils.get_main_variable(ds)
    assert var_id == 'rh'
def teardown_module(module):
    # tear down rather than repeat the setup: drop the file-path list that
    # setup_module attached to this module
    del module.CMIP5_FPATHS
| 34.547619 | 108 | 0.721571 | 221 | 1,451 | 4.493213 | 0.262443 | 0.108761 | 0.145015 | 0.126888 | 0.879154 | 0.827795 | 0.743202 | 0.743202 | 0.743202 | 0.743202 | 0 | 0.053055 | 0.14266 | 1,451 | 41 | 109 | 35.390244 | 0.745177 | 0.010338 | 0 | 0.448276 | 0 | 0.206897 | 0.376569 | 0.368201 | 0 | 0 | 0 | 0 | 0.103448 | 1 | 0.172414 | false | 0 | 0.103448 | 0 | 0.275862 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f48a9484b92f4f21902f15b78cbae3fe67f0ec22 | 13,745 | py | Python | cirq-rigetti/cirq_rigetti/qcs_sampler_and_service_test.py | Saibaba-Alapati/Cirq | 782efcd04c3bbf73a0d630306a3d1cfd9966521d | [
"Apache-2.0"
] | 3,326 | 2018-07-18T23:17:21.000Z | 2022-03-29T22:28:24.000Z | cirq-rigetti/cirq_rigetti/qcs_sampler_and_service_test.py | Saibaba-Alapati/Cirq | 782efcd04c3bbf73a0d630306a3d1cfd9966521d | [
"Apache-2.0"
] | 3,443 | 2018-07-18T21:07:28.000Z | 2022-03-31T20:23:21.000Z | cirq-rigetti/cirq_rigetti/qcs_sampler_and_service_test.py | Saibaba-Alapati/Cirq | 782efcd04c3bbf73a0d630306a3d1cfd9966521d | [
"Apache-2.0"
] | 865 | 2018-07-18T23:30:24.000Z | 2022-03-30T11:43:23.000Z | from typing import Tuple, Any, List
import cirq
import pytest
from pyquil import Program
from pyquil.api import QuantumComputer
import numpy as np
from pyquil.gates import MEASURE, RX, X, DECLARE, H, CNOT
from cirq_rigetti import RigettiQCSService
from typing_extensions import Protocol
from cirq_rigetti import circuit_transformers as transformers
from cirq_rigetti import circuit_sweep_executors as executors
_default_executor = executors.with_quilc_compilation_and_cirq_parameter_resolution
class _ResultBuilder(Protocol):
def __call__(
self,
mock_qpu_implementer: Any,
circuit: cirq.Circuit,
sweepable: cirq.Sweepable,
*,
executor: executors.CircuitSweepExecutor = _default_executor,
transformer: transformers.CircuitTransformer = transformers.default,
) -> Tuple[
List[cirq.Result],
QuantumComputer,
List["np.ndarray[Any, np.dtype[np.float_]]"],
List[cirq.ParamResolver],
]:
pass
def _build_service_results(
mock_qpu_implementer: Any,
circuit: cirq.Circuit,
sweepable: cirq.Sweepable,
*,
executor: executors.CircuitSweepExecutor = _default_executor,
transformer: transformers.CircuitTransformer = transformers.default,
) -> Tuple[
List[cirq.Result],
QuantumComputer,
List["np.ndarray[Any, np.dtype[np.float_]]"],
List[cirq.ParamResolver],
]:
repetitions = 2
param_resolvers = [r for r in cirq.to_resolvers(sweepable)]
param_resolver_index = min(1, len(param_resolvers) - 1)
param_resolver = param_resolvers[param_resolver_index]
expected_results = [
np.ones((repetitions,))
* (param_resolver["t"] if "t" in param_resolver else param_resolver_index)
]
quantum_computer = mock_qpu_implementer.implement_passive_quantum_computer_with_results(
expected_results
)
service = RigettiQCSService(
quantum_computer=quantum_computer,
executor=executor,
transformer=transformer,
)
result = service.run(
circuit=circuit,
param_resolver=param_resolver,
repetitions=repetitions,
)
return [result], quantum_computer, expected_results, [param_resolver]
def _build_sampler_results(
mock_qpu_implementer: Any,
circuit: cirq.Circuit,
sweepable: cirq.Sweepable,
*,
executor: executors.CircuitSweepExecutor = _default_executor,
transformer: transformers.CircuitTransformer = transformers.default,
) -> Tuple[
List[cirq.Result],
QuantumComputer,
List["np.ndarray[Any, np.dtype[np.float_]]"],
cirq.Sweepable,
]:
repetitions = 2
param_resolvers = [r for r in cirq.to_resolvers(sweepable)]
expected_results = [
np.ones((repetitions,)) * (params["t"] if "t" in params else i)
for i, params in enumerate(param_resolvers)
]
quantum_computer = mock_qpu_implementer.implement_passive_quantum_computer_with_results(
expected_results
)
service = RigettiQCSService(
quantum_computer=quantum_computer,
executor=executor,
transformer=transformer,
)
sampler = service.sampler()
results = sampler.run_sweep(
program=circuit,
params=param_resolvers,
repetitions=repetitions,
)
return results, quantum_computer, expected_results, param_resolvers
@pytest.mark.parametrize("result_builder", [_build_service_results, _build_sampler_results])
def test_parametric_circuit(
mock_qpu_implementer: Any,
parametric_circuit_with_params: Tuple[cirq.Circuit, cirq.Sweepable],
result_builder: _ResultBuilder,
) -> None:
"""test that RigettiQCSService and RigettiQCSSampler can run a parametric
circuit with a specified set of parameters and return expected cirq.Results.
"""
parametric_circuit = parametric_circuit_with_params[0]
sweepable = parametric_circuit_with_params[1]
results, quantum_computer, expected_results, param_resolvers = result_builder(
mock_qpu_implementer, parametric_circuit, sweepable
)
assert len(param_resolvers) == len(
results
), "should return a result for every element in sweepable"
for i, param_resolver in enumerate(param_resolvers):
result = results[i]
assert param_resolver == result.params
assert np.allclose(
result.measurements["m"],
expected_results[i],
), "should return an ordered list of results with correct set of measurements"
def test_executable(i: int, program: Program) -> None:
params = param_resolvers[i]
t = params["t"]
if t == 1:
assert (
X(0) in program.instructions
), f"executable should contain an X(0) instruction at {i}"
else:
assert (
RX(np.pi * t, 0) in program.instructions
), f"executable should contain an RX(pi*{t}) 0 instruction at {i}"
assert DECLARE("m0") in program.instructions, "executable should declare a read out bit"
assert (
MEASURE(0, ("m0", 0)) in program.instructions
), "executable should measure the read out bit"
param_sweeps = len(param_resolvers)
assert param_sweeps == quantum_computer.compiler.quil_to_native_quil.call_count # type: ignore
for i, call_args in enumerate(
quantum_computer.compiler.quil_to_native_quil.call_args_list # type: ignore
):
test_executable(i, call_args[0][0])
assert (
param_sweeps
== quantum_computer.compiler.native_quil_to_executable.call_count # type: ignore
)
for i, call_args in enumerate(
quantum_computer.compiler.native_quil_to_executable.call_args_list # type: ignore
):
test_executable(i, call_args[0][0])
assert param_sweeps == quantum_computer.qam.run.call_count # type: ignore
for i, call_args in enumerate(quantum_computer.qam.run.call_args_list): # type: ignore
test_executable(i, call_args[0][0])
@pytest.mark.parametrize("result_builder", [_build_service_results, _build_sampler_results])
def test_bell_circuit(
mock_qpu_implementer: Any,
bell_circuit: cirq.Circuit,
result_builder: _ResultBuilder,
) -> None:
"""test that RigettiQCSService and RigettiQCSSampler can run a basic Bell circuit
with two read out bits and return expected cirq.Results.
"""
param_resolvers = [cirq.ParamResolver({})]
results, quantum_computer, expected_results, param_resolvers = result_builder(
mock_qpu_implementer, bell_circuit, param_resolvers
)
assert len(param_resolvers) == len(
results
), "should return a result for every element in sweepable"
for i, param_resolver in enumerate(param_resolvers):
result = results[i]
assert param_resolver == result.params
assert np.allclose(
result.measurements["m"],
expected_results[i],
), "should return an ordered list of results with correct set of measurements"
def test_executable(program: Program) -> None:
assert H(0) in program.instructions, "bell circuit should include Hadamard"
assert CNOT(0, 1) in program.instructions, "bell circuit should include CNOT"
assert (
DECLARE("m0", memory_size=2) in program.instructions
), "executable should declare a read out bit"
assert (
MEASURE(0, ("m0", 0)) in program.instructions
), "executable should measure the first qubit to the first read out bit"
assert (
MEASURE(1, ("m0", 1)) in program.instructions
), "executable should measure the second qubit to the second read out bit"
param_sweeps = len(param_resolvers)
assert param_sweeps == quantum_computer.compiler.quil_to_native_quil.call_count # type: ignore
for i, call_args in enumerate(
quantum_computer.compiler.quil_to_native_quil.call_args_list # type: ignore
):
test_executable(call_args[0][0])
assert (
param_sweeps
== quantum_computer.compiler.native_quil_to_executable.call_count # type: ignore
)
for i, call_args in enumerate(
quantum_computer.compiler.native_quil_to_executable.call_args_list # type: ignore
):
test_executable(call_args[0][0])
assert param_sweeps == quantum_computer.qam.run.call_count # type: ignore
for i, call_args in enumerate(quantum_computer.qam.run.call_args_list): # type: ignore
test_executable(call_args[0][0])
@pytest.mark.parametrize("result_builder", [_build_service_results, _build_sampler_results])
def test_explicit_qubit_id_map(
mock_qpu_implementer: Any,
bell_circuit_with_qids: Tuple[cirq.Circuit, List[cirq.LineQubit]],
result_builder: _ResultBuilder,
) -> None:
"""test that RigettiQCSService and RigettiQCSSampler accept explicit ``qubit_id_map``
to map ``cirq.Qid`` s to physical qubits.
"""
bell_circuit, qubits = bell_circuit_with_qids
qubit_id_map = {
qubits[1]: "11",
qubits[0]: "13",
}
param_resolvers = [cirq.ParamResolver({})]
results, quantum_computer, expected_results, param_resolvers = result_builder(
mock_qpu_implementer,
bell_circuit,
param_resolvers,
transformer=transformers.build(
qubit_id_map=qubit_id_map, # type: ignore
),
)
assert len(param_resolvers) == len(
results
), "should return a result for every element in sweepable"
for i, param_resolver in enumerate(param_resolvers):
result = results[i]
assert param_resolver == result.params
assert np.allclose(
result.measurements["m"],
expected_results[i],
), "should return an ordered list of results with correct set of measurements"
def test_executable(program: Program) -> None:
assert H(13) in program.instructions, "bell circuit should include Hadamard"
assert CNOT(13, 11) in program.instructions, "bell circuit should include CNOT"
assert (
DECLARE("m0", memory_size=2) in program.instructions
), "executable should declare a read out bit"
assert (
MEASURE(13, ("m0", 0)) in program.instructions
), "executable should measure the first qubit to the first read out bit"
assert (
MEASURE(11, ("m0", 1)) in program.instructions
), "executable should measure the second qubit to the second read out bit"
param_sweeps = len(param_resolvers)
assert param_sweeps == quantum_computer.compiler.quil_to_native_quil.call_count # type: ignore
for i, call_args in enumerate(
quantum_computer.compiler.quil_to_native_quil.call_args_list # type: ignore
):
test_executable(call_args[0][0])
assert (
param_sweeps
== quantum_computer.compiler.native_quil_to_executable.call_count # type: ignore
)
for i, call_args in enumerate(
quantum_computer.compiler.native_quil_to_executable.call_args_list # type: ignore
):
test_executable(call_args[0][0])
assert param_sweeps == quantum_computer.qam.run.call_count # type: ignore
for i, call_args in enumerate(quantum_computer.qam.run.call_args_list): # type: ignore
test_executable(call_args[0][0])
@pytest.mark.parametrize("result_builder", [_build_service_results, _build_sampler_results])
def test_run_without_quilc_compilation(
mock_qpu_implementer: Any,
bell_circuit: cirq.Circuit,
result_builder: _ResultBuilder,
) -> None:
"""test that RigettiQCSService and RigettiQCSSampler allow users to execute
without using quilc to compile to native Quil.
"""
param_resolvers = [cirq.ParamResolver({})]
results, quantum_computer, expected_results, param_resolvers = result_builder(
mock_qpu_implementer,
bell_circuit,
param_resolvers,
executor=executors.without_quilc_compilation,
)
assert len(param_resolvers) == len(
results
), "should return a result for every element in sweepable"
for i, param_resolver in enumerate(param_resolvers):
result = results[i]
assert param_resolver == result.params
assert np.allclose(
result.measurements["m"],
expected_results[i],
), "should return an ordered list of results with correct set of measurements"
def test_executable(program: Program) -> None:
assert H(0) in program.instructions, "bell circuit should include Hadamard"
assert CNOT(0, 1) in program.instructions, "bell circuit should include CNOT"
assert (
DECLARE("m0", memory_size=2) in program.instructions
), "executable should declare a read out bit"
assert (
MEASURE(0, ("m0", 0)) in program.instructions
), "executable should measure the first qubit to the first read out bit"
assert (
MEASURE(1, ("m0", 1)) in program.instructions
), "executable should measure the second qubit to the second read out bit"
assert 0 == quantum_computer.compiler.quil_to_native_quil.call_count # type: ignore
param_sweeps = len(param_resolvers)
assert (
param_sweeps
== quantum_computer.compiler.native_quil_to_executable.call_count # type: ignore
)
for i, call_args in enumerate(
quantum_computer.compiler.native_quil_to_executable.call_args_list # type: ignore
):
test_executable(call_args[0][0])
assert param_sweeps == quantum_computer.qam.run.call_count # type: ignore
for i, call_args in enumerate(quantum_computer.qam.run.call_args_list): # type: ignore
test_executable(call_args[0][0])
| 37.452316 | 99 | 0.691524 | 1,653 | 13,745 | 5.513007 | 0.098609 | 0.060902 | 0.043784 | 0.025019 | 0.823988 | 0.796993 | 0.793482 | 0.787885 | 0.787885 | 0.769121 | 0 | 0.007681 | 0.223281 | 13,745 | 366 | 100 | 37.554645 | 0.845916 | 0.061186 | 0 | 0.722222 | 0 | 0 | 0.127027 | 0 | 0 | 0 | 0 | 0 | 0.140523 | 1 | 0.035948 | false | 0.009804 | 0.035948 | 0 | 0.081699 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f48ee7fd8bec1b0c5c0516b4e1bf8e66dd2d70b4 | 787 | py | Python | 2017/10_Oct/11/02-expandtabs.py | z727354123/pyCharmTest | 9cbd770e19929cb4feb3be2f13b60dc0b1f68b56 | [
"Apache-2.0"
] | null | null | null | 2017/10_Oct/11/02-expandtabs.py | z727354123/pyCharmTest | 9cbd770e19929cb4feb3be2f13b60dc0b1f68b56 | [
"Apache-2.0"
] | null | null | null | 2017/10_Oct/11/02-expandtabs.py | z727354123/pyCharmTest | 9cbd770e19929cb4feb3be2f13b60dc0b1f68b56 | [
"Apache-2.0"
] | null | null | null | myStr = '1\t12\t123\t1234\t12345\t123456\t'
print(myStr.expandtabs()) # 1 12 123 1234 12345 123456
print(myStr.expandtabs(7)) # 1 12 123 1234 12345 123456
print(myStr.expandtabs(6)) # 1 12 123 1234 12345 123456
print(myStr.expandtabs(5)) # 1 12 123 1234 12345 123456
print(myStr.expandtabs(4)) # 1 12 123 1234 12345 123456
print(myStr.expandtabs(3)) # 1 12 123 1234 12345 123456
print(myStr.expandtabs(2)) # 1 12 123 1234 12345 123456
print(myStr.expandtabs(1)) # 1 12 123 1234 12345 123456
print(myStr.expandtabs(0)) # 112123123412345123456
myStr = '1\t12\t\n123456789\t123456\t'
print(myStr.expandtabs())
# 1 12
# 123456789 123456
print(myStr.expandtabs(10))
# 1 12
# 123456789 123456
| 41.421053 | 76 | 0.654384 | 114 | 787 | 4.517544 | 0.22807 | 0.213592 | 0.427184 | 0.454369 | 0.751456 | 0.751456 | 0.751456 | 0.636893 | 0.636893 | 0 | 0 | 0.456667 | 0.237611 | 787 | 18 | 77 | 43.722222 | 0.401667 | 0.47014 | 0 | 0.153846 | 0 | 0 | 0.151741 | 0.151741 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.846154 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
be3da82b203b8ed525339d3c50a57c87ba1a2266 | 28,407 | py | Python | problems/reversing/obfuscatedPython1/obfuscated_python1.py | wanqizhu/treectf | 6b9178ce664a729be61492dad9bd2549c4c79104 | [
"Apache-2.0"
] | 1 | 2018-02-19T18:37:02.000Z | 2018-02-19T18:37:02.000Z | problems/reversing/obfuscatedPython1/obfuscated_python1.py | wanqizhu/treectf | 6b9178ce664a729be61492dad9bd2549c4c79104 | [
"Apache-2.0"
] | null | null | null | problems/reversing/obfuscatedPython1/obfuscated_python1.py | wanqizhu/treectf | 6b9178ce664a729be61492dad9bd2549c4c79104 | [
"Apache-2.0"
] | 2 | 2018-03-12T02:15:48.000Z | 2018-08-14T13:36:21.000Z | def sup(x, why):
j = [ ]
for i in x:
j.append( x[ int((
x.index(i )^ord( i))*
1.337) %len( x )])
return; j[:why:-1]
"""
#.#
#.#.#
#.#
"""
x=evl('[' + str(str)[(all([]) + all([]) + all([]) + all([])
+all([])+all([])+all([]))]+ eval(''+eval('str(str) [+all([])]')
+eval('str(str' + eval('str(' +str(eval) [eval(str((+all([])))
+str((+all([[]]))))]+'l'+str(eval) [eval(str((+all([])))+str((all([])+all([])+all([])
+all([])+all([])+all([]))))]
+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))
+str((all([])+all([])+all([])+all([])+all([])+all([]))))]
+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])
+all([])+all([])+all([])+all([])
+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')
+'r('+str((+all([])))+str((+all([[]])))+str((all([])+all([])+all([])
+all([])+all([])))+')')
+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]
+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))
+str((+all([[]]))))]+'l'
+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])
+all([]))))]+'at((+all([]))))[(+all([]))]')
+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]
+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([])))+')')+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'a'+eval('str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]')+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'[(+all([[]]))]'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([]
)+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+'e'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])
+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([])))+')')+eval('str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]')+'t'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((all([])+all([])+all([])+all([])+all([])))+')')+eval('str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]')+eval('str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]')+'al'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all(
[])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+'[(+all([[]]))]'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+eval('str(eval)[eval(str((+all([])))+str((+all([[]]))))]')+'r'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((all([])+all
([])+all([])+all([])+all([])))+')')+'e'+eval('str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((+all([[]])))+')')+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'[(+all([[]]))]'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])
))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'ell'+eval('str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]')+'t'+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+a
ll([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'ere'+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+eval('str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([])))+')')+'are'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((all([])+all([])))+str((+all([])))+')')+eval('str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]')+eval('str(eval)[(all([])+all([]))]')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[
eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((+all([[]])))+')')+eval('str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((all([])+all([])+all([])+all([])+all([])))+')')+eval('str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((all([])+all([])+all([])))+')')+'t'+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+
all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((all([])+all([])+all([])+all([])+all([])))+')')+'s'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((all([])+all([])+all([])+all([])+all([])))+')')+'sa'+eval('str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+
all([]))]')+'r('+str((+all([])))+str((+all([[]])))+str((all([])+all([])+all([])+all([])+all([])))+')')+eval('str(str)[+all([])]')+'e'+eval('str(eval)[eval(str((+all([])))+str((+all([[]]))))]')+eval('str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((+all([])))+str((all([])+all([])))+str((+all([[]])))+')')+eval('str(eval)[eval(str((+all([])))+str((+all([[]]))))]')+'e'+eval('str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]')+eval('str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval
)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+'['+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])+all([])))+')')+'(+all([]))]'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+str(str)[(all([])+all([])+all
([])+all([])+all([])+all([])+all([]))]+'e'+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+eval('str(str)[+all([])]')+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+']');flag == 'treeCTF{' + '_'.join([str(ord(_)) for _ in sup(x, 3)]) + '}'#)]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str
((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+'['+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])+all([])))+')')+'(+all([]))]'+eval(''+eval('str(str)[+all([])]')+eval('str(str'+eval('str('+str(eval)[eval(str((+all([])))+str((+all([[]]))))]+'l'+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+'at((+all([]))))[(+all([]))]')+str(str)[+all([])]+str(eval)[eval(str((+all([])))+str((all([])+all([])+all([])+all([])+all([])+all([]))))]+str(eval)[(all([])+all([]))]+str(eval)[(all([])+all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'t)[(all([])+all([])+all([])+all([]))]')+'r('+str((all([])+all([])+all([])+all([])))+str((all([])+all([])+all([])+all([])))+')')+str(str)[(all([])+all([])+all([])+all([])+all([])+all([])+all([]))]+'e'+str(str)[(all([])+all([])+all([])+all([])+all([]) | 359.582278 | 25,600 | 0.397578 | 3,692 | 28,407 | 3.058234 | 0.009751 | 0.702506 | 0.772385 | 0.750332 | 0.990258 | 0.990258 | 0.990081 | 0.990081 | 0.988752 | 0.988221 | 0 | 0.00022 | 0.038054 | 28,407 | 79 | 25,600 | 359.582278 | 0.412977 | 0.094097 | 0 | 0 | 0 | 0.269231 | 0.203724 | 0.148919 | 0.038462 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 1 | 1 | 
1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 14 |
be43f566227be3ffddeb72a54bdae560e70558ee | 8,851 | py | Python | tests/test_rank_filter.py | jakirkham/rank_filter | 967eafb76949eaaab0a3e90b1ce2df58427b0a4d | [
"BSD-3-Clause"
] | 3 | 2015-12-14T08:07:48.000Z | 2018-04-03T10:50:15.000Z | tests/test_rank_filter.py | jakirkham/rank_filter | 967eafb76949eaaab0a3e90b1ce2df58427b0a4d | [
"BSD-3-Clause"
] | 61 | 2015-09-30T23:34:41.000Z | 2018-10-19T01:28:16.000Z | tests/test_rank_filter.py | nanshe-org/rank_filter | 967eafb76949eaaab0a3e90b1ce2df58427b0a4d | [
"BSD-3-Clause"
] | 4 | 2015-09-30T12:03:56.000Z | 2021-02-11T14:46:51.000Z | #!/usr/bin/env python
import itertools
import sys
import unittest
import numpy
import rank_filter
if sys.version_info.major >= 3:
xrange = range
class TestRankFilter(unittest.TestCase):
def setUp(self):
self.size = 10
self.size_1 = (self.size,)
self.array_1 = numpy.zeros(self.size_1, dtype=float)
self.reverse_array_1 = numpy.zeros(self.size_1, dtype=float)
self.expected_result_1 = numpy.zeros(self.size_1, dtype=float)
self.result_1 = numpy.zeros(self.size_1, dtype=float)
self.size_2 = (self.size, self.size,)
self.array_2 = numpy.zeros(self.size_2, dtype=float)
self.reverse_array_2 = numpy.zeros(self.size_2, dtype=float)
self.expected_result_2 = numpy.zeros(self.size_2, dtype=float)
self.result_2 = numpy.zeros(self.size_2, dtype=float)
for i, in itertools.product(*[xrange(_) for _ in self.array_1.shape]):
self.array_1[i] = i
for i, in itertools.product(*[xrange(_) for _ in self.reverse_array_1.shape]):
self.reverse_array_1[i] = self.reverse_array_1.shape[0] - i - 1
for i, j in itertools.product(*[xrange(_) for _ in self.array_2.shape]):
self.array_2[i, j] = self.array_2.shape[1]*i + j
for i, j in itertools.product(*[xrange(_) for _ in self.reverse_array_2.shape]):
self.reverse_array_2[i, j] = self.reverse_array_2.shape[1]*(self.reverse_array_2.shape[0] - i - 1) +\
(self.reverse_array_2.shape[1] - j - 1)
def tearDown(self):
del self.size_1
del self.array_1
del self.reverse_array_1
del self.expected_result_1
del self.result_1
del self.size_2
del self.array_2
del self.reverse_array_2
del self.expected_result_2
del self.result_2
def test_rank_filter_1(self):
self.expected_result_1[:] = self.array_1
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.array_1, 0, 0.5, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_2(self):
self.expected_result_1[:] = self.reverse_array_1
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.reverse_array_1, 0, 0.5, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_3(self):
self.expected_result_1[:] = self.array_1
self.expected_result_1[0] = self.expected_result_1[1]
self.expected_result_1[-1] = self.expected_result_1[-2]
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.array_1, 1, 0.5, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_4(self):
self.expected_result_1[:] = self.reverse_array_1
self.expected_result_1[0] = self.expected_result_1[1]
self.expected_result_1[-1] = self.expected_result_1[-2]
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.reverse_array_1, 1, 0.5, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_5(self):
self.expected_result_1[:] = self.array_1
self.expected_result_1[0] = self.expected_result_1[1]
self.expected_result_1[-1] = self.expected_result_1[-2]
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.array_1, 2, 0.5, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_6(self):
self.expected_result_1[:] = self.reverse_array_1
self.expected_result_1[0] = self.expected_result_1[1]
self.expected_result_1[-1] = self.expected_result_1[-2]
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.reverse_array_1, 2, 0.5, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_7(self):
self.expected_result_1[:] = self.array_1
self.expected_result_1[0] = self.expected_result_1[1] = self.expected_result_1[2]
self.expected_result_1[-1] = self.expected_result_1[-2] = self.expected_result_1[-3]
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.array_1, 3, 0.5, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_8(self):
self.expected_result_1[:] = self.reverse_array_1
self.expected_result_1[0] = self.expected_result_1[1] = self.expected_result_1[2]
self.expected_result_1[-1] = self.expected_result_1[-2] = self.expected_result_1[-3]
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.reverse_array_1, 3, 0.5, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_9(self):
self.expected_result_1[:] = self.array_1
self.expected_result_1[0] = self.expected_result_1[1] = self.expected_result_1[2]
self.expected_result_1[-1] = self.expected_result_1[-2] = self.expected_result_1[-3]
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.array_1, 3, 0.5, 0, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_10(self):
self.expected_result_1[:] = self.reverse_array_1
self.expected_result_1[0] = self.expected_result_1[1] = self.expected_result_1[2]
self.expected_result_1[-1] = self.expected_result_1[-2] = self.expected_result_1[-3]
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.reverse_array_1, 3, 0.5, 0, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_11(self):
self.expected_result_2[:] = self.array_2
self.result_2[:] = 0
rank_filter.lineRankOrderFilter(self.array_2, 0, 0.5, out=self.result_2)
assert((self.expected_result_2 == self.result_2).all())
def test_rank_filter_12(self):
self.expected_result_2[:] = self.reverse_array_2
self.result_2[:] = 0
rank_filter.lineRankOrderFilter(self.reverse_array_2, 0, 0.5, out=self.result_2)
assert((self.expected_result_2 == self.result_2).all())
def test_rank_filter_13(self):
self.expected_result_1[:] = self.array_1
self.expected_result_1[0] = self.expected_result_1[1]
self.expected_result_1[-1] = self.expected_result_1[-2]
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.array_1, 1, 0.5, 0, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
def test_rank_filter_14(self):
self.expected_result_2[:] = self.reverse_array_2
self.expected_result_2[0] = self.expected_result_2[1]
self.expected_result_2[-1] = self.expected_result_2[-2]
self.result_2[:] = 0
rank_filter.lineRankOrderFilter(self.reverse_array_2, 1, 0.5, 0, out=self.result_2)
assert((self.expected_result_2 == self.result_2).all())
def test_rank_filter_15(self):
self.expected_result_2[:] = self.array_2
self.expected_result_2[..., 0] = self.expected_result_2[..., 1]
self.expected_result_2[..., -1] = self.expected_result_2[..., -2]
self.result_2[:] = 0
rank_filter.lineRankOrderFilter(self.array_2, 1, 0.5, -1, out=self.result_2)
assert((self.expected_result_2 == self.result_2).all())
def test_rank_filter_16(self):
self.expected_result_2[:] = self.reverse_array_2
self.expected_result_2[..., 0] = self.expected_result_2[..., 1]
self.expected_result_2[..., -1] = self.expected_result_2[..., -2]
self.result_2[:] = 0
rank_filter.lineRankOrderFilter(self.reverse_array_2, 1, 0.5, -1, out=self.result_2)
assert((self.expected_result_2 == self.result_2).all())
def test_rank_filter_17(self):
self.expected_result_2[:] = self.reverse_array_2
self.expected_result_2[..., 0] = self.expected_result_2[..., 1]
self.expected_result_2[..., -1] = self.expected_result_2[..., -2]
self.result_2 = self.reverse_array_2
rank_filter.lineRankOrderFilter(self.result_2, 1, 0.5, -1, out=self.result_2)
assert((self.expected_result_2 == self.result_2).all())
def test_rank_filter_18(self):
for i in xrange(len(self.array_1)):
self.expected_result_1[i] = self.array_1[5] if (i < 5) else self.array_1[4]
self.result_1[:] = 0
rank_filter.lineRankOrderFilter(self.array_1, len(self.array_1) - 1, 0.5, out=self.result_1)
assert((self.expected_result_1 == self.result_1).all())
if __name__ == "__main__":
unittest.main()
| 31.386525 | 110 | 0.66049 | 1,340 | 8,851 | 4.026119 | 0.05 | 0.14013 | 0.333642 | 0.246525 | 0.880074 | 0.868211 | 0.852271 | 0.84671 | 0.844856 | 0.79759 | 0 | 0.05848 | 0.207886 | 8,851 | 281 | 111 | 31.498221 | 0.711026 | 0.00226 | 0 | 0.477987 | 0 | 0 | 0.000906 | 0 | 0 | 0 | 0 | 0 | 0.113208 | 1 | 0.125786 | false | 0 | 0.031447 | 0 | 0.163522 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
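The expected results in these tests (for a ramp input, `result[0] == result[1]` with a half-window of 1, the first three entries equal with a half-window of 3, and so on) are consistent with a rank-order filter that uses reflected boundaries. A minimal pure-Python sketch of that behaviour — the function name and the reflect-padding choice are inferred from the expected arrays above, not taken from the compiled `rank_filter` extension:

```python
def line_rank_order_filter_1d(seq, half_window, rank=0.5):
    """Rank-order filter over a sliding window of 2*half_window + 1
    samples, with reflected (mirror) boundary handling."""
    if half_window == 0:
        return list(seq)
    # reflect padding: seq[half_window..1] on the left, seq[-2..-half_window-1] on the right
    left = [seq[i] for i in range(half_window, 0, -1)]
    right = [seq[-i - 1] for i in range(1, half_window + 1)]
    padded = left + list(seq) + right
    window = 2 * half_window + 1
    k = int(round(rank * (window - 1)))  # which order statistic to keep (0.5 -> median)
    return [sorted(padded[i:i + window])[k] for i in range(len(seq))]

ramp = list(range(10))
print(line_rank_order_filter_1d(ramp, 1))  # [1, 1, 2, 3, 4, 5, 6, 7, 8, 8]
```

On the ramp input this reproduces the edge-clamping the tests assert: the first and last `half_window` outputs equal the value `half_window` positions in from each end.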
be4cce2b77b9696e6de8bcca3988975297c1bcec | 3,028 | py | Python | tests/iterators/index_of_test.py | SSouik/pyutil | d2250fb585679e49eb9056a3051bf239a58c2e8b | [
"MIT"
] | null | null | null | tests/iterators/index_of_test.py | SSouik/pyutil | d2250fb585679e49eb9056a3051bf239a58c2e8b | [
"MIT"
] | 21 | 2022-01-05T04:51:33.000Z | 2022-01-28T05:45:57.000Z | tests/iterators/index_of_test.py | SSouik/pyutil | d2250fb585679e49eb9056a3051bf239a58c2e8b | [
"MIT"
] | null | null | null | import pytest
from pyutil import index_of
sample_data = [1, 2, 3, 4, 5, 6, 7, 8, 9]
def test_index_of_when_seq_is_empty_list():
actual = index_of([], "foo")
expected = -1
assert actual == expected
def test_index_of_when_seq_is_empty_tuple():
actual = index_of(tuple([]), "foo")
expected = -1
assert actual == expected
def test_index_of_when_seq_is_empty_string():
actual = index_of("", "foo")
expected = -1
assert actual == expected
def test_index_of_with_list():
actual = index_of(sample_data, 3)
expected = 2
assert actual == expected
def test_index_of_with_tuple():
actual = index_of(tuple(sample_data), 3)
expected = 2
assert actual == expected
def test_index_of_with_string():
actual = index_of("foobar", "b")
expected = 3
assert actual == expected
def test_index_of_with_list_with_from_index():
actual = index_of(sample_data, 3, 1)
expected = 2
assert actual == expected
def test_index_of_with_tuple_from_index():
actual = index_of(tuple(sample_data), 3, 1)
expected = 2
assert actual == expected
def test_index_of_with_string_from_index():
actual = index_of("foobar", "b", 1)
expected = 3
assert actual == expected
def test_index_of_with_list_with_from_index_not_found():
actual = index_of(sample_data, 3, 3)
expected = -1
assert actual == expected
def test_index_of_with_tuple_from_index_not_found():
actual = index_of(tuple(sample_data), 3, 3)
expected = -1
assert actual == expected
def test_index_of_with_string_from_index_not_found():
actual = index_of("foobar", "b", 4)
expected = -1
assert actual == expected
def test_index_of_with_list_with_from_index_is_negative():
actual = index_of(sample_data, 3, -2)
expected = 2
assert actual == expected
def test_index_of_with_tuple_from_index_is_negative():
actual = index_of(tuple(sample_data), 3, -2)
expected = 2
assert actual == expected
def test_index_of_with_string_from_index_is_negative():
actual = index_of("foobar", "b", -2)
expected = 3
assert actual == expected
def test_index_of_with_list_with_from_index_is_negative_not_found():
actual = index_of(sample_data, 3, -8)
expected = -1
assert actual == expected
def test_index_of_with_tuple_from_index_is_negative_not_found():
actual = index_of(tuple(sample_data), 3, -8)
expected = -1
assert actual == expected
def test_index_of_with_string_from_index_is_negative_not_found():
actual = index_of("foobar", "b", -4)
expected = -1
assert actual == expected
def test_index_of_when_from_index_is_out_of_range():
actual = index_of(sample_data, 3, 12)
expected = -1
assert actual == expected
def test_index_of_when_from_index_is_out_of_range_and_negative():
actual = index_of(sample_data, 3, -12)
expected = -1
assert actual == expected
def test_index_of_when_seq_is_not_valid():
with pytest.raises(TypeError):
index_of(123, 123)
| 23.292308 | 68 | 0.707728 | 455 | 3,028 | 4.281319 | 0.096703 | 0.154517 | 0.129363 | 0.150924 | 0.940965 | 0.905031 | 0.887577 | 0.846509 | 0.821355 | 0.783881 | 0 | 0.025021 | 0.194848 | 3,028 | 129 | 69 | 23.472868 | 0.773995 | 0 | 0 | 0.465116 | 0 | 0 | 0.014531 | 0 | 0 | 0 | 0 | 0 | 0.232558 | 1 | 0.244186 | false | 0 | 0.023256 | 0 | 0.267442 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
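Taken together, these tests pin down a fairly specific contract: `-1` rather than an exception when the value is absent, an optional `from_index`, and — judging by the negative-index cases (`-2` finds index 2 in the sample data while `-8` does not) — a backwards scan when `from_index` is negative. A hypothetical re-implementation consistent with every case above; the real `pyutil.index_of` may differ in detail:

```python
def index_of(seq, value, from_index=0):
    """Return the index of value in seq, or -1 if absent.

    A non-negative from_index scans forward from that position; a
    negative from_index is taken from the end of seq and scans
    backwards (this reading is inferred from the tests, not the library).
    """
    if not isinstance(seq, (list, tuple, str)):
        raise TypeError("seq must be a list, tuple, or str")
    n = len(seq)
    if from_index < 0:
        start = from_index + n
        if start < 0:
            return -1  # out of range on the negative side
        for i in range(start, -1, -1):  # backwards scan toward index 0
            if seq[i] == value:
                return i
        return -1
    for i in range(from_index, n):  # forward scan; empty when from_index >= n
        if seq[i] == value:
            return i
    return -1
```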
fe93a8750c705d8e8d0feb7470ac97f3a14c25f4 | 6,079 | py | Python | graphtheory/flow/edmondskarp.py | gitter-badger/graphs-dict | 2be1a5b140feb050eec799d6cadf6de5eef01745 | [
"BSD-3-Clause"
] | 36 | 2015-09-20T20:55:39.000Z | 2021-09-20T05:49:03.000Z | graphtheory/flow/edmondskarp.py | gitter-badger/graphs-dict | 2be1a5b140feb050eec799d6cadf6de5eef01745 | [
"BSD-3-Clause"
] | 6 | 2016-03-25T21:41:46.000Z | 2020-02-12T03:18:59.000Z | graphtheory/flow/edmondskarp.py | gitter-badger/graphs-dict | 2be1a5b140feb050eec799d6cadf6de5eef01745 | [
"BSD-3-Clause"
] | 9 | 2016-09-12T07:57:27.000Z | 2022-03-21T16:15:39.000Z | #!/usr/bin/python
try:
from Queue import Queue
except ImportError: # Python 3
from queue import Queue
from graphtheory.structures.edges import Edge
class EdmondsKarp:
"""The Edmonds-Karp algorithm for computing the maximum flow.
Attributes
----------
graph : input directed graph (flow network)
residual : directed graph (residual network)
flow : dict-of-dict
source : node
sink : node
max_flow : number
Notes
-----
Based on:
https://en.wikipedia.org/wiki/Edmonds-Karp_algorithm
"""
def __init__(self, graph):
"""The algorithm initialization."""
if not graph.is_directed():
raise ValueError("the graph is not directed")
self.graph = graph
self.residual = self.graph.__class__(self.graph.v(), directed=True)
for node in self.graph.iternodes():
self.residual.add_node(node)
# Initial capacities for the residual network.
for edge in self.graph.iteredges():
self.residual.add_edge(edge) # original capacity
self.residual.add_edge(Edge(edge.target, edge.source, 0))
# Legal flow.
self.flow = dict()
for source in self.graph.iternodes():
self.flow[source] = dict()
for target in self.graph.iternodes():
self.flow[source][target] = 0
# Initial flow is zero.
self.max_flow = 0
def run(self, source, sink):
"""Executable pseudocode."""
if source == sink:
raise ValueError("source and sink are the same")
self.source = source
self.sink = sink
while True:
min_capacity = self._find_path_bfs()
if min_capacity > 0:
self.max_flow += min_capacity
else:
break
def _find_path_bfs(self):
"""Finding augmenting paths in the residual network."""
parent = dict((node, None) for node in self.residual.iternodes())
# Capacity of found path to node.
capacity = {self.source: float("inf")}
Q = Queue()
Q.put(self.source)
while not Q.empty():
node = Q.get()
for edge in self.residual.iteroutedges(node):
cap = edge.weight - self.flow[edge.source][edge.target]
if cap > 0 and parent[edge.target] is None:
parent[edge.target] = edge.source
capacity[edge.target] = min(capacity[edge.source], cap)
if edge.target != self.sink:
Q.put(edge.target)
else:
# Backtrack search and write flow.
target = self.sink
while target != self.source:
node = parent[target]
self.flow[node][target] += capacity[self.sink]
self.flow[target][node] -= capacity[self.sink]
target = node
return capacity[self.sink]
return 0
class EdmondsKarpSparse:
"""The Edmonds-Karp algorithm for computing the maximum flow.
Attributes
----------
graph : input directed graph (flow network)
residual : directed graph (residual network)
flow : dict-of-dict
source : node
sink : node
max_flow : number
Notes
-----
Based on:
https://en.wikipedia.org/wiki/Edmonds-Karp_algorithm
"""
def __init__(self, graph):
"""The algorithm initialization."""
if not graph.is_directed():
raise ValueError("the graph is not directed")
self.graph = graph
self.residual = self.graph.__class__(self.graph.v(), directed=True)
for node in self.graph.iternodes():
self.residual.add_node(node)
# Legal flow.
self.flow = dict()
for node in self.graph.iternodes():
self.flow[node] = dict()
# Initial capacities for the residual network.
for edge in self.graph.iteredges():
self.residual.add_edge(edge) # original capacity
self.residual.add_edge(Edge(edge.target, edge.source, 0))
self.flow[edge.source][edge.target] = 0
self.flow[edge.target][edge.source] = 0
# Initial flow is zero.
self.max_flow = 0
def run(self, source, sink):
"""Executable pseudocode."""
if source == sink:
raise ValueError("source and sink are the same")
self.source = source
self.sink = sink
while True:
min_capacity = self._find_path_bfs()
if min_capacity > 0:
self.max_flow += min_capacity
else:
break
def _find_path_bfs(self):
"""Finding augmenting paths in the residual network."""
parent = dict((node, None) for node in self.residual.iternodes())
# Capacity of found path to node.
capacity = {self.source: float("inf")}
Q = Queue()
Q.put(self.source)
while not Q.empty():
node = Q.get()
for edge in self.residual.iteroutedges(node):
cap = edge.weight - self.flow[edge.source][edge.target]
if cap > 0 and parent[edge.target] is None:
parent[edge.target] = edge.source
capacity[edge.target] = min(capacity[edge.source], cap)
if edge.target != self.sink:
Q.put(edge.target)
else:
# Backtrack search and write flow.
target = self.sink
while target != self.source:
node = parent[target]
self.flow[node][target] += capacity[self.sink]
self.flow[target][node] -= capacity[self.sink]
target = node
return capacity[self.sink]
return 0
# EOF
| 35.138728 | 75 | 0.539562 | 676 | 6,079 | 4.778107 | 0.152367 | 0.049536 | 0.023839 | 0.020124 | 0.937461 | 0.931269 | 0.90743 | 0.87678 | 0.87678 | 0.87678 | 0 | 0.003592 | 0.358776 | 6,079 | 172 | 76 | 35.343023 | 0.825038 | 0.194604 | 0 | 0.869159 | 0 | 0 | 0.023774 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.056075 | false | 0 | 0.037383 | 0 | 0.149533 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
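Both classes share the same core loop: a BFS for a shortest augmenting path in the residual network, then pushing the path's bottleneck capacity and repeating until no path remains. Stripped of the graph library, that loop can be sketched over plain dicts — here `capacity[u][v]` holds an edge's capacity and every node must appear as a key (names and representation are illustrative, not the library's API):

```python
from collections import deque

def edmonds_karp(capacity, source, sink):
    """Maximum flow via shortest augmenting paths (BFS), mirroring the
    classes above; negative flow entries stand in for residual back-edges."""
    flow = {u: {v: 0 for v in capacity} for u in capacity}
    max_flow = 0
    while True:
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v in capacity:
                # residual capacity: forward capacity minus flow already sent
                if v not in parent and capacity[u].get(v, 0) - flow[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return max_flow  # no augmenting path left
        # backtrack the path, find its bottleneck, then push flow along it
        path = []
        v = sink
        while v != source:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(capacity[u].get(v, 0) - flow[u][v] for u, v in path)
        for u, v in path:
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
        max_flow += bottleneck

network = {"s": {"a": 3, "b": 2}, "a": {"b": 1, "t": 2},
           "b": {"t": 3}, "t": {}}
print(edmonds_karp(network, "s", "t"))  # 5
```

The example network saturates at 5: two units along s-a-t, two along s-b-t, and one more via the s-a-b-t path found on the third BFS.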
22d43f3d33651a828f2a32f8aa03a957a3b634ab | 118 | py | Python | fantasy_sports/views.py | HowardSnable/aoe_fantasy | bca280730a71a10801a3f90df05c75c2a5279c5c | [
"MIT"
] | null | null | null | fantasy_sports/views.py | HowardSnable/aoe_fantasy | bca280730a71a10801a3f90df05c75c2a5279c5c | [
"MIT"
] | null | null | null | fantasy_sports/views.py | HowardSnable/aoe_fantasy | bca280730a71a10801a3f90df05c75c2a5279c5c | [
"MIT"
] | null | null | null | from django.shortcuts import redirect, reverse
def redirect_view(request):
return redirect(reverse('boa:home'))
| 19.666667 | 46 | 0.771186 | 15 | 118 | 6 | 0.8 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.127119 | 118 | 5 | 47 | 23.6 | 0.873786 | 0 | 0 | 0 | 0 | 0 | 0.067797 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
fe03ffc6a9e47f306542afea34da874036d0e3b0 | 1,944 | py | Python | unr21-s2-individual/combined/exploit.py | albisorua/ctf-challenges | cbbeb607dd795c340c4583dfdb00964336e289ab | [
"MIT"
] | null | null | null | unr21-s2-individual/combined/exploit.py | albisorua/ctf-challenges | cbbeb607dd795c340c4583dfdb00964336e289ab | [
"MIT"
] | null | null | null | unr21-s2-individual/combined/exploit.py | albisorua/ctf-challenges | cbbeb607dd795c340c4583dfdb00964336e289ab | [
"MIT"
] | null | null | null | x = '0x430xf10x250x0b0xac0xa20x2e0xb60xb20x540x3a0x7d0x4f0x6e0x1d0x2e0x7e0xd10x460x8a0x080xa30x600x970x330x8b0x1a0x7b0xb70x8c0x4a0x820x2f0x9b0xb10x440x660xc90x510xd30x9c0x4b0x690xde0x0c0x650x050x6a0x4f0x370x170x000x670x230x340x110xf00x6d0x650x810x400xc80xc90x300xa70xd30x4e0xc50xc00x0d0x2f0x970x320x5f0x1b0xbe0x250x1a0x260x580x050x310xa80x290x090x9e0xf60xb60xbc0x680x380xf50xc40x360x810x290xdc0x650x440x330x8e0x310x890x6d0x220xda0x920x870x650x570xe20x100x580x350x2e0x650xc80x610xc50x100x200x6f0x450x800x2a0xc50x330x420xcc0xd80xf30xc00x590xfb0x7a0x300x3c0xed0xef0xdf0x020xb20x210x1a0x340x4c0xfb0x520x020x2f0x4a0xd30x8a0x310xab0xf30x1b0x0a0x570xcc0x7e0xec0x370x5c0xa20xe90x6b0xbb0x470x490x550x660xef0x040x390xde0x150xc30xf00x970x350xfd0x470x280xcd0x330x380x2a0x8e0x640x290xa30x910xf60x9e0xd60xee0x860x330xb40xbd0x5b0xa70x6b0xfd0xfd0x020x330x440xfd0x1f0x5d0x4b0xe20x9c0x1f0x330x2e0x910xf50x830xe60x970xad0x0b0x620x190x580xb40x650xc60x8c0xcc0x840x340x630xcd0xcc0xd30xdf0xec0x6a0xfa0x300x530x290x4e0x970x1d0x530x6f0x610x630xd50x6a0x1a0x1d0xdf0xea0x580xcf0x320x2e0x860x7b0x990x2b0x940x780xf40x320xce0x150x360x960x930x540x330xa50x640x5d0xe20x470x8d0x690xa00xf80xe90x390xbb0x010xdb0x1e0xd70x8a0x5c0xba0x620xaf0x700x410xcd0x7e0x440xf50x090x320xad0xb30x970xce0x680xfc0x3b0xe90x360x2b0xea0x930x900x3f0x0b0xd50xe00x630x610x6b0x9f0x790x480x430x680x320x310x020xc10xf40x390xec0x3b0x0c0xde0x610x080xa60x3a0x880xb60x080xb90x490x650x0d0x920x7e0x210x140x170xeb0xe30x620xea0xfb0x7f0x0e0x830x210xf60x1d0x650xcc0x4d0x510x970x010x060xb30x7d0x640x7b0x590x300xdb0x050x310xde0x590x340x150xe60x270xdf0x900x180x5e0x3b0x7d0x830x430xe80x780x2d0x2d0x0c0x530x870xd10xa20x340x2a0x140xfa0xb30x470xd10x190x870xb40x7f0xb80xe30xc40xf10xb50x940x8b0xaa0x590x850xa30x040x7a0x610xe50x860xe70x410x650x300xec0x4f0x610x860x4c0x2f0x5a0x0b0x420x760x20'
x = x.split('0x')
x = x[1:]
n = len(x)
flag = []
for i in range(0, n, 9):
    flag.append(chr(int(x[i], 16)))
print(''.join(flag))
| 129.6 | 1,806 | 0.96142 | 30 | 1,944 | 62.3 | 0.666667 | 0.00214 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.550262 | 0.01749 | 1,944 | 14 | 1,807 | 138.857143 | 0.428272 | 0 | 0 | 0 | 0 | 0 | 0.926955 | 0.925926 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.125 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
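The exploit row above strips the `0x` separators out of one long hex blob and keeps every ninth byte as a flag character. A minimal standalone sketch of that decode, using a hypothetical sample blob constructed so the result is known (the helper name and sample are illustrative, not from the original row):

```python
# Sketch of the decode used by the exploit above: split a run of "0xNN"
# tokens and keep every `step`-th byte as a printable character.
def decode_every_nth(blob: str, step: int = 9) -> str:
    chunks = blob.split("0x")[1:]  # drop the empty piece before the first "0x"
    return "".join(chr(int(c, 16)) for c in chunks[::step])

# Hypothetical blob: 'H' then 8 filler bytes, 'I' then 8 filler bytes,
# so picking every 9th byte recovers "HI".
sample = "".join(f"0x{b:02x}" for b in [0x48] + [0] * 8 + [0x49] + [0] * 8)
print(decode_every_nth(sample))  # -> HI
```

The step of 9 mirrors the exploit's `range(0, n, 9)`: the encoder evidently padded each real flag byte with eight junk bytes.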
a3a653f59d10625b4b5595785bfd9201381dead6 | 488 | py | Python | Platforms/Web/Processing/Api/Discord/Commands/__init__.py | The-CJ/Phaazebot | 83a9563d210718071d4e2cdcca3b212c87abaf51 | [
"MIT"
] | 2 | 2017-09-14T08:07:55.000Z | 2021-05-18T05:05:05.000Z | Platforms/Web/Processing/Api/Discord/Commands/__init__.py | The-CJ/Phaazebot | 83a9563d210718071d4e2cdcca3b212c87abaf51 | [
"MIT"
] | 111 | 2018-04-15T14:32:14.000Z | 2021-03-28T21:06:29.000Z | Platforms/Web/Processing/Api/Discord/Commands/__init__.py | The-CJ/Phaazebot | 83a9563d210718071d4e2cdcca3b212c87abaf51 | [
"MIT"
] | 1 | 2018-04-15T13:24:44.000Z | 2018-04-15T13:24:44.000Z | import Platforms.Web.Processing.Api.Discord.Commands.create as create
import Platforms.Web.Processing.Api.Discord.Commands.delete as delete
import Platforms.Web.Processing.Api.Discord.Commands.edit as edit
import Platforms.Web.Processing.Api.Discord.Commands.errors as errors
import Platforms.Web.Processing.Api.Discord.Commands.get as get
import Platforms.Web.Processing.Api.Discord.Commands.listcommands as listcommands
import Platforms.Web.Processing.Api.Discord.Commands.main as main
| 61 | 81 | 0.856557 | 70 | 488 | 5.971429 | 0.214286 | 0.251196 | 0.301435 | 0.4689 | 0.770335 | 0.770335 | 0.770335 | 0 | 0 | 0 | 0 | 0 | 0.057377 | 488 | 7 | 82 | 69.714286 | 0.908696 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
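The `__init__.py` row above aggregates sibling handler modules under one package namespace so callers can reach them as attributes of the package. A minimal in-memory sketch of the same pattern (the `commands` package and its `handle` functions are hypothetical, built at runtime for illustration):

```python
# Sketch of the submodule-aggregation pattern: each handler lives in its own
# module, and the package object exposes them by name, like the imports in
# the __init__.py above.
import sys
import types

pkg = types.ModuleType("commands")
for name in ("create", "delete", "edit"):
    mod = types.ModuleType(f"commands.{name}")
    mod.handle = lambda n=name: f"{n} handled"  # stand-in handler function
    setattr(pkg, name, mod)
    sys.modules[f"commands.{name}"] = mod
sys.modules["commands"] = pkg

import commands  # resolves to the in-memory package registered above
print(commands.edit.handle())  # -> edit handled
```

Registering each submodule in `sys.modules` and as a package attribute is what a plain `import pkg.sub as sub` inside a real `__init__.py` does implicitly.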
4330b4a63bf33af742bb98dfbc34284375a0e309 | 160,811 | py | Python | aries_cloudagent/protocols/out_of_band/v1_0/tests/test_manager.py | SNU-Blockchain-2021-Fall-Group-H/aries-cloudagent-python | a08bd0243bb27221c2d16655808ad90774ec1126 | [
"Apache-2.0"
] | 247 | 2019-07-02T21:10:21.000Z | 2022-03-30T13:55:33.000Z | aries_cloudagent/protocols/out_of_band/v1_0/tests/test_manager.py | SNU-Blockchain-2021-Fall-Group-H/aries-cloudagent-python | a08bd0243bb27221c2d16655808ad90774ec1126 | [
"Apache-2.0"
] | 1,462 | 2019-07-02T20:57:30.000Z | 2022-03-31T23:13:35.000Z | aries_cloudagent/protocols/out_of_band/v1_0/tests/test_manager.py | SNU-Blockchain-2021-Fall-Group-H/aries-cloudagent-python | a08bd0243bb27221c2d16655808ad90774ec1126 | [
"Apache-2.0"
] | 377 | 2019-06-20T21:01:31.000Z | 2022-03-30T08:27:53.000Z | """Test OOB Manager."""
import asyncio
import json
from asynctest import mock as async_mock, TestCase as AsyncTestCase
from copy import deepcopy
from datetime import datetime, timezone
from uuid import UUID
from .....connections.models.conn_record import ConnRecord
from .....connections.models.connection_target import ConnectionTarget
from .....connections.models.diddoc import DIDDoc, PublicKey, PublicKeyType, Service
from .....core.in_memory import InMemoryProfile
from .....core.profile import ProfileSession
from .....did.did_key import DIDKey
from .....indy.holder import IndyHolder
from .....indy.models.pres_preview import (
IndyPresAttrSpec,
IndyPresPredSpec,
IndyPresPreview,
)
from .....messaging.decorators.attach_decorator import AttachDecorator
from .....messaging.responder import BaseResponder, MockResponder
from .....messaging.util import str_to_epoch
from .....multitenant.base import BaseMultitenantManager
from .....multitenant.manager import MultitenantManager
from .....protocols.coordinate_mediation.v1_0.models.mediation_record import (
MediationRecord,
)
from .....protocols.coordinate_mediation.v1_0.manager import MediationManager
from .....protocols.didexchange.v1_0.manager import DIDXManager
from .....protocols.issue_credential.v1_0.manager import (
CredentialManager as V10CredManager,
)
from .....protocols.issue_credential.v1_0.messages.credential_offer import (
CredentialOffer as V10CredOffer,
)
from .....protocols.issue_credential.v1_0.messages.inner.credential_preview import (
CredentialPreview as V10CredentialPreview,
CredAttrSpec as V10CredAttrSpec,
)
from .....protocols.issue_credential.v1_0.tests import (
INDY_OFFER,
INDY_CRED_REQ,
)
from .....protocols.issue_credential.v2_0.manager import V20CredManager
from .....protocols.issue_credential.v2_0.messages.cred_format import V20CredFormat
from .....protocols.issue_credential.v2_0.messages.inner.cred_preview import (
V20CredPreview,
V20CredAttrSpec,
)
from .....protocols.issue_credential.v2_0.messages.cred_offer import V20CredOffer
from .....protocols.issue_credential.v2_0.models.cred_ex_record import V20CredExRecord
from .....protocols.issue_credential.v2_0.message_types import (
ATTACHMENT_FORMAT as V20_CRED_ATTACH_FORMAT,
CRED_20_OFFER,
)
from .....protocols.present_proof.v1_0.manager import PresentationManager
from .....protocols.present_proof.v1_0.message_types import (
PRESENTATION_REQUEST,
ATTACH_DECO_IDS as V10_PRES_ATTACH_FORMAT,
)
from .....protocols.present_proof.v1_0.messages.presentation import Presentation
from .....protocols.present_proof.v1_0.messages.presentation_request import (
PresentationRequest,
)
from .....protocols.present_proof.v1_0.models.presentation_exchange import (
V10PresentationExchange,
)
from .....protocols.present_proof.v2_0.manager import V20PresManager
from .....protocols.present_proof.v2_0.message_types import (
ATTACHMENT_FORMAT as V20_PRES_ATTACH_FORMAT,
PRES_20,
PRES_20_REQUEST,
)
from .....protocols.present_proof.v2_0.messages.pres import V20Pres
from .....protocols.present_proof.v2_0.messages.pres_format import V20PresFormat
from .....protocols.present_proof.v2_0.messages.pres_request import V20PresRequest
from .....storage.error import StorageNotFoundError
from .....storage.vc_holder.base import VCHolder
from .....storage.vc_holder.vc_record import VCRecord
from .....transport.inbound.receipt import MessageReceipt
from .....wallet.did_info import DIDInfo, KeyInfo
from .....wallet.did_method import DIDMethod
from .....wallet.in_memory import InMemoryWallet
from .....wallet.key_type import KeyType
from ....didcomm_prefix import DIDCommPrefix
from ....issue_credential.v1_0.models.credential_exchange import V10CredentialExchange
from .. import manager as test_module
from ..manager import (
OutOfBandManager,
OutOfBandManagerError,
)
from ..message_types import INVITATION
from ..messages.invitation import HSProto, InvitationMessage
from ..messages.reuse import HandshakeReuse
from ..messages.reuse_accept import HandshakeReuseAccept
from ..messages.problem_report import ProblemReport, ProblemReportReason
from ..models.invitation import InvitationRecord
class TestConfig:
test_did = "55GkHamhTU1ZbTbV2ab9DE"
test_verkey = "3Dn1SJNPaCXcvvJvSbsFWP2xaCjMom3can8CQNhWrTRx"
test_endpoint = "http://localhost"
test_target_did = "GbuDUYXaUZRfHD2jeDuQuP"
their_public_did = "55GkHamhTU1ZbTbV2ab9DE"
NOW_8601 = datetime.utcnow().replace(tzinfo=timezone.utc).isoformat(" ", "seconds")
NOW_EPOCH = str_to_epoch(NOW_8601)
CD_ID = "GMm4vMw8LLrLJjp81kRRLp:3:CL:12:tag"
INDY_PROOF_REQ = json.loads(
f"""{{
"name": "proof-req",
"version": "1.0",
"nonce": "12345",
"requested_attributes": {{
"0_player_uuid": {{
"name": "player",
"restrictions": [
{{
"cred_def_id": "{CD_ID}"
}}
],
"non_revoked": {{
"from": {NOW_EPOCH},
"to": {NOW_EPOCH}
}}
}},
"0_screencapture_uuid": {{
"name": "screenCapture",
"restrictions": [
{{
"cred_def_id": "{CD_ID}"
}}
],
"non_revoked": {{
"from": {NOW_EPOCH},
"to": {NOW_EPOCH}
}}
}}
}},
"requested_predicates": {{
"0_highscore_GE_uuid": {{
"name": "highScore",
"p_type": ">=",
"p_value": 1000000,
"restrictions": [
{{
"cred_def_id": "{CD_ID}"
}}
],
"non_revoked": {{
"from": {NOW_EPOCH},
"to": {NOW_EPOCH}
}}
}}
}}
}}"""
)
DIF_PROOF_REQ = {
"presentation_definition": {
"id": "32f54163-7166-48f1-93d8-ff217bdb0654",
"submission_requirements": [
{
"name": "Citizenship Information",
"rule": "pick",
"min": 1,
"from": "A",
}
],
"input_descriptors": [
{
"id": "citizenship_input_1",
"name": "EU Driver's License",
"group": ["A"],
"schema": [
{
"uri": "https://www.w3.org/2018/credentials#VerifiableCredential"
}
],
"constraints": {
"limit_disclosure": "required",
"fields": [
{
"path": ["$.credentialSubject.givenName"],
"purpose": "The claim must be from one of the specified issuers",
"filter": {
"type": "string",
"enum": ["JOHN", "CAI"],
},
}
],
},
}
],
},
}
PRES_PREVIEW = IndyPresPreview(
attributes=[
IndyPresAttrSpec(name="player", cred_def_id=CD_ID, value="Richie Knucklez"),
IndyPresAttrSpec(
name="screenCapture",
cred_def_id=CD_ID,
mime_type="image/png",
value="aW1hZ2luZSBhIHNjcmVlbiBjYXB0dXJl",
),
],
predicates=[
IndyPresPredSpec(
name="highScore", cred_def_id=CD_ID, predicate=">=", threshold=1000000
)
],
)
PRES_REQ_V1 = PresentationRequest(
comment="Test",
request_presentations_attach=[
AttachDecorator.data_base64(
mapping=INDY_PROOF_REQ,
ident=V10_PRES_ATTACH_FORMAT[PRESENTATION_REQUEST],
)
],
)
pres_req_dict = PRES_REQ_V1.request_presentations_attach[0].serialize()
req_attach_v1 = {
"@id": "request-0",
"mime-type": "application/json",
"data": {
"json": {
"@type": DIDCommPrefix.qualify_current(PRESENTATION_REQUEST),
"@id": "12345678-0123-4567-1234-567812345678",
"comment": "some comment",
"request_presentations~attach": [pres_req_dict],
}
},
}
PRES_REQ_V2 = V20PresRequest(
comment="some comment",
will_confirm=True,
formats=[
V20PresFormat(
attach_id="indy",
format_=V20_PRES_ATTACH_FORMAT[PRES_20_REQUEST][
V20PresFormat.Format.INDY.api
],
)
],
request_presentations_attach=[
AttachDecorator.data_base64(mapping=INDY_PROOF_REQ, ident="indy")
],
)
DIF_PRES_REQ_V2 = V20PresRequest(
comment="some comment",
will_confirm=True,
formats=[
V20PresFormat(
attach_id="dif",
format_=V20_PRES_ATTACH_FORMAT[PRES_20_REQUEST][
V20PresFormat.Format.DIF.api
],
)
],
request_presentations_attach=[
AttachDecorator.data_json(mapping=DIF_PROOF_REQ, ident="dif")
],
)
CRED_OFFER_V1 = V10CredOffer(
credential_preview=V10CredentialPreview(
attributes=(
V10CredAttrSpec(name="legalName", value="value"),
V10CredAttrSpec(name="jurisdictionId", value="value"),
V10CredAttrSpec(name="incorporationDate", value="value"),
)
),
offers_attach=[V10CredOffer.wrap_indy_offer(INDY_OFFER)],
)
CRED_OFFER_V2 = V20CredOffer(
credential_preview=V20CredPreview(
attributes=V20CredAttrSpec.list_plain(
{
"legalName": "value",
"jurisdictionId": "value",
"incorporationDate": "value",
}
),
),
formats=[
V20CredFormat(
attach_id="indy",
format_=V20_CRED_ATTACH_FORMAT[CRED_20_OFFER][
V20CredFormat.Format.INDY.api
],
)
],
offers_attach=[AttachDecorator.data_base64(INDY_OFFER, ident="indy")],
)
req_attach_v2 = AttachDecorator.data_json(
mapping=PRES_REQ_V2.serialize(),
ident="request-0",
).serialize()
indy_cred_req = {
"schema_id": f"{test_did}:2:bc-reg:1.0",
"cred_def_id": f"{test_did}:3:CL:12:tag1",
}
cred_req_meta = {}
def make_did_doc(self, did, verkey):
doc = DIDDoc(did=did)
controller = did
ident = "1"
pk_value = verkey
pk = PublicKey(
did, ident, pk_value, PublicKeyType.ED25519_SIG_2018, controller, False
)
doc.set(pk)
recip_keys = [pk]
router_keys = []
service = Service(
did, "indy", "IndyAgent", recip_keys, router_keys, TestConfig.test_endpoint
)
doc.set(service)
return doc
class TestOOBManager(AsyncTestCase, TestConfig):
def setUp(self):
self.responder = MockResponder()
self.responder.send = async_mock.CoroutineMock()
self.profile = InMemoryProfile.test_profile(
{
"default_endpoint": TestConfig.test_endpoint,
"default_label": "This guy",
"additional_endpoints": ["http://aries.ca/another-endpoint"],
"debug.auto_accept_invites": True,
"debug.auto_accept_requests": True,
}
)
self.profile.context.injector.bind_instance(BaseResponder, self.responder)
self.mt_mgr = async_mock.MagicMock()
self.mt_mgr = async_mock.create_autospec(MultitenantManager)
self.profile.context.injector.bind_instance(BaseMultitenantManager, self.mt_mgr)
self.multitenant_mgr = async_mock.MagicMock(MultitenantManager, autospec=True)
self.profile.context.injector.bind_instance(
BaseMultitenantManager, self.multitenant_mgr
)
self.manager = OutOfBandManager(self.profile)
assert self.manager.profile
self.manager.resolve_invitation = async_mock.CoroutineMock()
self.manager.resolve_invitation.return_value = (
TestConfig.test_endpoint,
[TestConfig.test_verkey],
[],
)
self.test_conn_rec = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_role=None,
state=ConnRecord.State.COMPLETED,
their_public_did=self.their_public_did,
)
self.test_mediator_routing_keys = [
"3Dn1SJNPaCXcvvJvSbsFWP2xaCjMom3can8CQNhWrTRR"
]
self.test_mediator_conn_id = "mediator-conn-id"
self.test_mediator_endpoint = "http://mediator.example.com"
async def test_create_invitation_handshake_succeeds(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
InMemoryWallet, "get_public_did", autospec=True
) as mock_wallet_get_public_did:
mock_wallet_get_public_did.return_value = DIDInfo(
TestConfig.test_did,
TestConfig.test_verkey,
None,
method=DIDMethod.SOV,
key_type=KeyType.ED25519,
)
invi_rec = await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
public=True,
hs_protos=[HSProto.RFC23],
)
assert invi_rec._invitation.ser["@type"] == DIDCommPrefix.qualify_current(
INVITATION
)
assert not invi_rec._invitation.ser.get("requests~attach")
assert (
DIDCommPrefix.qualify_current(HSProto.RFC23.name)
in invi_rec.invitation.handshake_protocols
)
assert invi_rec._invitation.ser["services"] == [
f"did:sov:{TestConfig.test_did}"
]
async def test_create_invitation_mediation_overwrites_routing_and_endpoint(self):
async with self.profile.session() as session:
mock_conn_rec = async_mock.MagicMock()
mediation_record = MediationRecord(
role=MediationRecord.ROLE_CLIENT,
state=MediationRecord.STATE_GRANTED,
connection_id=self.test_mediator_conn_id,
routing_keys=self.test_mediator_routing_keys,
endpoint=self.test_mediator_endpoint,
)
await mediation_record.save(session)
with async_mock.patch.object(
MediationManager,
"get_default_mediator_id",
) as mock_get_default_mediator, async_mock.patch.object(
mock_conn_rec, "metadata_set", async_mock.CoroutineMock()
) as mock_metadata_set:
invite = await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
my_label="test123",
hs_protos=[HSProto.RFC23],
mediation_id=mediation_record.mediation_id,
)
assert isinstance(invite, InvitationRecord)
assert invite._invitation.ser["@type"] == DIDCommPrefix.qualify_current(
INVITATION
)
assert invite.invitation.label == "test123"
mock_get_default_mediator.assert_not_called()
async def test_create_invitation_multitenant_local(self):
self.profile.context.update_settings(
{
"multitenant.enabled": True,
"wallet.id": "test_wallet",
}
)
self.multitenant_mgr.add_key = async_mock.CoroutineMock()
with async_mock.patch.object(
InMemoryWallet, "create_signing_key", autospec=True
) as mock_wallet_create_signing_key, async_mock.patch.object(
self.multitenant_mgr, "get_default_mediator"
) as mock_get_default_mediator:
mock_wallet_create_signing_key.return_value = KeyInfo(
TestConfig.test_verkey, None, KeyType.ED25519
)
mock_get_default_mediator.return_value = MediationRecord()
await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
hs_protos=[HSProto.RFC23],
multi_use=False,
)
self.multitenant_mgr.add_key.assert_called_once_with(
"test_wallet", TestConfig.test_verkey
)
async def test_create_invitation_multitenant_public(self):
self.profile.context.update_settings(
{
"multitenant.enabled": True,
"wallet.id": "test_wallet",
"public_invites": True,
}
)
self.multitenant_mgr.add_key = async_mock.CoroutineMock()
with async_mock.patch.object(
InMemoryWallet, "get_public_did", autospec=True
) as mock_wallet_get_public_did:
mock_wallet_get_public_did.return_value = DIDInfo(
self.test_did,
self.test_verkey,
None,
method=DIDMethod.SOV,
key_type=KeyType.ED25519,
)
await self.manager.create_invitation(
hs_protos=[HSProto.RFC23],
public=True,
multi_use=False,
)
self.multitenant_mgr.add_key.assert_called_once_with(
"test_wallet", TestConfig.test_verkey, skip_if_exists=True
)
async def test_create_invitation_no_handshake_no_attachments_x(self):
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
public=True,
hs_protos=None,
multi_use=False,
)
assert "Invitation must include" in str(context.exception)
async def test_create_invitation_attachment_v1_0_cred_offer(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
InMemoryWallet, "get_public_did", autospec=True
) as mock_wallet_get_public_did, async_mock.patch.object(
V10CredentialExchange,
"retrieve_by_id",
async_mock.CoroutineMock(),
) as mock_retrieve_cxid:
mock_wallet_get_public_did.return_value = DIDInfo(
TestConfig.test_did,
TestConfig.test_verkey,
None,
method=DIDMethod.SOV,
key_type=KeyType.ED25519,
)
mock_retrieve_cxid.return_value = async_mock.MagicMock(
credential_offer_dict=self.CRED_OFFER_V1
)
invi_rec = await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
public=True,
hs_protos=[HSProto.RFC23],
multi_use=False,
attachments=[{"type": "credential-offer", "id": "dummy-id"}],
)
assert isinstance(invi_rec, InvitationRecord)
async def test_create_invitation_attachment_v1_0_cred_offer_no_handshake(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
InMemoryWallet, "get_public_did", autospec=True
) as mock_wallet_get_public_did, async_mock.patch.object(
V10CredentialExchange,
"retrieve_by_id",
async_mock.CoroutineMock(),
) as mock_retrieve_cxid:
mock_wallet_get_public_did.return_value = DIDInfo(
TestConfig.test_did,
TestConfig.test_verkey,
None,
method=DIDMethod.SOV,
key_type=KeyType.ED25519,
)
mock_retrieve_cxid.return_value = async_mock.MagicMock(
credential_offer_dict=self.CRED_OFFER_V1
)
invi_rec = await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
public=True,
hs_protos=None,
multi_use=False,
attachments=[{"type": "credential-offer", "id": "dummy-id"}],
)
assert isinstance(invi_rec, InvitationRecord)
assert not invi_rec._invitation.ser["handshake_protocols"]
async def test_create_invitation_attachment_v2_0_cred_offer(self):
with async_mock.patch.object(
InMemoryWallet, "get_public_did", autospec=True
) as mock_wallet_get_public_did, async_mock.patch.object(
test_module.V10CredentialExchange,
"retrieve_by_id",
async_mock.CoroutineMock(),
) as mock_retrieve_cxid_v1, async_mock.patch.object(
test_module.V20CredExRecord,
"retrieve_by_id",
async_mock.CoroutineMock(),
) as mock_retrieve_cxid_v2:
mock_wallet_get_public_did.return_value = DIDInfo(
TestConfig.test_did,
TestConfig.test_verkey,
None,
method=DIDMethod.SOV,
key_type=KeyType.ED25519,
)
mock_retrieve_cxid_v1.side_effect = test_module.StorageNotFoundError()
mock_retrieve_cxid_v2.return_value = async_mock.MagicMock(
cred_offer=async_mock.MagicMock(
serialize=async_mock.MagicMock(
return_value=json.dumps({"cred": "offer"})
)
)
)
invi_rec = await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
public=False,
hs_protos=None,
multi_use=False,
attachments=[{"type": "credential-offer", "id": "dummy-id"}],
)
assert invi_rec._invitation.ser["requests~attach"]
async def test_create_invitation_attachment_present_proof_v1_0(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
InMemoryWallet, "get_public_did", autospec=True
) as mock_wallet_get_public_did, async_mock.patch.object(
test_module.V10PresentationExchange,
"retrieve_by_id",
async_mock.CoroutineMock(),
) as mock_retrieve_pxid:
mock_wallet_get_public_did.return_value = DIDInfo(
TestConfig.test_did,
TestConfig.test_verkey,
None,
method=DIDMethod.SOV,
key_type=KeyType.ED25519,
)
mock_retrieve_pxid.return_value = async_mock.MagicMock(
presentation_request_dict=self.PRES_REQ_V1
)
invi_rec = await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
public=True,
hs_protos=[test_module.HSProto.RFC23],
multi_use=False,
attachments=[{"type": "present-proof", "id": "dummy-id"}],
)
assert invi_rec._invitation.ser["requests~attach"]
mock_retrieve_pxid.assert_called_once()
assert isinstance(mock_retrieve_pxid.call_args[0][0], ProfileSession)
assert mock_retrieve_pxid.call_args[0][1] == "dummy-id"
async def test_create_invitation_attachment_present_proof_v2_0(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
InMemoryWallet, "get_public_did", autospec=True
) as mock_wallet_get_public_did, async_mock.patch.object(
test_module.V10PresentationExchange,
"retrieve_by_id",
async_mock.CoroutineMock(),
) as mock_retrieve_pxid_1, async_mock.patch.object(
test_module.V20PresExRecord,
"retrieve_by_id",
async_mock.CoroutineMock(),
) as mock_retrieve_pxid_2:
mock_wallet_get_public_did.return_value = DIDInfo(
TestConfig.test_did,
TestConfig.test_verkey,
None,
method=DIDMethod.SOV,
key_type=KeyType.ED25519,
)
mock_retrieve_pxid_1.side_effect = StorageNotFoundError()
mock_retrieve_pxid_2.return_value = async_mock.MagicMock(
pres_request=TestConfig.PRES_REQ_V2
)
invi_rec = await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
public=True,
hs_protos=[test_module.HSProto.RFC23],
multi_use=False,
attachments=[{"type": "present-proof", "id": "dummy-id"}],
)
assert invi_rec._invitation.ser["requests~attach"]
mock_retrieve_pxid_1.assert_called_once()
assert isinstance(mock_retrieve_pxid_1.call_args[0][0], ProfileSession)
assert mock_retrieve_pxid_1.call_args[0][1] == "dummy-id"
mock_retrieve_pxid_2.assert_called_once()
assert isinstance(mock_retrieve_pxid_2.call_args[0][0], ProfileSession)
assert mock_retrieve_pxid_2.call_args[0][1] == "dummy-id"
async def test_dif_req_v2_attach_pres_existing_conn_auto_present_pres_msg_with_challenge(
self,
):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_presentation_request": True}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
dif_proof_req = deepcopy(TestConfig.DIF_PROOF_REQ)
dif_proof_req["options"] = {}
dif_proof_req["options"][
"challenge"
] = "3fa85f64-5717-4562-b3fc-2c963f66afa7"
dif_pres_req_v2 = V20PresRequest(
comment="some comment",
will_confirm=True,
formats=[
V20PresFormat(
attach_id="dif",
format_=V20_PRES_ATTACH_FORMAT[PRES_20_REQUEST][
V20PresFormat.Format.DIF.api
],
)
],
request_presentations_attach=[
AttachDecorator.data_json(mapping=dif_proof_req, ident="dif")
],
)
px2_rec = test_module.V20PresExRecord(
auto_present=True,
pres_request=dif_pres_req_v2.serialize(),
)
dif_req_attach_v2 = AttachDecorator.data_json(
mapping=dif_pres_req_v2.serialize(),
ident="request-0",
).serialize()
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
V20PresManager,
"receive_pres_request",
autospec=True,
) as pres_mgr_receive_pres_req, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report, async_mock.patch.object(
V20PresManager,
"create_pres",
autospec=True,
) as pres_mgr_create_pres:
oob_mgr_find_existing_conn.return_value = test_exist_conn
pres_mgr_receive_pres_req.return_value = px2_rec
pres_mgr_create_pres.return_value = (
px2_rec,
V20Pres(
formats=[
V20PresFormat(
attach_id="dif",
format_=V20_PRES_ATTACH_FORMAT[PRES_20][
V20PresFormat.Format.DIF.api
],
)
],
presentations_attach=[
AttachDecorator.data_json(
mapping={"bogus": "proof"},
ident="dif",
)
],
),
)
self.profile.context.injector.bind_instance(
VCHolder,
async_mock.MagicMock(
search_credentials=async_mock.MagicMock(
return_value=async_mock.MagicMock(
fetch=async_mock.CoroutineMock(
return_value=[
VCRecord(
contexts=[
"https://www.w3.org/2018/credentials/v1",
"https://www.w3.org/2018/credentials/examples/v1",
],
expanded_types=[
"https://www.w3.org/2018/credentials#VerifiableCredential",
"https://example.org/examples#UniversityDegreeCredential",
],
issuer_id="https://example.edu/issuers/565049",
subject_ids=[
"did:example:ebfeb1f712ebc6f1c276e12ec21"
],
proof_types=["Ed25519Signature2018"],
schema_ids=[
"https://example.org/examples/degree.json"
],
cred_value={"...": "..."},
given_id="http://example.edu/credentials/3732",
cred_tags={"some": "tag"},
)
]
)
)
)
),
)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[AttachDecorator.deserialize(dif_req_attach_v2)],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
conn_rec = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert conn_rec is not None
async def test_dif_req_v2_attach_pres_existing_conn_auto_present_pres_msg_with_nonce(
self,
):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_presentation_request": True}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
dif_proof_req = deepcopy(TestConfig.DIF_PROOF_REQ)
dif_proof_req["options"] = {}
dif_proof_req["options"]["nonce"] = "12345"
dif_pres_req_v2 = V20PresRequest(
comment="some comment",
will_confirm=True,
formats=[
V20PresFormat(
attach_id="dif",
format_=V20_PRES_ATTACH_FORMAT[PRES_20_REQUEST][
V20PresFormat.Format.DIF.api
],
)
],
request_presentations_attach=[
AttachDecorator.data_json(mapping=dif_proof_req, ident="dif")
],
)
px2_rec = test_module.V20PresExRecord(
auto_present=True,
pres_request=dif_pres_req_v2.serialize(),
)
dif_req_attach_v2 = AttachDecorator.data_json(
mapping=dif_pres_req_v2.serialize(),
ident="request-0",
).serialize()
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
V20PresManager,
"receive_pres_request",
autospec=True,
) as pres_mgr_receive_pres_req, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report, async_mock.patch.object(
V20PresManager,
"create_pres",
autospec=True,
) as pres_mgr_create_pres:
oob_mgr_find_existing_conn.return_value = test_exist_conn
pres_mgr_receive_pres_req.return_value = px2_rec
pres_mgr_create_pres.return_value = (
px2_rec,
V20Pres(
formats=[
V20PresFormat(
attach_id="dif",
format_=V20_PRES_ATTACH_FORMAT[PRES_20][
V20PresFormat.Format.DIF.api
],
)
],
presentations_attach=[
AttachDecorator.data_json(
mapping={"bogus": "proof"},
ident="dif",
)
],
),
)
self.profile.context.injector.bind_instance(
VCHolder,
async_mock.MagicMock(
search_credentials=async_mock.MagicMock(
return_value=async_mock.MagicMock(
fetch=async_mock.CoroutineMock(
return_value=[
VCRecord(
contexts=[
"https://www.w3.org/2018/credentials/v1",
"https://www.w3.org/2018/credentials/examples/v1",
],
expanded_types=[
"https://www.w3.org/2018/credentials#VerifiableCredential",
"https://example.org/examples#UniversityDegreeCredential",
],
issuer_id="https://example.edu/issuers/565049",
subject_ids=[
"did:example:ebfeb1f712ebc6f1c276e12ec21"
],
proof_types=["Ed25519Signature2018"],
schema_ids=[
"https://example.org/examples/degree.json"
],
cred_value={"...": "..."},
given_id="http://example.edu/credentials/3732",
cred_tags={"some": "tag"},
)
]
)
)
)
),
)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[AttachDecorator.deserialize(dif_req_attach_v2)],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
conn_rec = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert conn_rec is not None
async def test_create_invitation_public_x_no_public_invites(self):
self.profile.context.update_settings({"public_invites": False})
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.create_invitation(
public=True,
my_endpoint="testendpoint",
hs_protos=[test_module.HSProto.RFC23],
multi_use=False,
)
assert "Public invitations are not enabled" in str(context.exception)
async def test_create_invitation_public_x_multi_use(self):
self.profile.context.update_settings({"public_invites": True})
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.create_invitation(
public=True,
my_endpoint="testendpoint",
hs_protos=[test_module.HSProto.RFC23],
multi_use=True,
)
assert "Cannot create public invitation with" in str(context.exception)
async def test_create_invitation_public_x_no_public_did(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
InMemoryWallet, "get_public_did", autospec=True
) as mock_wallet_get_public_did:
mock_wallet_get_public_did.return_value = None
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.create_invitation(
public=True,
my_endpoint="testendpoint",
hs_protos=[test_module.HSProto.RFC23],
multi_use=False,
)
assert "Cannot create public invitation with no public DID" in str(
context.exception
)
async def test_create_invitation_attachment_x(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
InMemoryWallet, "get_public_did", autospec=True
) as mock_wallet_get_public_did:
mock_wallet_get_public_did.return_value = DIDInfo(
TestConfig.test_did,
TestConfig.test_verkey,
None,
method=DIDMethod.SOV,
key_type=KeyType.ED25519,
)
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
public=False,
hs_protos=[test_module.HSProto.RFC23],
multi_use=True,
attachments=[{"having": "attachment", "is": "no", "good": "here"}],
)
assert "Unknown attachment type" in str(context.exception)
async def test_create_invitation_peer_did(self):
async with self.profile.session() as session:
self.profile.context.update_settings(
{
"multitenant.enabled": True,
"wallet.id": "my-wallet",
}
)
mediation_record = MediationRecord(
role=MediationRecord.ROLE_CLIENT,
state=MediationRecord.STATE_GRANTED,
connection_id=self.test_mediator_conn_id,
routing_keys=self.test_mediator_routing_keys,
endpoint=self.test_mediator_endpoint,
)
await mediation_record.save(session)
with async_mock.patch.object(
self.multitenant_mgr, "get_default_mediator"
) as mock_get_default_mediator:
mock_get_default_mediator.return_value = mediation_record
invi_rec = await self.manager.create_invitation(
my_label="That guy",
my_endpoint=None,
public=False,
hs_protos=[test_module.HSProto.RFC23],
multi_use=False,
)
assert invi_rec._invitation.ser[
"@type"
] == DIDCommPrefix.qualify_current(INVITATION)
assert not invi_rec._invitation.ser.get("requests~attach")
assert invi_rec.invitation.label == "That guy"
assert (
DIDCommPrefix.qualify_current(HSProto.RFC23.name)
in invi_rec.invitation.handshake_protocols
)
service = invi_rec._invitation.ser["services"][0]
assert service["id"] == "#inline"
assert service["type"] == "did-communication"
assert len(service["recipientKeys"]) == 1
assert (
service["routingKeys"][0]
== DIDKey.from_public_key_b58(
self.test_mediator_routing_keys[0], KeyType.ED25519
).did
)
assert service["serviceEndpoint"] == self.test_mediator_endpoint
async def test_create_invitation_metadata_assigned(self):
async with self.profile.session() as session:
invi_rec = await self.manager.create_invitation(
hs_protos=[test_module.HSProto.RFC23],
metadata={"hello": "world"},
)
service = invi_rec._invitation.ser["services"][0]
invitation_key = DIDKey.from_did(service["recipientKeys"][0]).public_key_b58
record = await ConnRecord.retrieve_by_invitation_key(
session, invitation_key
)
assert await record.metadata_get_all(session) == {"hello": "world"}
async def test_create_invitation_x_public_metadata(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
InMemoryWallet, "get_public_did", autospec=True
) as mock_wallet_get_public_did:
mock_wallet_get_public_did.return_value = DIDInfo(
TestConfig.test_did,
TestConfig.test_verkey,
None,
method=DIDMethod.SOV,
key_type=KeyType.ED25519,
)
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.create_invitation(
public=True,
hs_protos=[test_module.HSProto.RFC23],
metadata={"hello": "world"},
multi_use=False,
)
assert "Cannot store metadata on public" in str(context.exception)
async def test_receive_invitation_with_valid_mediation(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
mediation_record = MediationRecord(
role=MediationRecord.ROLE_CLIENT,
state=MediationRecord.STATE_GRANTED,
connection_id=self.test_mediator_conn_id,
routing_keys=self.test_mediator_routing_keys,
endpoint=self.test_mediator_endpoint,
)
await mediation_record.save(session)
with async_mock.patch.object(
DIDXManager, "receive_invitation", async_mock.CoroutineMock()
) as mock_didx_recv_invi:
invite = await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
my_label="test123",
hs_protos=[HSProto.RFC23],
)
invi_msg = invite.invitation
invitee_record = await self.manager.receive_invitation(
invitation=invi_msg,
mediation_id=mediation_record._id,
)
mock_didx_recv_invi.assert_called_once_with(
invitation=invi_msg,
their_public_did=None,
auto_accept=None,
alias=None,
mediation_id=mediation_record._id,
)
async def test_receive_invitation_with_invalid_mediation(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
async_mock.CoroutineMock(),
) as mock_didx_recv_invi:
invite = await self.manager.create_invitation(
my_endpoint=TestConfig.test_endpoint,
my_label="test123",
hs_protos=[HSProto.RFC23],
)
invi_msg = invite.invitation
invitee_record = await self.manager.receive_invitation(
invi_msg,
mediation_id="test-mediation-id",
)
mock_didx_recv_invi.assert_called_once_with(
invitation=invi_msg,
their_public_did=None,
auto_accept=None,
alias=None,
mediation_id=None,
)
async def test_receive_invitation_didx_services_with_service_block(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
test_module, "DIDXManager", autospec=True
) as didx_mgr_cls, async_mock.patch.object(
test_module,
"InvitationMessage",
autospec=True,
) as invi_msg_cls:
didx_mgr_cls.return_value = async_mock.MagicMock(
receive_invitation=async_mock.CoroutineMock()
)
mock_oob_invi = async_mock.MagicMock(
requests_attach=[],
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[
async_mock.MagicMock(
recipient_keys=["dummy"],
routing_keys=[],
)
],
)
invi_msg_cls.deserialize.return_value = mock_oob_invi
await self.manager.receive_invitation(mock_oob_invi)
async def test_receive_invitation_connection_mock(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
test_module, "ConnectionManager", autospec=True
) as conn_mgr_cls, async_mock.patch.object(
test_module,
"InvitationMessage",
autospec=True,
) as invi_msg_cls, async_mock.patch.object(
self.manager,
"receive_invitation",
async_mock.CoroutineMock(),
) as mock_receive_invitation:
mock_receive_invitation.return_value = self.test_conn_rec.serialize()
conn_mgr_cls.return_value = async_mock.MagicMock(
receive_invitation=async_mock.CoroutineMock()
)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC160.name) for pfx in DIDCommPrefix
],
label="test",
_id="test123",
services=[
async_mock.MagicMock(
recipient_keys=[
DIDKey.from_public_key_b58(
"9WCgWKUaAJj3VWxxtzvvMQN3AoFxoBtBDo9ntwJnVVCC",
KeyType.ED25519,
).did
],
routing_keys=[],
service_endpoint="http://localhost",
)
],
requests_attach=[],
)
invi_msg_cls.deserialize.return_value = mock_oob_invi
result = await self.manager.receive_invitation(mock_oob_invi)
assert result == self.test_conn_rec.serialize()
async def test_receive_invitation_connection(self):
self.profile.context.update_settings({"public_invites": True})
oob_invi_rec = await self.manager.create_invitation(
auto_accept=True,
public=False,
hs_protos=[test_module.HSProto.RFC160],
multi_use=False,
)
result = await self.manager.receive_invitation(
invitation=oob_invi_rec.invitation,
use_existing_connection=True,
auto_accept=True,
)
connection_id = UUID(result.connection_id, version=4)
assert (
connection_id.hex == result.connection_id.replace("-", "")
and len(result.connection_id) > 5
)
async def test_receive_invitation_services_with_neither_service_blocks_nor_dids(
self,
):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
test_module, "InvitationMessage", async_mock.MagicMock()
) as invi_msg_cls:
mock_invi_msg = async_mock.MagicMock(
services=[],
)
invi_msg_cls.deserialize.return_value = mock_invi_msg
with self.assertRaises(OutOfBandManagerError):
await self.manager.receive_invitation(mock_invi_msg)
async def test_receive_invitation_services_with_service_did(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
test_module, "DIDXManager", autospec=True
) as didx_mgr_cls, async_mock.patch.object(
test_module, "InvitationMessage", autospec=True
) as invi_msg_cls:
didx_mgr_cls.return_value = async_mock.MagicMock(
receive_invitation=async_mock.CoroutineMock()
)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_did],
requests_attach=[],
)
invi_msg_cls.deserialize.return_value = mock_oob_invi
invi_rec = await self.manager.receive_invitation(mock_oob_invi)
assert invi_rec._invitation.ser["services"]
async def test_receive_invitation_attachment_x(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls:
mock_oob_invi = async_mock.MagicMock(
services=[TestConfig.test_did],
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
requests_attach=[{"having": "attachment", "is": "no", "good": "here"}],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_invitation(mock_oob_invi)
assert "requests~attach is not properly formatted" in str(context.exception)
async def test_receive_invitation_req_pres_v1_0_attachment_x(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls:
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_did],
requests_attach=[
async_mock.MagicMock(
data=async_mock.MagicMock(
json={
"@type": DIDCommPrefix.qualify_current(
PRESENTATION_REQUEST
)
}
)
),
],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_invitation(mock_oob_invi)
assert "requests~attach is not properly formatted" in str(context.exception)
async def test_receive_invitation_invalid_request_type_x(self):
self.profile.context.update_settings({"public_invites": True})
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls:
mock_oob_invi = async_mock.MagicMock(
services=[TestConfig.test_did],
handshake_protocols=[],
requests_attach=[],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError):
await self.manager.receive_invitation(mock_oob_invi)
async def test_find_existing_connection(self):
async with self.profile.session() as session:
test_conn_rec = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_role=None,
state=ConnRecord.State.COMPLETED,
their_public_did=self.their_public_did,
)
await test_conn_rec.save(session)
tag_filter = {}
post_filter = {}
post_filter["their_public_did"] = "not_added"
conn_record = await self.manager.find_existing_connection(
tag_filter, post_filter
)
assert conn_record is None
post_filter["their_public_did"] = self.their_public_did
conn_record = await self.manager.find_existing_connection(
tag_filter, post_filter
)
assert conn_record == test_conn_rec
await test_conn_rec.delete_record(session)
async def test_find_existing_connection_no_active(self):
async with self.profile.session() as session:
self.test_conn_rec.invitation_msg_id = "test_123"
self.test_conn_rec.state = ConnRecord.State.REQUEST.rfc160
await self.test_conn_rec.save(session)
tag_filter = {}
post_filter = {}
post_filter["invitation_msg_id"] = "test_123"
conn_record = await self.manager.find_existing_connection(
tag_filter, post_filter
)
assert conn_record is None
async def test_check_reuse_msg_state(self):
async with self.profile.session() as session:
await self.test_conn_rec.save(session)
await self.test_conn_rec.metadata_set(
session, "reuse_msg_state", "accepted"
)
assert await self.manager.check_reuse_msg_state(self.test_conn_rec) is None
async def test_create_handshake_reuse_msg(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
await self.test_conn_rec.save(session)
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn:
oob_mgr_fetch_conn.return_value = ConnectionTarget(
did=TestConfig.test_did,
endpoint=TestConfig.test_endpoint,
recipient_keys=TestConfig.test_verkey,
sender_key=TestConfig.test_verkey,
)
oob_invi = InvitationMessage()
await self.manager.create_handshake_reuse_message(
oob_invi, self.test_conn_rec
)
assert (
len(await self.test_conn_rec.metadata_get(session, "reuse_msg_id"))
> 6
)
assert (
await self.test_conn_rec.metadata_get(session, "reuse_msg_state")
== "initial"
)
async def test_create_handshake_reuse_msg_catch_exception(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
await self.test_conn_rec.save(session)
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn:
oob_mgr_fetch_conn.side_effect = StorageNotFoundError()
oob_invi = InvitationMessage()
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.create_handshake_reuse_message(
oob_invi, self.test_conn_rec
)
assert "Error on creating and sending a handshake reuse message" in str(
context.exception
)
async def test_receive_reuse_message_existing_found(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
)
reuse_msg = HandshakeReuse()
reuse_msg.assign_thread_id(thid="test_123", pthid="test_123")
self.test_conn_rec.invitation_msg_id = "test_123"
self.test_conn_rec.state = ConnRecord.State.COMPLETED.rfc160
await self.test_conn_rec.save(session)
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
InvitationRecord,
"retrieve_by_tag_filter",
autospec=True,
) as retrieve_invi_rec:
oob_mgr_find_existing_conn.return_value = self.test_conn_rec
oob_mgr_fetch_conn.return_value = ConnectionTarget(
did=TestConfig.test_did,
endpoint=TestConfig.test_endpoint,
recipient_keys=TestConfig.test_verkey,
sender_key=TestConfig.test_verkey,
)
retrieve_invi_rec.return_value = InvitationRecord(
invi_msg_id="test_123"
)
await self.manager.receive_reuse_message(reuse_msg, receipt)
assert (
len(
await ConnRecord.query(
session=session,
tag_filter={},
post_filter_positive={"invitation_msg_id": "test_123"},
alt=True,
)
)
== 1
)
async def test_receive_reuse_message_existing_not_found(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did="test_did",
)
reuse_msg = HandshakeReuse()
reuse_msg.assign_thread_id(thid="test_123", pthid="test_123")
self.test_conn_rec.invitation_msg_id = "test_123"
self.test_conn_rec.state = ConnRecord.State.REQUEST.rfc160
await self.test_conn_rec.save(session)
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
InvitationRecord,
"retrieve_by_tag_filter",
autospec=True,
) as retrieve_invi_rec, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn:
oob_mgr_find_existing_conn.return_value = None
oob_mgr_fetch_conn.return_value = ConnectionTarget(
did=TestConfig.test_did,
endpoint=TestConfig.test_endpoint,
recipient_keys=TestConfig.test_verkey,
sender_key=TestConfig.test_verkey,
)
retrieve_invi_rec.return_value = InvitationRecord(
invi_msg_id="test_123"
)
await self.manager.receive_reuse_message(reuse_msg, receipt)
assert len(self.responder.messages) == 0
async def test_receive_reuse_message_storage_not_found(self):
self.profile.context.update_settings({"public_invites": True})
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did="test_did",
)
reuse_msg = HandshakeReuse()
reuse_msg.assign_thread_id(thid="test_123", pthid="test_123")
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
InvitationRecord,
"retrieve_by_tag_filter",
autospec=True,
) as retrieve_invi_rec, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn:
oob_mgr_find_existing_conn.side_effect = StorageNotFoundError()
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_reuse_message(reuse_msg, receipt)
assert "No existing ConnRecord found for OOB Invitee" in str(
context.exception
)
async def test_receive_reuse_message_problem_report_logic(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did="test_did",
)
reuse_msg = HandshakeReuse()
reuse_msg.assign_thread_id(thid="test_123", pthid="test_123")
self.test_conn_rec.invitation_msg_id = "test_456"
self.test_conn_rec.their_did = "test_did"
self.test_conn_rec.state = ConnRecord.State.COMPLETED.rfc160
await self.test_conn_rec.save(session)
with async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn:
oob_mgr_fetch_conn.return_value = ConnectionTarget(
did=TestConfig.test_did,
endpoint=TestConfig.test_endpoint,
recipient_keys=TestConfig.test_verkey,
sender_key=TestConfig.test_verkey,
)
await self.manager.receive_reuse_message(reuse_msg, receipt)
async def test_receive_reuse_accepted(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did="test_did",
)
reuse_msg_accepted = HandshakeReuseAccept()
reuse_msg_accepted.assign_thread_id(thid="test_123", pthid="test_123")
self.test_conn_rec.invitation_msg_id = "test_123"
self.test_conn_rec.state = ConnRecord.State.COMPLETED.rfc160
await self.test_conn_rec.save(session)
await self.test_conn_rec.metadata_set(session, "reuse_msg_id", "test_123")
await self.test_conn_rec.metadata_set(session, "reuse_msg_state", "initial")
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn:
await self.manager.receive_reuse_accepted_message(
reuse_msg_accepted, receipt, self.test_conn_rec
)
assert (
await self.test_conn_rec.metadata_get(session, "reuse_msg_state")
== "accepted"
)
async def test_receive_reuse_accepted_invalid_conn(self):
self.profile.context.update_settings({"public_invites": True})
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did="test_did",
)
reuse_msg_accepted = HandshakeReuseAccept()
reuse_msg_accepted.assign_thread_id(thid="test_123", pthid="test_123")
test_invalid_conn = ConnRecord(
my_did="Test",
their_did="Test",
invitation_msg_id="test_456",
connection_id="12345678-0123-4567-1234-567812345678",
)
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn:
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_reuse_accepted_message(
reuse_msg_accepted, receipt, test_invalid_conn
)
assert "Error processing reuse accepted message" in str(context.exception)
async def test_receive_reuse_accepted_message_catch_exception(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did="test_did",
)
reuse_msg_accepted = HandshakeReuseAccept()
reuse_msg_accepted.assign_thread_id(thid="test_123", pthid="test_123")
self.test_conn_rec.invitation_msg_id = "test_123"
self.test_conn_rec.state = ConnRecord.State.COMPLETED.rfc160
await self.test_conn_rec.save(session)
await self.test_conn_rec.metadata_set(session, "reuse_msg_id", "test_123")
await self.test_conn_rec.metadata_set(session, "reuse_msg_state", "initial")
with async_mock.patch.object(
self.test_conn_rec,
"metadata_set",
async_mock.CoroutineMock(side_effect=StorageNotFoundError),
):
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_reuse_accepted_message(
reuse_msg_accepted, receipt, self.test_conn_rec
)
assert "Error processing reuse accepted message" in str(
context.exception
)
async def test_problem_report_received_not_active(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did="test_did",
)
problem_report = ProblemReport(
description={
"en": "test",
"code": ProblemReportReason.EXISTING_CONNECTION_NOT_ACTIVE.value,
}
)
problem_report.assign_thread_id(thid="test_123", pthid="test_123")
self.test_conn_rec.invitation_msg_id = "test_123"
self.test_conn_rec.state = ConnRecord.State.COMPLETED.rfc160
await self.test_conn_rec.save(session)
await self.test_conn_rec.metadata_set(session, "reuse_msg_id", "test_123")
await self.test_conn_rec.metadata_set(session, "reuse_msg_state", "initial")
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn:
await self.manager.receive_problem_report(
problem_report, receipt, self.test_conn_rec
)
assert (
await self.test_conn_rec.metadata_get(session, "reuse_msg_state")
== "not_accepted"
)
async def test_problem_report_received_not_exists(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did="test_did",
)
problem_report = ProblemReport(
description={
"en": "test",
"code": ProblemReportReason.NO_EXISTING_CONNECTION.value,
}
)
problem_report.assign_thread_id(thid="test_123", pthid="test_123")
self.test_conn_rec.invitation_msg_id = "test_123"
self.test_conn_rec.state = ConnRecord.State.COMPLETED.rfc160
await self.test_conn_rec.save(session)
await self.test_conn_rec.metadata_set(session, "reuse_msg_id", "test_123")
await self.test_conn_rec.metadata_set(session, "reuse_msg_state", "initial")
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn:
await self.manager.receive_problem_report(
problem_report, receipt, self.test_conn_rec
)
assert (
await self.test_conn_rec.metadata_get(session, "reuse_msg_state")
== "not_accepted"
)
async def test_problem_report_received_invalid_conn(self):
self.profile.context.update_settings({"public_invites": True})
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did="test_did",
)
problem_report = ProblemReport(
description={
"en": "test",
"code": ProblemReportReason.NO_EXISTING_CONNECTION.value,
}
)
problem_report.assign_thread_id(thid="test_123", pthid="test_123")
test_invalid_conn = ConnRecord(
my_did="Test",
their_did="Test",
invitation_msg_id="test_456",
connection_id="12345678-0123-4567-1234-567812345678",
)
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn:
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_problem_report(
problem_report, receipt, test_invalid_conn
)
assert "Error processing problem report message" in str(context.exception)
async def test_existing_conn_record_public_did(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report:
oob_mgr_find_existing_conn.return_value = test_exist_conn
oob_mgr_check_reuse_state.return_value = None
oob_mgr_create_reuse_msg.return_value = None
oob_mgr_receive_reuse_msg.return_value = None
oob_mgr_receive_accept_msg.return_value = None
oob_mgr_receive_problem_report.return_value = None
await test_exist_conn.metadata_set(
session, "reuse_msg_state", "accepted"
)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
result = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
retrieved_conn_records = await ConnRecord.query(
session=session,
tag_filter={},
post_filter_positive={
"invitation_msg_id": "12345678-0123-4567-1234-567812345678"
},
alt=True,
)
assert (
await retrieved_conn_records[0].metadata_get(
session, "reuse_msg_id"
)
is None
)
assert (
await retrieved_conn_records[0].metadata_get(
session, "reuse_msg_state"
)
is None
)
assert result.connection_id == retrieved_conn_records[0].connection_id
async def test_existing_conn_record_public_did_not_accepted(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did="did:sov:LjgpST2rjsoxYegQDRm7EL",
their_public_did="did:sov:LjgpST2rjsoxYegQDRm7EL",
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
test_new_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did="did:sov:LjgpST2rjsoxYegQDRm7EL",
their_public_did="did:sov:LjgpST2rjsoxYegQDRm7EL",
invitation_msg_id="12345678-0123-4567-1234-1234545454487",
their_role=ConnRecord.Role.REQUESTER,
)
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report:
oob_mgr_find_existing_conn.return_value = test_exist_conn
oob_mgr_check_reuse_state.return_value = None
oob_mgr_create_reuse_msg.return_value = None
oob_mgr_receive_reuse_msg.return_value = None
oob_mgr_receive_accept_msg.return_value = None
oob_mgr_receive_problem_report.return_value = None
await test_exist_conn.metadata_set(
session, "reuse_msg_state", "not_accepted"
)
didx_mgr_receive_invitation.return_value = test_new_conn
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
result = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
retrieved_conn_records = await ConnRecord.query(
session=session,
tag_filter={},
post_filter_positive={
"invitation_msg_id": "12345678-0123-4567-1234-567812345678"
},
alt=True,
)
assert (
await retrieved_conn_records[0].metadata_get(
session, "reuse_msg_state"
)
== "not_accepted"
)
assert result.connection_id != retrieved_conn_records[0].connection_id

async def test_existing_conn_record_public_did_inverse_cases(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await self.test_conn_rec.save(session)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state:
oob_mgr_find_existing_conn.return_value = test_exist_conn
didx_mgr_receive_invitation.return_value = self.test_conn_rec
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
result = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=False
)
retrieved_conn_records = await ConnRecord.query(
session=session,
tag_filter={},
post_filter_positive={
"invitation_msg_id": "12345678-0123-4567-1234-567812345678"
},
alt=True,
)
assert result.connection_id != retrieved_conn_records[0].connection_id

async def test_existing_conn_record_public_did_timeout(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state:
oob_mgr_find_existing_conn.return_value = test_exist_conn
oob_mgr_check_reuse_state.side_effect = asyncio.TimeoutError
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
retrieved_conn_records = await ConnRecord.query(
session=session,
tag_filter={},
post_filter_positive={
"their_public_did": TestConfig.test_target_did
},
alt=True,
)
assert (
retrieved_conn_records[0].state == ConnRecord.State.ABANDONED.rfc160
)

async def test_existing_conn_record_public_did_timeout_no_handshake_protocol(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn:
oob_mgr_find_existing_conn.return_value = test_exist_conn
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[],
services=[TestConfig.test_target_did],
requests_attach=[
{"having": "attachment", "is": "no", "good": "here"}
],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=False
)
assert "No existing connection exists and " in str(context.exception)

async def test_req_v1_attach_presentation_existing_conn_no_auto_present(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
exchange_rec = V10PresentationExchange()
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch.object(
PresentationManager, "receive_request", autospec=True
) as pres_mgr_receive_request, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report:
oob_mgr_find_existing_conn.return_value = test_exist_conn
pres_mgr_receive_request.return_value = exchange_rec
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[
AttachDecorator.deserialize(TestConfig.req_attach_v1)
],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert "Configuration sets auto_present false" in str(context.exception)

async def test_req_v1_attach_presentation_existing_conn_auto_present_pres_msg(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_presentation_request": True}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
exchange_rec = V10PresentationExchange()
exchange_rec.auto_present = True
exchange_rec.presentation_request = TestConfig.INDY_PROOF_REQ
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
PresentationManager,
"receive_request",
autospec=True,
) as pres_mgr_receive_request, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report, async_mock.patch.object(
PresentationManager,
"create_presentation",
autospec=True,
) as pres_mgr_create_presentation:
oob_mgr_find_existing_conn.return_value = test_exist_conn
pres_mgr_receive_request.return_value = exchange_rec
pres_mgr_create_presentation.return_value = (
exchange_rec,
Presentation(
presentations_attach=[
AttachDecorator.data_base64({"bogus": "proof"})
]
),
)
holder = async_mock.MagicMock(IndyHolder, autospec=True)
get_creds = async_mock.CoroutineMock(
return_value=(
{
"cred_info": {"referent": "dummy_reft"},
"attrs": {
"player": "Richie Knucklez",
"screenCapture": "aW1hZ2luZSBhIHNjcmVlbiBjYXB0dXJl",
"highScore": "1234560",
},
},
)
)
holder.get_credentials_for_presentation_request_by_referent = get_creds
holder.create_credential_request = async_mock.CoroutineMock(
return_value=(
json.dumps(TestConfig.indy_cred_req),
json.dumps(TestConfig.cred_req_meta),
)
)
self.profile.context.injector.bind_instance(IndyHolder, holder)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[
AttachDecorator.deserialize(TestConfig.req_attach_v1)
],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
conn_rec = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert conn_rec is not None

async def test_req_v1_attach_pres_catch_value_error(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_presentation_request": True}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
exchange_rec = V10PresentationExchange()
exchange_rec.auto_present = True
exchange_rec.presentation_request = TestConfig.INDY_PROOF_REQ
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
PresentationManager,
"receive_request",
autospec=True,
) as pres_mgr_receive_request, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report, async_mock.patch.object(
PresentationManager,
"create_presentation",
autospec=True,
) as pres_mgr_create_presentation:
oob_mgr_find_existing_conn.return_value = test_exist_conn
pres_mgr_receive_request.return_value = exchange_rec
pres_mgr_create_presentation.return_value = (
exchange_rec,
Presentation(comment="this is a test"),
)
holder = async_mock.MagicMock(IndyHolder, autospec=True)
get_creds = async_mock.CoroutineMock(return_value=())
holder.get_credentials_for_presentation_request_by_referent = get_creds
holder.create_credential_request = async_mock.CoroutineMock(
return_value=(
json.dumps(TestConfig.indy_cred_req),
json.dumps(TestConfig.cred_req_meta),
)
)
self.profile.context.injector.bind_instance(IndyHolder, holder)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[
AttachDecorator.deserialize(TestConfig.req_attach_v1)
],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert "Cannot auto-respond" in str(context.exception)

async def test_req_v2_attach_presentation_existing_conn_no_auto_present(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
px2_rec = test_module.V20PresExRecord()
with async_mock.patch.object(
DIDXManager, "receive_invitation", autospec=True
) as didx_mgr_receive_invitation, async_mock.patch.object(
V20PresManager, "receive_pres_request", autospec=True
) as pres_mgr_receive_pres_req, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report:
oob_mgr_find_existing_conn.return_value = test_exist_conn
pres_mgr_receive_pres_req.return_value = px2_rec
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[
AttachDecorator.deserialize(TestConfig.req_attach_v2)
],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert (
"Configuration sets auto_present false: cannot respond automatically to presentation requests"
== str(context.exception)
)

async def test_req_v2_attach_presentation_existing_conn_auto_present_pres_msg(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_presentation_request": True}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
px2_rec = test_module.V20PresExRecord(
auto_present=True,
pres_request=TestConfig.PRES_REQ_V2.serialize(),
)
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
V20PresManager,
"receive_pres_request",
autospec=True,
) as pres_mgr_receive_pres_req, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report, async_mock.patch.object(
V20PresManager,
"create_pres",
autospec=True,
) as pres_mgr_create_pres:
oob_mgr_find_existing_conn.return_value = test_exist_conn
pres_mgr_receive_pres_req.return_value = px2_rec
pres_mgr_create_pres.return_value = (
px2_rec,
V20Pres(
formats=[
V20PresFormat(
attach_id="indy",
format_=V20_PRES_ATTACH_FORMAT[PRES_20][
V20PresFormat.Format.INDY.api
],
)
],
presentations_attach=[
AttachDecorator.data_base64(
mapping={"bogus": "proof"},
ident="indy",
)
],
),
)
holder = async_mock.MagicMock(IndyHolder, autospec=True)
get_creds = async_mock.CoroutineMock(
return_value=(
{
"cred_info": {"referent": "dummy_reft"},
"attrs": {
"player": "Richie Knucklez",
"screenCapture": "aW1hZ2luZSBhIHNjcmVlbiBjYXB0dXJl",
"highScore": "1234560",
},
},
)
)
holder.get_credentials_for_presentation_request_by_referent = get_creds
holder.create_credential_request = async_mock.CoroutineMock(
return_value=(
json.dumps(TestConfig.indy_cred_req),
json.dumps(TestConfig.cred_req_meta),
)
)
self.profile.context.injector.bind_instance(IndyHolder, holder)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[
AttachDecorator.deserialize(TestConfig.req_attach_v2)
],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
conn_rec = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert conn_rec is not None

async def test_req_v2_attach_pres_catch_value_error(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_presentation_request": False}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
px2_rec = test_module.V20PresExRecord(
auto_present=False,
pres_request=TestConfig.PRES_REQ_V2.serialize(),
)
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
V20PresManager,
"receive_pres_request",
autospec=True,
) as pres_mgr_receive_pres_req, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report, async_mock.patch.object(
V20PresManager,
"create_pres",
autospec=True,
) as pres_mgr_create_pres:
oob_mgr_find_existing_conn.return_value = test_exist_conn
pres_mgr_receive_pres_req.return_value = px2_rec
pres_mgr_create_pres.return_value = (
px2_rec,
V20Pres(
formats=[
V20PresFormat(
attach_id="indy",
format_=V20_PRES_ATTACH_FORMAT[PRES_20][
V20PresFormat.Format.INDY.api
],
)
],
presentations_attach=[
AttachDecorator.data_base64(
mapping={"bogus": "proof"},
ident="indy",
)
],
),
)
holder = async_mock.MagicMock(IndyHolder, autospec=True)
get_creds = async_mock.CoroutineMock(return_value=())
holder.get_credentials_for_presentation_request_by_referent = get_creds
holder.create_credential_request = async_mock.CoroutineMock(
return_value=(
json.dumps(TestConfig.indy_cred_req),
json.dumps(TestConfig.cred_req_meta),
)
)
self.profile.context.injector.bind_instance(IndyHolder, holder)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[
AttachDecorator.deserialize(TestConfig.req_attach_v2)
],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert "cannot respond automatically" in str(context.exception)

async def test_req_attach_cred_offer_v1(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_credential_offer": True}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
state=ConnRecord.State.COMPLETED,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
req_attach = deepcopy(TestConfig.req_attach_v1)
del req_attach["data"]["json"]
req_attach["data"]["json"] = TestConfig.CRED_OFFER_V1.serialize()
exchange_rec = V10CredentialExchange()
exchange_rec.credential_offer = TestConfig.CRED_OFFER_V1
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
V10CredManager,
"receive_offer",
autospec=True,
) as cred_mgr_offer_receive, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"conn_rec_is_active",
autospec=True,
) as oob_mgr_check_conn_rec_active, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report, async_mock.patch.object(
V10CredManager,
"create_request",
autospec=True,
) as cred_mgr_request_receive:
oob_mgr_find_existing_conn.return_value = test_exist_conn
oob_mgr_check_conn_rec_active.return_value = test_exist_conn
cred_mgr_offer_receive.return_value = exchange_rec
cred_mgr_request_receive.return_value = (exchange_rec, INDY_CRED_REQ)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[AttachDecorator.deserialize(req_attach)],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
conn_rec = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert conn_rec is not None

async def test_req_attach_cred_offer_v1_no_issue(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_credential_offer": False}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
state=ConnRecord.State.COMPLETED,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
req_attach = deepcopy(TestConfig.req_attach_v1)
del req_attach["data"]["json"]
req_attach["data"]["json"] = TestConfig.CRED_OFFER_V1.serialize()
exchange_rec = V10CredentialExchange()
exchange_rec.credential_offer = TestConfig.CRED_OFFER_V1
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
V10CredManager,
"receive_offer",
autospec=True,
) as cred_mgr_offer_receive, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"conn_rec_is_active",
autospec=True,
) as oob_mgr_check_conn_rec_active, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report:
oob_mgr_find_existing_conn.return_value = test_exist_conn
cred_mgr_offer_receive.return_value = exchange_rec
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[AttachDecorator.deserialize(req_attach)],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert "Configuration sets auto_offer false" in str(context.exception)
async def test_req_attach_cred_offer_v2(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_credential_offer": True}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
state=ConnRecord.State.COMPLETED,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
req_attach = deepcopy(TestConfig.req_attach_v1)
del req_attach["data"]["json"]
req_attach["data"]["json"] = TestConfig.CRED_OFFER_V2.serialize()
exchange_rec = V20CredExRecord()
exchange_rec.cred_offer = TestConfig.CRED_OFFER_V2
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
V20CredManager,
"receive_offer",
autospec=True,
) as cred_mgr_offer_receive, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"conn_rec_is_active",
autospec=True,
) as oob_mgr_check_conn_rec_active, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report, async_mock.patch.object(
V20CredManager,
"create_request",
autospec=True,
) as cred_mgr_request_receive:
oob_mgr_find_existing_conn.return_value = test_exist_conn
oob_mgr_check_conn_rec_active.return_value = test_exist_conn
cred_mgr_offer_receive.return_value = exchange_rec
cred_mgr_request_receive.return_value = (exchange_rec, INDY_CRED_REQ)
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[AttachDecorator.deserialize(req_attach)],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
conn_rec = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert conn_rec is not None
async def test_req_attach_cred_offer_v2_no_issue(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_credential_offer": False}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
state=ConnRecord.State.COMPLETED,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
req_attach = deepcopy(TestConfig.req_attach_v1)
del req_attach["data"]["json"]
req_attach["data"]["json"] = TestConfig.CRED_OFFER_V2.serialize()
exchange_rec = V20CredExRecord()
exchange_rec.cred_offer = TestConfig.CRED_OFFER_V2
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
V20CredManager,
"receive_offer",
autospec=True,
) as cred_mgr_offer_receive, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"conn_rec_is_active",
autospec=True,
) as oob_mgr_check_conn_rec_active, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report:
oob_mgr_find_existing_conn.return_value = test_exist_conn
cred_mgr_offer_receive.return_value = exchange_rec
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[AttachDecorator.deserialize(req_attach)],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert "Configuration sets auto_offer false" in str(context.exception)
async def test_catch_unsupported_request_attach(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_credential_offer": False}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
req_attach = deepcopy(TestConfig.req_attach_v1)
del req_attach["data"]["json"]
req_attach["data"]["json"] = TestConfig.CRED_OFFER_V1.serialize()
req_attach["data"]["json"]["@type"] = "test"
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"conn_rec_is_active",
autospec=True,
) as oob_mgr_check_conn_rec_active, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report:
oob_mgr_find_existing_conn.return_value = test_exist_conn
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[AttachDecorator.deserialize(req_attach)],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
with self.assertRaises(OutOfBandManagerError) as context:
await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
assert "Unsupported requests~attach type" in str(context.exception)
async def test_check_conn_rec_active_a(self):
async with self.profile.session() as session:
await self.test_conn_rec.save(session)
conn_rec = await self.manager.conn_rec_is_active(
self.test_conn_rec.connection_id
)
assert conn_rec.connection_id == self.test_conn_rec.connection_id
async def test_check_conn_rec_active_b(self):
connection_id = self.test_conn_rec.connection_id
conn_rec_request = deepcopy(self.test_conn_rec)
conn_rec_request.state = "request"
conn_rec_active = deepcopy(self.test_conn_rec)
conn_rec_active.state = "active"
with async_mock.patch.object(
test_module.ConnRecord,
"retrieve_by_id",
autospec=True,
) as mock_conn_rec_retrieve:
mock_conn_rec_retrieve.side_effect = [conn_rec_request, conn_rec_active]
conn_rec = await self.manager.conn_rec_is_active(connection_id)
assert conn_rec.state == "active"
async def test_request_attach_cred_offer_v1_check_conn_rec_active_timeout(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_credential_offer": True}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
req_attach = deepcopy(TestConfig.req_attach_v1)
del req_attach["data"]["json"]
req_attach["data"]["json"] = TestConfig.CRED_OFFER_V1.serialize()
exchange_rec = V20CredExRecord()
exchange_rec.cred_offer = TestConfig.CRED_OFFER_V1
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
), async_mock.patch.object(
V10CredManager,
"receive_offer",
autospec=True,
) as cred_mgr_offer_receive, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
), async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
), async_mock.patch.object(
OutOfBandManager,
"conn_rec_is_active",
autospec=True,
) as oob_mgr_check_conn_rec_active, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
), async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
), async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
), async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
), async_mock.patch.object(
V10CredManager,
"create_request",
autospec=True,
) as cred_mgr_request_receive, async_mock.patch.object(
test_module.LOGGER, "warning", async_mock.MagicMock()
) as mock_logger_warning:
oob_mgr_find_existing_conn.return_value = test_exist_conn
cred_mgr_offer_receive.return_value = exchange_rec
cred_mgr_request_receive.return_value = (exchange_rec, INDY_CRED_REQ)
oob_mgr_check_conn_rec_active.side_effect = asyncio.TimeoutError
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[AttachDecorator.deserialize(req_attach)],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
conn_rec = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
mock_logger_warning.assert_called_once()
assert conn_rec is not None
async def test_request_attach_cred_offer_v2_check_conn_rec_active_timeout(self):
async with self.profile.session() as session:
self.profile.context.update_settings({"public_invites": True})
self.profile.context.update_settings(
{"debug.auto_respond_credential_offer": True}
)
test_exist_conn = ConnRecord(
my_did=TestConfig.test_did,
their_did=TestConfig.test_target_did,
their_public_did=TestConfig.test_target_did,
invitation_msg_id="12345678-0123-4567-1234-567812345678",
their_role=ConnRecord.Role.REQUESTER,
)
await test_exist_conn.save(session)
await test_exist_conn.metadata_set(session, "reuse_msg_state", "initial")
await test_exist_conn.metadata_set(session, "reuse_msg_id", "test_123")
receipt = MessageReceipt(
recipient_did=TestConfig.test_did,
recipient_did_public=False,
sender_did=TestConfig.test_target_did,
)
req_attach = deepcopy(TestConfig.req_attach_v1)
del req_attach["data"]["json"]
req_attach["data"]["json"] = TestConfig.CRED_OFFER_V2.serialize()
exchange_rec = V20CredExRecord()
exchange_rec.cred_offer = TestConfig.CRED_OFFER_V2
with async_mock.patch.object(
DIDXManager,
"receive_invitation",
autospec=True,
) as didx_mgr_receive_invitation, async_mock.patch.object(
V20CredManager,
"receive_offer",
autospec=True,
) as cred_mgr_offer_receive, async_mock.patch(
"aries_cloudagent.protocols.out_of_band.v1_0.manager.InvitationMessage",
autospec=True,
) as inv_message_cls, async_mock.patch.object(
OutOfBandManager,
"fetch_connection_targets",
autospec=True,
) as oob_mgr_fetch_conn, async_mock.patch.object(
OutOfBandManager,
"find_existing_connection",
autospec=True,
) as oob_mgr_find_existing_conn, async_mock.patch.object(
OutOfBandManager,
"check_reuse_msg_state",
autospec=True,
) as oob_mgr_check_reuse_state, async_mock.patch.object(
OutOfBandManager,
"conn_rec_is_active",
autospec=True,
) as oob_mgr_check_conn_rec_active, async_mock.patch.object(
OutOfBandManager,
"create_handshake_reuse_message",
autospec=True,
) as oob_mgr_create_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_message",
autospec=True,
) as oob_mgr_receive_reuse_msg, async_mock.patch.object(
OutOfBandManager,
"receive_reuse_accepted_message",
autospec=True,
) as oob_mgr_receive_accept_msg, async_mock.patch.object(
OutOfBandManager,
"receive_problem_report",
autospec=True,
) as oob_mgr_receive_problem_report, async_mock.patch.object(
V20CredManager,
"create_request",
autospec=True,
) as cred_mgr_request_receive, async_mock.patch.object(
test_module.LOGGER, "warning", async_mock.MagicMock()
) as mock_logger_warning:
oob_mgr_find_existing_conn.return_value = test_exist_conn
cred_mgr_offer_receive.return_value = exchange_rec
cred_mgr_request_receive.return_value = (exchange_rec, INDY_CRED_REQ)
oob_mgr_check_conn_rec_active.side_effect = asyncio.TimeoutError
mock_oob_invi = async_mock.MagicMock(
handshake_protocols=[
pfx.qualify(HSProto.RFC23.name) for pfx in DIDCommPrefix
],
services=[TestConfig.test_target_did],
requests_attach=[AttachDecorator.deserialize(req_attach)],
)
inv_message_cls.deserialize.return_value = mock_oob_invi
conn_rec = await self.manager.receive_invitation(
mock_oob_invi, use_existing_connection=True
)
mock_logger_warning.assert_called_once()
assert conn_rec is not None
import torch
import torch.nn as nn
import torch.nn.functional as F
from easydict import EasyDict as edict
from models import extended_resnet
class Upsample(nn.Module):
def __init__(self, inplanes, planes):
super(Upsample, self).__init__()
self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=5, padding=2)
self.bn = nn.BatchNorm2d(planes)
def forward(self, x, size):
x = F.interpolate(x, size=size, mode="bilinear", align_corners=False)  # F.upsample is deprecated
x = self.conv1(x)
x = self.bn(x)
return x
class Fusion(nn.Module):
def __init__(self, inplanes):
super(Fusion, self).__init__()
self.conv = nn.Conv2d(inplanes, inplanes, kernel_size=1)
self.bn = nn.BatchNorm2d(inplanes)
self.relu = nn.ReLU()
#self.dropout = nn.Dropout(.1)
def forward(self, x1, x2):
out = self.bn(self.conv(x1)) + x2
out = self.relu(out)
return out
class Fusion2(nn.Module):
def __init__(self, in_ch, out_ch):
super(Fusion2, self).__init__()
self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=1)
self.bn = nn.BatchNorm2d(out_ch)
self.relu = nn.ReLU()
self.dropout = nn.Dropout(.1)
def forward(self, x1, x2):
out = self.bn(self.conv(x1)) + x2
out = self.dropout(self.relu(out))
return out
class Discriminator(nn.Module):
def __init__(self):
super(Discriminator, self).__init__()
self.l1 = self._conv_bn_relu_dropout(2048)
self.l2 = self._conv_bn_relu_dropout(256)
self.l3 = self._conv_bn_relu_dropout(32)
self.fc1 = nn.Linear(2048, 512)
self.bn1 = nn.BatchNorm1d(512)
self.fc2 = nn.Linear(512, 512)
self.bn2 = nn.BatchNorm1d(512)
self.fc3 = nn.Linear(512, 2)
def _conv_bn_relu_dropout(self, inplanes):
return nn.Sequential(
nn.Conv2d(inplanes, inplanes // 8, 3, padding=1, bias=False),
nn.BatchNorm2d(inplanes // 8),
nn.ReLU(inplace=True),
nn.Dropout(.1),
)
def forward(self, x):
h = self.l1(x) # [1, 2048, 16, 32] -> [1, 256, 16, 32]
h = self.l2(h) # [1, 256, 16, 32] -> [1, 32, 16, 32]
h = self.l3(h) # [1, 32, 16, 32] -> [1, 4, 16, 32]
h = h.view(h.size(0), -1) # [1, 4, 16, 32] -> [1, 2048]
out = F.relu(self.bn1(self.fc1(h)))
out = F.relu(self.bn2(self.fc2(out)))
out = self.fc3(out)
return out
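A quick sanity check (plain Python, shapes taken from the comments in `forward` above) of why `fc1` takes 2048 inputs: each `_conv_bn_relu_dropout` block divides the channel count by 8 while the 3x3/pad-1 convs keep the 16x32 spatial size, so the flattened feature count after three blocks is `channels * H * W`.

```python
# Channel/spatial bookkeeping for Discriminator, assuming a 2048x16x32 input
# as noted in the forward() comments above.
channels, h, w = 2048, 16, 32
for _ in range(3):      # l1, l2, l3
    channels //= 8      # 2048 -> 256 -> 32 -> 4
flat_features = channels * h * w
print(flat_features)    # 2048, matching nn.Linear(2048, 512)
```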
class ResFCN(nn.Module):
"""
img_size: torch.Size([512, 1024])
conv_x: torch.Size([1, 64, 256, 512])
pool_x: torch.Size([1, 64, 128, 256])
fm2: torch.Size([1, 512, 64, 128])
fm3: torch.Size([1, 1024, 32, 64])
fm4: torch.Size([1, 2048, 16, 32])
"""
def __init__(self, num_classes, layer='50', input_ch=3):
super(ResFCN, self).__init__()
self.num_classes = num_classes
print ('resnet' + layer)
if layer == '18':
resnet = extended_resnet.resnet18(pretrained=True, input_ch=input_ch)
elif layer == '34':
resnet = extended_resnet.resnet34(pretrained=True, input_ch=input_ch)
elif layer == '50':
resnet = extended_resnet.resnet50(pretrained=True, input_ch=input_ch)
elif layer == '101':
resnet = extended_resnet.resnet101(pretrained=True, input_ch=input_ch)
elif layer == '152':
resnet = extended_resnet.resnet152(pretrained=True, input_ch=input_ch)
else:
raise NotImplementedError
self.conv1 = resnet.conv1
self.bn0 = resnet.bn1
self.relu = resnet.relu
self.maxpool = resnet.maxpool
self.layer1 = resnet.layer1
self.layer2 = resnet.layer2
self.layer3 = resnet.layer3
self.layer4 = resnet.layer4
self.upsample1 = Upsample(2048, 1024)
self.upsample2 = Upsample(1024, 512)
self.upsample3 = Upsample(512, 64)
self.upsample4 = Upsample(64, 64)
self.upsample5 = Upsample(64, 32)
self.fs1 = Fusion(1024)
self.fs2 = Fusion(512)
self.fs3 = Fusion(256)
self.fs4 = Fusion(64)
self.fs5 = Fusion(64)
self.out5 = self._classifier(32)
self.transformer = nn.Conv2d(256, 64, kernel_size=1)
def forward(self, x):
input_size = x.size()
x = self.conv1(x)
x = self.bn0(x)
x = self.relu(x)
conv_x = x
x = self.maxpool(x)
pool_x = x
fm1 = self.layer1(x)
fm2 = self.layer2(fm1)
fm3 = self.layer3(fm2)
fm4 = self.layer4(fm3)
fsfm1 = self.fs1(fm3, self.upsample1(fm4, fm3.size()[2:]))
fsfm2 = self.fs2(fm2, self.upsample2(fsfm1, fm2.size()[2:]))
fsfm3 = self.fs4(pool_x, self.upsample3(fsfm2, pool_x.size()[2:]))
fsfm4 = self.fs5(conv_x, self.upsample4(fsfm3, conv_x.size()[2:]))
fsfm5 = self.upsample5(fsfm4, input_size[2:])
out = self.out5(fsfm5)
return out
def _classifier(self, inplanes):
if inplanes == 32:
return nn.Sequential(
nn.Conv2d(inplanes, self.num_classes, 1),
nn.Conv2d(self.num_classes, self.num_classes,
kernel_size=3, padding=1)
)
return nn.Sequential(
nn.Conv2d(inplanes, inplanes // 2, 3, padding=1, bias=False),
nn.BatchNorm2d(inplanes // 2),
nn.ReLU(inplace=True),
nn.Dropout(.1),
nn.Conv2d(inplanes // 2, self.num_classes, 1),
)
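The feature-map sizes quoted in ResFCN's docstring follow directly from the ResNet backbone strides: `conv1` halves the 512x1024 input, `maxpool` halves it again, and `layer2`-`layer4` each halve once more (`layer1` keeps the /4 resolution). A minimal sketch, with a hypothetical helper `pyramid` for illustration:

```python
# Compute the backbone feature-map pyramid from the input size and the
# per-stage stride-2 downsamplings described above.
def pyramid(h, w):
    conv_x = (h // 2, w // 2)    # after conv1 (stride 2)
    pool_x = (h // 4, w // 4)    # after maxpool
    fm2 = (h // 8, w // 8)       # after layer2
    fm3 = (h // 16, w // 16)     # after layer3
    fm4 = (h // 32, w // 32)     # after layer4
    return conv_x, pool_x, fm2, fm3, fm4

print(pyramid(512, 1024))
# ((256, 512), (128, 256), (64, 128), (32, 64), (16, 32)) -- matches the docstring
```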
class ResBase(nn.Module):
def __init__(self, num_classes, layer='50', input_ch=3):
super(ResBase, self).__init__()
self.num_classes = num_classes
print ('resnet' + layer)
if layer == '18':
resnet = extended_resnet.resnet18(pretrained=True, input_ch=input_ch)
elif layer == '50':
resnet = extended_resnet.resnet50(pretrained=True, input_ch=input_ch)
elif layer == '101':
resnet = extended_resnet.resnet101(pretrained=True, input_ch=input_ch)
elif layer == '152':
resnet = extended_resnet.resnet152(pretrained=True, input_ch=input_ch)
else:
raise NotImplementedError
self.conv1 = resnet.conv1
self.bn0 = resnet.bn1
self.relu = resnet.relu
self.maxpool = resnet.maxpool
self.layer1 = resnet.layer1
self.layer2 = resnet.layer2
self.layer3 = resnet.layer3
self.layer4 = resnet.layer4
def forward(self, x):
img_size = x.size()[2:]
x = self.conv1(x)
x = self.bn0(x)
x = self.relu(x)
conv_x = x
x = self.maxpool(x)
pool_x = x
fm1 = self.layer1(x)
fm2 = self.layer2(fm1)
fm3 = self.layer3(fm2)
fm4 = self.layer4(fm3)
out_dic = {
"img_size": img_size,
"conv_x": conv_x,
"pool_x": pool_x,
"fm2": fm2,
"fm3": fm3,
"fm4": fm4
}
return out_dic
class ResClassifier(nn.Module):
def __init__(self, num_classes):
super(ResClassifier, self).__init__()
self.num_classes = num_classes
self.upsample1 = Upsample(2048, 1024)
self.upsample2 = Upsample(1024, 512)
self.upsample3 = Upsample(512, 64)
self.upsample4 = Upsample(64, 64)
self.upsample5 = Upsample(64, 32)
self.fs1 = Fusion(1024)
self.fs2 = Fusion(512)
self.fs3 = Fusion(256)
self.fs4 = Fusion(64)
self.fs5 = Fusion(64)
self.out5 = self._classifier(32)
def _classifier(self, inplanes):
if inplanes == 32:
return nn.Sequential(
nn.Conv2d(inplanes, self.num_classes, 1),
nn.Conv2d(self.num_classes, self.num_classes,
kernel_size=3, padding=1)
)
return nn.Sequential(
nn.Conv2d(inplanes, inplanes // 2, 3, padding=1, bias=False),
nn.BatchNorm2d(inplanes // 2),
nn.ReLU(inplace=True),
#nn.Dropout(.1),
nn.Conv2d(inplanes // 2, self.num_classes, 1),
)
def forward(self, gen_out_dic):
gen_out_dic = edict(gen_out_dic)
fsfm1 = self.fs1(gen_out_dic.fm3, self.upsample1(gen_out_dic.fm4, gen_out_dic.fm3.size()[2:]))
fsfm2 = self.fs2(gen_out_dic.fm2, self.upsample2(fsfm1, gen_out_dic.fm2.size()[2:]))
fsfm3 = self.fs4(gen_out_dic.pool_x, self.upsample3(fsfm2, gen_out_dic.pool_x.size()[2:]))
fsfm4 = self.fs5(gen_out_dic.conv_x, self.upsample4(fsfm3, gen_out_dic.conv_x.size()[2:]))
fsfm5 = self.upsample5(fsfm4, gen_out_dic.img_size)
out = self.out5(fsfm5)
return out
class zzResBase(nn.Module):
def __init__(self, num_classes, layer='50', input_ch=3):
super(zzResBase, self).__init__()
self.num_classes = num_classes
print ('resnet' + layer)
if layer == '18':
resnet = extended_resnet.resnet18(pretrained=True, input_ch=input_ch)
elif layer == '50':
resnet = extended_resnet.resnet50(pretrained=True, input_ch=input_ch)
elif layer == '101':
resnet = extended_resnet.resnet101(pretrained=True, input_ch=input_ch)
elif layer == '152':
resnet = extended_resnet.resnet152(pretrained=True, input_ch=input_ch)
else:
raise NotImplementedError
self.conv1 = resnet.conv1
self.bn0 = resnet.bn1
self.relu = resnet.relu
self.maxpool = resnet.maxpool
self.layer1 = resnet.layer1
self.layer2 = resnet.layer2
self.layer3 = resnet.layer3
self.layer4 = resnet.layer4
def forward(self, x):
img_size = x.size()[2:]
x = self.conv1(x)
x = self.bn0(x)
x = self.relu(x)
conv_x = x
x = self.maxpool(x)
pool_x = x
fm1 = self.layer1(x)
fm2 = self.layer2(fm1)
fm3 = self.layer3(fm2)
fm4 = self.layer4(fm3)
return conv_x, pool_x, fm1, fm2, fm3, fm4
class zzResClassifier(nn.Module):
def __init__(self, num_classes):
super(zzResClassifier, self).__init__()
self.num_classes = num_classes
self.upsample1 = Upsample(2048, 1024)
self.upsample2 = Upsample(1024, 512)
self.upsample3 = Upsample(512, 64)
self.upsample4 = Upsample(64, 64)
self.upsample5 = Upsample(64, 32)
self.fs1 = Fusion(1024)
self.fs2 = Fusion(512)
self.fs3 = Fusion(256)
self.fs4 = Fusion(64)
self.fs5 = Fusion(64)
# self.out0 = self._classifier(2048)
# self.out1 = self._classifier(1024)
# self.out2 = self._classifier(512)
# self.out_e = self._classifier(256)
# self.out3 = self._classifier(64)
# self.out4 = self._classifier(64)
self.out5 = self._classifier(32)
self.transformer = nn.Conv2d(256, 64, kernel_size=1)
def _classifier(self, inplanes):
if inplanes == 32:
return nn.Sequential(
nn.Conv2d(inplanes, self.num_classes, 1),
nn.Conv2d(self.num_classes, self.num_classes,
kernel_size=3, padding=1)
)
return nn.Sequential(
nn.Conv2d(inplanes, inplanes // 2, 3, padding=1, bias=False),
nn.BatchNorm2d(inplanes // 2),
nn.ReLU(inplace=True),
nn.Dropout(.1),
nn.Conv2d(inplanes // 2, self.num_classes, 1),
)
def forward(self, x, conv_x, pool_x, fm1, fm2, fm3, fm4):
input = x
# out32 = self.out0(fm4)
fsfm1 = self.fs1(fm3, self.upsample1(fm4, fm3.size()[2:]))
# out16 = self.out1(fsfm1)
fsfm2 = self.fs2(fm2, self.upsample2(fsfm1, fm2.size()[2:]))
# out8 = self.out2(fsfm2)
fsfm3 = self.fs4(pool_x, self.upsample3(fsfm2, pool_x.size()[2:]))
# print(fsfm3.size())
# out4 = self.out3(fsfm3)
fsfm4 = self.fs5(conv_x, self.upsample4(fsfm3, conv_x.size()[2:]))
# out2 = self.out4(fsfm4)
fsfm5 = self.upsample5(fsfm4, input.size()[2:])
out = self.out5(fsfm5)
return out
class MFResClassifier(nn.Module):
def __init__(self, num_classes):
super(MFResClassifier, self).__init__()
self.num_classes = num_classes
self.upsample1 = Upsample(2048, 1024)
self.upsample2 = Upsample(1024, 512)
self.upsample3 = Upsample(512, 64)
self.upsample4 = Upsample(64, 64)
self.upsample5 = Upsample(64, 32)
self.fs1 = Fusion(1024)
self.fs2 = Fusion(512)
self.fs3 = Fusion(256)
self.fs4 = Fusion(64)
self.fs5 = Fusion(64)
self.out5 = self._classifier(32)
def _classifier(self, inplanes):
if inplanes == 32:
return nn.Sequential(
nn.Conv2d(inplanes, self.num_classes, 1),
nn.Conv2d(self.num_classes, self.num_classes,
kernel_size=3, padding=1)
)
return nn.Sequential(
nn.Conv2d(inplanes, inplanes // 2, 3, padding=1, bias=False),
nn.BatchNorm2d(inplanes // 2),
nn.ReLU(inplace=True),
nn.Dropout(.1),
nn.Conv2d(inplanes // 2, self.num_classes, 1),
)
def forward(self, gen_out_dic1, gen_out_dic2):
gen_out_dic1 = edict(gen_out_dic1)
gen_out_dic2 = edict(gen_out_dic2)
assert gen_out_dic1.img_size == gen_out_dic2.img_size
img_size = gen_out_dic1.img_size
conv_x = gen_out_dic1.conv_x + gen_out_dic2.conv_x
pool_x = gen_out_dic1.pool_x + gen_out_dic2.pool_x
fm2 = gen_out_dic1.fm2 + gen_out_dic2.fm2
fm3 = gen_out_dic1.fm3 + gen_out_dic2.fm3
fm4 = gen_out_dic1.fm4 + gen_out_dic2.fm4
fsfm1 = self.fs1(fm3, self.upsample1(fm4, fm3.size()[2:]))
fsfm2 = self.fs2(fm2, self.upsample2(fsfm1, fm2.size()[2:]))
fsfm3 = self.fs4(pool_x, self.upsample3(fsfm2, pool_x.size()[2:]))
fsfm4 = self.fs5(conv_x, self.upsample4(fsfm3, conv_x.size()[2:]))
fsfm5 = self.upsample5(fsfm4, img_size)
out = self.out5(fsfm5)
return out
class MFResClassifier2(nn.Module):
def __init__(self, num_classes):
super(MFResClassifier2, self).__init__()
self.num_classes = num_classes
self.upsample1 = Upsample(2048 * 2, 1024)
self.upsample2 = Upsample(1024, 512)
self.upsample3 = Upsample(512, 64)
self.upsample4 = Upsample(64, 64)
self.upsample5 = Upsample(64, 32)
self.fs1 = Fusion2(1024 * 2, 1024)
self.fs2 = Fusion2(512 * 2, 512)
self.fs3 = Fusion2(256 * 2, 256)
self.fs4 = Fusion2(64 * 2, 64)
self.fs5 = Fusion2(64 * 2, 64)
self.out5 = self._classifier(32)
def _classifier(self, inplanes):
if inplanes == 32:
return nn.Sequential(
nn.Conv2d(inplanes, self.num_classes, 1),
nn.Conv2d(self.num_classes, self.num_classes,
kernel_size=3, padding=1)
)
return nn.Sequential(
nn.Conv2d(inplanes, inplanes // 2, 3, padding=1, bias=False),
nn.BatchNorm2d(inplanes // 2),
nn.ReLU(inplace=True),
nn.Dropout(.1),
nn.Conv2d(inplanes // 2, self.num_classes, 1),
)
def forward(self, gen_out_dic1, gen_out_dic2):
gen_out_dic1 = edict(gen_out_dic1)
gen_out_dic2 = edict(gen_out_dic2)
assert gen_out_dic1.img_size == gen_out_dic2.img_size
img_size = gen_out_dic1.img_size
conv_x = torch.cat([gen_out_dic1.conv_x, gen_out_dic2.conv_x], 1)
pool_x = torch.cat([gen_out_dic1.pool_x, gen_out_dic2.pool_x], 1)
fm2 = torch.cat([gen_out_dic1.fm2, gen_out_dic2.fm2], 1)
fm3 = torch.cat([gen_out_dic1.fm3, gen_out_dic2.fm3], 1)
fm4 = torch.cat([gen_out_dic1.fm4, gen_out_dic2.fm4], 1)
# print (gen_out_dic1.conv_x.size())
# print (conv_x.size())
# print (fm2.size())
# print (fm3.size())
# print (fm4.size())
fsfm1 = self.fs1(fm3, self.upsample1(fm4, fm3.size()[2:]))
fsfm2 = self.fs2(fm2, self.upsample2(fsfm1, fm2.size()[2:]))
fsfm3 = self.fs4(pool_x, self.upsample3(fsfm2, pool_x.size()[2:]))
fsfm4 = self.fs5(conv_x, self.upsample4(fsfm3, conv_x.size()[2:]))
fsfm5 = self.upsample5(fsfm4, img_size)
out = self.out5(fsfm5)
return out
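The two multi-source classifiers differ only in how they merge the two backbone streams: MFResClassifier adds the feature maps elementwise (channel counts unchanged), while MFResClassifier2 concatenates along the channel axis with `torch.cat`, doubling every channel count — which is why its `Upsample`/`Fusion2` modules take twice the input channels of the single-stream version. A plain-Python bookkeeping sketch (dict names are illustrative only):

```python
# Channel counts per backbone output, as used by the decoders above.
single = {"conv_x": 64, "pool_x": 64, "fm2": 512, "fm3": 1024, "fm4": 2048}
summed = {k: c for k, c in single.items()}      # elementwise sum: unchanged
concat = {k: 2 * c for k, c in single.items()}  # torch.cat on dim 1: doubled
print(concat["fm4"])  # 4096, matching Upsample(2048 * 2, 1024)
```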

class ResClassifier_P(nn.Module):
    def __init__(self, num_classes):
        super(ResClassifier_P, self).__init__()
        self.num_classes = num_classes
        self.upsample1 = Upsample(2048, 1024)
        self.upsample2 = Upsample(1024, 512)
        self.upsample3 = Upsample(512, 64)
        self.upsample4 = Upsample(64, 64)
        self.upsample5 = Upsample(64, 32)
        self.fs1 = Fusion(1024)
        self.fs2 = Fusion(512)
        self.fs3 = Fusion(256)
        self.fs4 = Fusion(64)
        self.fs5 = Fusion(64)
        self.out0 = self._classifier(2048)
        self.out5 = self._classifier(32)
        self.transformer = nn.Conv2d(256, 64, kernel_size=1)

    def _classifier(self, inplanes):
        if inplanes == 32:
            return nn.Sequential(
                nn.Conv2d(inplanes, self.num_classes, 1),
                nn.Conv2d(self.num_classes, self.num_classes,
                          kernel_size=3, padding=1)
            )
        return nn.Sequential(
            nn.Conv2d(inplanes, inplanes // 2, 3, padding=1, bias=False),
            nn.BatchNorm2d(inplanes // 2),
            nn.ReLU(inplace=True),
            nn.Dropout(.1),
            nn.Conv2d(inplanes // 2, self.num_classes, 1),
        )

    def forward(self, x, conv_x, pool_x, fm1, fm2, fm3, fm4):
        input = x
        out32 = self.out0(fm4)
        fsfm1 = self.fs1(fm3, self.upsample1(fm4, fm3.size()[2:]))
        fsfm2 = self.fs2(fm2, self.upsample2(fsfm1, fm2.size()[2:]))
        fsfm3 = self.fs4(pool_x, self.upsample3(fsfm2, pool_x.size()[2:]))
        fsfm4 = self.fs5(conv_x, self.upsample4(fsfm3, conv_x.size()[2:]))
        fsfm5 = self.upsample5(fsfm4, input.size()[2:])
        out = self.out5(fsfm5)
        return out  # , out32

class ResBaseUP(nn.Module):
    def __init__(self, num_classes, layer='50'):
        super(ResBaseUP, self).__init__()
        self.num_classes = num_classes
        if layer == '50':
            print('resnet' + layer)
            resnet = extended_resnet.resnet50(pretrained=True)
        if layer == '101':
            print('resnet' + layer)
            resnet = extended_resnet.resnet101(pretrained=True)
        self.conv1 = resnet.conv1
        self.bn0 = resnet.bn1
        self.relu = resnet.relu
        self.maxpool = resnet.maxpool
        self.layer1 = resnet.layer1
        self.layer2 = resnet.layer2
        self.layer3 = resnet.layer3
        self.layer4 = resnet.layer4
        self.upsample1 = Upsample(2048, 1024)
        self.upsample2 = Upsample(1024, 512)
        self.upsample3 = Upsample(512, 64)
        self.upsample4 = Upsample(64, 64)
        self.upsample5 = Upsample(64, 32)
        self.fs1 = Fusion(1024)
        self.fs2 = Fusion(512)
        self.fs3 = Fusion(256)
        self.fs4 = Fusion(64)
        self.fs5 = Fusion(64)

    def forward(self, x):
        input = x
        x = self.conv1(x)
        x = self.bn0(x)
        x = self.relu(x)
        conv_x = x
        x = self.maxpool(x)
        pool_x = x
        fm1 = self.layer1(x)
        fm2 = self.layer2(fm1)
        fm3 = self.layer3(fm2)
        fm4 = self.layer4(fm3)
        fsfm1 = self.fs1(fm3, self.upsample1(fm4, fm3.size()[2:]))
        fsfm2 = self.fs2(fm2, self.upsample2(fsfm1, fm2.size()[2:]))
        fsfm3 = self.fs4(pool_x, self.upsample3(fsfm2, pool_x.size()[2:]))
        fsfm4 = self.fs5(conv_x, self.upsample4(fsfm3, conv_x.size()[2:]))
        fsfm5 = self.upsample5(fsfm4, input.size()[2:])
        return fsfm5

class ResClassifierUP(nn.Module):
    def __init__(self, num_classes):
        super(ResClassifierUP, self).__init__()
        self.num_classes = num_classes
        self.out5 = self._classifier(32)

    def _classifier(self, inplanes):
        if inplanes == 32:
            return nn.Sequential(
                nn.Conv2d(inplanes, self.num_classes, 1),
                nn.Conv2d(self.num_classes, self.num_classes,
                          kernel_size=3, padding=1)
            )
        return nn.Sequential(
            nn.Conv2d(inplanes, inplanes // 2, 3, padding=1, bias=False),
            nn.BatchNorm2d(inplanes // 2),
            nn.ReLU(inplace=True),
            nn.Dropout(.1),
            nn.Conv2d(inplanes // 2, self.num_classes, 1),
        )

    def forward(self, fsfm5):
        out = self.out5(fsfm5)
        return out


# ---- server/config.py (repo: ChidinmaKO/Omo-Bookstore-App, license: MIT) ----

import os
class Config:
    SECRET_KEY = "a26385d494a3303155d3a2bd88f9af0a5decfd38a6201c130bc62af974f5a9ff"


# ---- make_LxLy.py (repo: ryuikaneko/itps_contraction, license: MIT) ----

#!/usr/bin/env python
from __future__ import print_function
import numpy as np
import scipy as scipy
import argparse

def parse_args():
    parser = argparse.ArgumentParser(description='make tensors')
    parser.add_argument('-Lx', metavar='Lx', dest='Lx', type=int, default=2, help='set Lx')
    parser.add_argument('-Ly', metavar='Ly', dest='Ly', type=int, default=2, help='set Ly')
    return parser.parse_args()

def make_tensors(Lx, Ly):
    print("# Lx x Ly site contraction (Lx="+str(Lx)+", Ly="+str(Ly)+")")
    print("")
    print("## def Contract_scalar_"+str(Lx)+"x"+str(Ly)+"(\\")
    for y in range(Ly+1, -1, -1):
        print("## ", end="")
        for x in range(0, Lx+2):
            print("t"+str(x)+"_"+str(y), end="")
            print(",", end="")
        print("\\")
    for y in range(Ly, 0, -1):
        print("## ", end="")
        for x in range(1, Lx+1):
            print("o"+str(x)+"_"+str(y), end="")
            if x != Lx or y != 1:
                print(",", end="")
        print("\\")
    print("## ):")
    print("")
    ## 2
    ## |
    ## LB--1
    x = 0
    y = 0
    print("tensor", end=" ")
    print("t"+str(x)+"_"+str(y), end=" ")
    print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
    print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
    print("")
    ## 1
    ## |
    ## 2--RB
    x = Lx+1
    y = 0
    print("tensor", end=" ")
    print("t"+str(x)+"_"+str(y), end=" ")
    print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
    print("b"+"t"+str(x-1)+"_"+str(y)+"t"+str(x)+"_"+str(y), end=" ")
    print("")
    ## LT--2
    ## |
    ## 1
    x = 0
    y = Ly+1
    print("tensor", end=" ")
    print("t"+str(x)+"_"+str(y), end=" ")
    print("b"+"t"+str(x)+"_"+str(y-1)+"t"+str(x)+"_"+str(y), end=" ")
    print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
    print("")
    ## 1--RT
    ##     |
    ##     2
    x = Lx+1
    y = Ly+1
    print("tensor", end=" ")
    print("t"+str(x)+"_"+str(y), end=" ")
    print("b"+"t"+str(x-1)+"_"+str(y)+"t"+str(x)+"_"+str(y), end=" ")
    print("b"+"t"+str(x)+"_"+str(y-1)+"t"+str(x)+"_"+str(y), end=" ")
    print("")
    ## 4(c) 3
    ##    \ /
    ##     v
    ##  2--B--1
    y = 0
    for x in range(1, Lx+1):
        print("tensor", end=" ")
        print("t"+str(x)+"_"+str(y), end=" ")
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
        print("b"+"t"+str(x-1)+"_"+str(y)+"t"+str(x)+"_"+str(y), end=" ")
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
        print("bc"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
        print("")
    ## 2  4(c)
    ## | /
    ## L<
    ## | \
    ## 1  3
    x = 0
    for y in range(1, Ly+1):
        print("tensor", end=" ")
        print("t"+str(x)+"_"+str(y), end=" ")
        print("b"+"t"+str(x)+"_"+str(y-1)+"t"+str(x)+"_"+str(y), end=" ")
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
        print("bc"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
        print("")
    ## 1--T--2
    ##    ^
    ##   / \
    ##  3   4(c)
    y = Ly+1
    for x in range(1, Lx+1):
        print("tensor", end=" ")
        print("t"+str(x)+"_"+str(y), end=" ")
        print("b"+"t"+str(x-1)+"_"+str(y)+"t"+str(x)+"_"+str(y), end=" ")
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
        print("b"+"t"+str(x)+"_"+str(y-1)+"t"+str(x)+"_"+str(y), end=" ")
        print("bc"+"t"+str(x)+"_"+str(y-1)+"t"+str(x)+"_"+str(y), end=" ")
        print("")
    ##    3 1
    ##     \|
    ##     >R
    ##     /|
    ## 4(c) 2
    x = Lx+1
    for y in range(1, Ly+1):
        print("tensor", end=" ")
        print("t"+str(x)+"_"+str(y), end=" ")
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
        print("b"+"t"+str(x)+"_"+str(y-1)+"t"+str(x)+"_"+str(y), end=" ")
        print("b"+"t"+str(x-1)+"_"+str(y)+"t"+str(x)+"_"+str(y), end=" ")
        print("bc"+"t"+str(x-1)+"_"+str(y)+"t"+str(x)+"_"+str(y), end=" ")
        print("")
    ## mid t
    ##
    ##    2
    ##    |
    ## 1--T--3
    ##    |\
    ##    4 5
    for y in range(1, Ly+1):
        for x in range(1, Lx+1):
            print("tensor", end=" ")
            print("t"+str(x)+"_"+str(y), end=" ")
            print("b"+"t"+str(x-1)+"_"+str(y)+"t"+str(x)+"_"+str(y), end=" ")
            print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
            print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
            print("b"+"t"+str(x)+"_"+str(y-1)+"t"+str(x)+"_"+str(y), end=" ")
            print("m"+str(x)+"_"+str(y), end=" ")
            print("")
    ## mid tc
    ##
    ##    2
    ##    |
    ## 1--T--3
    ##    |\
    ##    4 5
    for y in range(1, Ly+1):
        for x in range(1, Lx+1):
            print("tensor", end=" ")
            print("t"+str(x)+"_"+str(y)+"_conj", end=" ")
            print("bc"+"t"+str(x-1)+"_"+str(y)+"t"+str(x)+"_"+str(y), end=" ")
            print("bc"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
            print("bc"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
            print("bc"+"t"+str(x)+"_"+str(y-1)+"t"+str(x)+"_"+str(y), end=" ")
            print("mc"+str(x)+"_"+str(y), end=" ")
            print("")
    ## mid o
    ##
    ## 1
    ## |
    ## o
    ## |
    ## 2(c)
    for y in range(1, Ly+1):
        for x in range(1, Lx+1):
            print("tensor", end=" ")
            print("o"+str(x)+"_"+str(y), end=" ")
            print("m"+str(x)+"_"+str(y), end=" ")
            print("mc"+str(x)+"_"+str(y), end=" ")
            print("")
    print("")
    ## bdim B
    print("bond_dim", end=" ")
    print("100", end=" ")
    y = 0
    for x in range(0, Lx+1):
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
    print("")
    ## bdim L
    print("bond_dim", end=" ")
    print("100", end=" ")
    x = 0
    for y in range(0, Ly+1):
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
    print("")
    ## bdim T
    print("bond_dim", end=" ")
    print("100", end=" ")
    y = Ly+1
    for x in range(0, Lx+1):
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
    print("")
    ## bdim R
    print("bond_dim", end=" ")
    print("100", end=" ")
    x = Lx+1
    for y in range(0, Ly+1):
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
    print("")
    ## bdim B2
    print("bond_dim", end=" ")
    print("10", end=" ")
    y = 0
    for x in range(1, Lx+1):
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
        print("bc"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
    print("")
    ## bdim L2
    print("bond_dim", end=" ")
    print("10", end=" ")
    x = 0
    for y in range(1, Ly+1):
        print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
        print("bc"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
    print("")
    ## bdim mid t
    for y in range(1, Ly+1):
        for x in range(1, Lx+1):
            print("bond_dim", end=" ")
            print("10", end=" ")
            print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
            print("bc"+"t"+str(x)+"_"+str(y)+"t"+str(x+1)+"_"+str(y), end=" ")
            print("b"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
            print("bc"+"t"+str(x)+"_"+str(y)+"t"+str(x)+"_"+str(y+1), end=" ")
            print("")
    ## bdim mid o
    for y in range(1, Ly+1):
        for x in range(1, Lx+1):
            print("bond_dim", end=" ")
            print("2", end=" ")
            print("m"+str(x)+"_"+str(y), end=" ")
            print("mc"+str(x)+"_"+str(y), end=" ")
            print("")

def main():
    args = parse_args()
    Lx = args.Lx
    Ly = args.Ly
    make_tensors(Lx, Ly)


if __name__ == "__main__":
    main()


# ---- src/plot.py (repo: wyxzou/Federated-Learning-PyTorch, license: MIT) ----

import pickle
import numpy as np
import matplotlib.pyplot as plt

def plot_graphs(x, y, title, xlabel, ylabel, legend_labels):
    plt.figure()
    n = len(x)
    for i in range(n):
        plt.plot(x[i], y[i], label=legend_labels[i])
    plt.title(title)
    plt.xlabel(xlabel)
    plt.ylabel(ylabel)
    plt.legend()
    plt.show()
    # plt.savefig("100 workers")

def plot_iid_sparsetopk_baseline():
    dataset = "mnist"
    model = "mlp"
    epochs = 60
    num_users = 1
    frac = "1.0"
    iid = 1
    local_bs = 10
    optimizer = "sparsetopk"
    lrs = [0.01, 0.005, 0.001]  # 0.0005]
    bidirectional = 1
    numbers = [1, 2, 3, 4, 5]
    plot_files(dataset, model, epochs, num_users, frac, iid, local_bs, optimizer, lrs, bidirectional, numbers, "dir1")
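# plot_files is defined elsewhere in this repo; the loaders below in this module
# all build their pickle paths from one shared .format template, so presumably
# plot_files does the same. A quick illustration of the resulting file name,
# using the parameter values from the baseline function above:

```python
# The naming template shared by the experiment loaders in this module.
template = '{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl'

name = template.format("mnist", "mlp", 60, 1, "1.0", 1, 10, "sparsetopk", 0.01, 1, 1)
print(name)
# -> mnist_mlp_EPOCH[60]_USERS[1]_C[1.0]_iid[1]_B[10]_OPT[sparsetopk]_LR[0.01]_DIR[1]_NUM[1].pkl
```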

def plot_iid_sgd_baseline():
    dataset = "mnist"
    model = "mlp"
    epochs = 60
    num_users = 1
    frac = "1.0"
    iid = 1
    local_bs = 10
    optimizer = "sgd"
    lrs = [0.01, 0.005, 0.001]  # 0.0005]
    bidirectional = 1
    numbers = [1, 2, 3]
    plot_files(dataset, model, epochs, num_users, frac, iid, local_bs, optimizer, lrs, bidirectional, numbers, "sgd")

def plot_users_100_dir_1_sparsetopk():
    dataset = "mnist"
    model = "mlp"
    epochs = 80
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10
    optimizer = "sparsetopk"
    lrs = [0.01, 0.005]
    bidirectional = 1
    numbers = [1]
    plot_files(dataset, model, epochs, num_users, frac, iid, local_bs, optimizer, lrs, bidirectional, numbers, "bidir_vs_unidir2")


def plot_users_100_dir_1_sparsetopk_batch_64():
    dataset = "mnist"
    model = "mlp"
    epochs = 80
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 64
    optimizer = "sparsetopk"
    lrs = [0.01, 0.005, 0.001]
    bidirectional = 1
    numbers = [1]
    plot_files(dataset, model, epochs, num_users, frac, iid, local_bs, optimizer, lrs, bidirectional, numbers, "bidir_vs_unidir2")


def plot_users_100_dir_0_sparsetopk():
    dataset = "mnist"
    model = "mlp"
    epochs = 80
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10
    optimizer = "sparsetopk"
    lrs = [0.01, 0.005, 0.001]
    bidirectional = 0
    numbers = [1]
    plot_files(dataset, model, epochs, num_users, frac, iid, local_bs, optimizer, lrs, bidirectional, numbers, "bidir_vs_unidir2")


def plot_users_100_dir_0_sparsetopk_batch_64():
    dataset = "mnist"
    model = "mlp"
    epochs = 80
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 64
    optimizer = "sparsetopk"
    lrs = [0.01, 0.005, 0.001]
    bidirectional = 0
    numbers = [1]
    plot_files(dataset, model, epochs, num_users, frac, iid, local_bs, optimizer, lrs, bidirectional, numbers, "bidir_vs_unidir2")


def plot_users_100_sgd_batch_64():
    dataset = "mnist"
    model = "mlp"
    epochs = 80
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 64
    optimizer = "sgd"
    lrs = [0.01, 0.005, 0.001]
    bidirectional = 0
    numbers = [1]
    plot_files(dataset, model, epochs, num_users, frac, iid, local_bs, optimizer, lrs, bidirectional, numbers, "bidir_vs_unidir2")


def plot_users_100_sgd():
    dataset = "mnist"
    model = "mlp"
    epochs = 80
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10
    optimizer = "sgd"
    lrs = [0.01, 0.005, 0.001]
    bidirectional = 0
    numbers = [1]
    plot_files(dataset, model, epochs, num_users, frac, iid, local_bs, optimizer, lrs, bidirectional, numbers, "bidir_vs_unidir2")

def compare_baselines():
    dataset = "mnist"
    model = "mlp"
    epochs = 60
    num_users = 1
    frac = "1.0"
    iid = 1
    local_bs = 10

    all_experiments = []

    experiments1 = []
    for number in [1, 2, 3]:
        file_name1 = '../tuning/dir1/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.005, 1, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    experiments2 = []
    for number in [1, 2, 3]:
        file_name2 = '../tuning/sgd/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sgd", 0.01, 1, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    plot_graphs([list(range(epochs))] * 2, all_experiments, "", "Epoch", "Accuracy", ["sparsetopk", "sgd"])

def sgd_results():
    dataset = "mnist"
    model = "mlp"
    epochs = 80
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10

    all_experiments = []

    experiments1 = []
    for number in [1, 2, 3]:
        file_name1 = '{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sgd", 0.01, 0, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    plot_graphs([list(range(epochs))], all_experiments, "", "Epoch", "Accuracy", ["sgd"])

def compare_mnist():
    dataset = "mnist"
    model = "mlp"
    epochs = 100
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10

    all_experiments = []

    experiments1 = []
    for number in [1, 2, 3, 4, 5]:
        file_name1 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.01, 0, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    experiments3 = []
    for number in [1, 2, 3, 4, 5]:
        file_name3 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.01, 1, number)
        with open(file_name3, 'rb') as pickle_file:
            experiments3.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments3)[:, 3], axis=0))

    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sgd", 0.01, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    plot_graphs([list(range(epochs))] * 3, all_experiments, "", "Epoch", "Accuracy", ["sparsetopk unidirectional", "sparsetopk bidirectional", "sgd"])

def compare_worker():
    dataset = "mnist"
    model = "mlp"
    epochs = 80
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10

    all_experiments = []

    experiments1 = []
    for number in [1, 2, 3]:
        file_name1 = '../100baseline/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sgd", 0.01, 0, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    experiments3 = []
    for number in [2, 3]:
        file_name3 = '../100baseline/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 100, frac, iid,
                    local_bs, "sgd", 0.01, 0, number)
        with open(file_name3, 'rb') as pickle_file:
            experiments3.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments3)[:, 3], axis=0))

    # experiments2 = []
    # for number in [1]:
    #     file_name2 = '../bidir_vs_unidir2/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
    #         .format(dataset, model, epochs, num_users, frac, iid,
    #                 local_bs, "sgd", 0.005, 0, number)
    #     with open(file_name2, 'rb') as pickle_file:
    #         experiments2.append(pickle.load(pickle_file))
    # all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    plot_graphs([list(range(epochs))] * 2, all_experiments, "", "Epoch", "Accuracy", ["20 users", "100 users"])

def compare_batch_size():
    dataset = "mnist"
    model = "mlp"
    epochs = 80
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10

    all_experiments = []

    experiments1 = []
    for number in [1]:
        file_name1 = '../bidir_vs_unidir2/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.005, 0, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    experiments3 = []
    for number in [1]:
        file_name3 = '../bidir_vs_unidir2/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.005, 1, number)
        with open(file_name3, 'rb') as pickle_file:
            experiments3.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments3)[:, 3], axis=0))

    experiments2 = []
    for number in [1]:
        file_name2 = '../bidir_vs_unidir2/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sgd", 0.005, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    # The first curve was loaded with DIR=0 (unidirectional) and the second with
    # DIR=1 (bidirectional), so the labels follow that order.
    plot_graphs([list(range(epochs))] * 3, all_experiments, "", "Epoch", "Accuracy", ["sparsetopk unidirectional", "sparsetopk bidirectional", "sgd"])

def train_learning_rate():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10

    all_experiments = []

    experiments1 = []
    for number in [1]:
        file_name1 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sparsetopk", 0.002, 1, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sparsetopk", 0.005, 1, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    experiments3 = []
    for number in [1]:
        file_name3 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sparsetopk", 0.008, 1, number)
        with open(file_name3, 'rb') as pickle_file:
            experiments3.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments3)[:, 3], axis=0))

    experiments4 = []
    for number in [1]:
        file_name4 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sparsetopk", 0.01, 1, number)
        with open(file_name4, 'rb') as pickle_file:
            experiments4.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments4)[:, 3], axis=0))

    # Legend labels match the learning rates actually loaded above.
    plot_graphs([list(range(epochs))] * 4, all_experiments, "", "Epoch", "Accuracy", ["0.002", "0.005", "0.008", "0.01"])

def fminist_baseline_unidirectional():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10

    all_experiments = []

    experiments1 = []
    for number in [1]:
        file_name1 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.01, 0, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    experiments3 = []
    for number in [1]:
        file_name3 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.008, 0, number)
        with open(file_name3, 'rb') as pickle_file:
            experiments3.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments3)[:, 3], axis=0))

    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.005, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    experiments4 = []
    for number in [1]:
        file_name4 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.002, 0, number)
        with open(file_name4, 'rb') as pickle_file:
            experiments4.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments4)[:, 3], axis=0))

    plot_graphs([list(range(epochs))] * 4, all_experiments, "Tuning Unidirectional TopK on Learning Rates", "Epoch", "Accuracy", ["0.01", "0.008", "0.005", "0.002"])

def sgd_baseline_one_worker():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10

    all_experiments = []

    experiments1 = []
    for number in [1]:
        file_name1 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sgd", 0.02, 0, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sgd", 0.01, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    experiments3 = []
    for number in [1]:
        file_name3 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sgd", 0.002, 0, number)
        with open(file_name3, 'rb') as pickle_file:
            experiments3.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments3)[:, 3], axis=0))

    experiments4 = []
    for number in [1]:
        file_name4 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sgd", 0.005, 0, number)
        with open(file_name4, 'rb') as pickle_file:
            experiments4.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments4)[:, 3], axis=0))

    # Legend labels match the load order above: 0.02, 0.01, 0.002, 0.005.
    plot_graphs([list(range(epochs))] * 4, all_experiments, "", "Epoch", "Accuracy", ["0.02", "0.01", "0.002", "0.005"])

def sgd_baseline_20_worker():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10

    all_experiments = []

    experiments1 = []
    for number in [1]:
        file_name1 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sgd", 0.02, 0, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sgd", 0.01, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    plot_graphs([list(range(epochs))] * 2, all_experiments, "Tuning SGD on Learning Rate (Twenty Workers)", "Epoch", "Accuracy", ["0.02", "0.01"])

def plot_sparsity():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10
    topk = 0.001
    lr = 0.01

    all_experiments = []

    experiments1 = []
    for number in [1]:
        file_name1 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", lr, 1, topk, 0.01, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", lr, 1, topk, 0.1, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    # Two curves are plotted, distinguished by the downstream sparsity TOPKD.
    plot_graphs([list(range(epochs))] * 2, all_experiments, "", "Epoch", "Accuracy", ["topk_d=0.01", "topk_d=0.1"])

def bidirectional_sparsetopk():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10
    topk = 0.001
    topk_d = 0.001

    all_experiments = []

    # experiments2 = []
    # for number in [1]:
    #     file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
    #         .format(dataset, model, epochs, 20, frac, iid,
    #                 local_bs, "sparsetopk", 0.01, 1, topk, topk_d, number)
    #     with open(file_name2, 'rb') as pickle_file:
    #         experiments2.append(pickle.load(pickle_file))
    # all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    experiments1 = []
    for number in [1]:
        file_name1 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.02, 1, topk, topk_d, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))

    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.05, 1, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))

    experiments3 = []
    for number in [1]:
        file_name3 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.06, 1, topk, topk_d, number)
        with open(file_name3, 'rb') as pickle_file:
            experiments3.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments3)[:, 3], axis=0))

    # Earlier runs over '../save/inter_batch_communication/' with learning rates
    # 0.04 and 0.05 were loaded here as well but are currently disabled.

    plot_graphs([list(range(epochs))] * 3, all_experiments, "Tuning Bidirectional TopK on Learning Rate (Twenty Workers)", "Epoch", "Accuracy", ["0.02", "0.05", "0.06"])
def bidirectional_sparsetopk_different_sparsity():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10
    all_experiments = []
    topk = 0.001
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 1, topk, 0.01, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 1, topk, 0.1, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 1, topk, 0.25, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 1, topk, 0.5, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 1, topk, 0.75, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 1, topk, "1.0", number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    # Legend entries follow the order the experiments were loaded:
    # downstream top-k ratios 0.01, 0.1, 0.25, 0.5, 0.75, 1.0.
    plot_graphs([list(range(epochs))] * 6, all_experiments, "Effect of Different Sparsity on Bidirectional TopK", "Epoch", "Accuracy", ["0.01", "0.1", "0.25", "0.5", "0.75", "1"])
def unidirectional_sparsetopk_different_sparsity():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10
    all_experiments = []
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sgd", 0.01, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 0, 0.1, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 0, 0.01, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    plot_graphs([list(range(epochs))] * 4, all_experiments, "Effect of Sparsity on Unidirectional TopK", "Epoch", "Accuracy", ["1", "0.1", "0.01", "0.001"])
def compare1_to_20_worker():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10
    all_experiments = []
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sgd", 0.01, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sgd", 0.02, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    plot_graphs([list(range(epochs))] * 2, all_experiments, "Comparison Between Different Numbers of Workers", "Epoch", "Accuracy", ["1 worker (lr = 0.01)", "20 workers (lr = 0.02)"])
def compare_fmnist():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 20
    frac = "1.0"
    iid = 1
    local_bs = 10
    topk = 0.001
    topk_d = 0.001
    all_experiments = []
    experiments1 = []
    for number in [1]:
        file_name1 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.01, 0, number)
        with open(file_name1, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, num_users, frac, iid,
                    local_bs, "sparsetopk", 0.06, 1, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sgd", 0.02, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sgd", 0.01, 0, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    plot_graphs([list(range(epochs))] * 3, all_experiments, "", "Epoch", "Accuracy", ["sparsetopk unidirectional", "sparsetopk bidirectional", "sgd"])
def plot_compression():
    dataset = "fmnist"
    model = "mlp"
    epochs = 100
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10
    all_experiments = []
    experiments1 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sparsetopk", 0.002, 1, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments1.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments1)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sparsetopk", 0.005, 1, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments3 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sparsetopk", 0.008, 1, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments3.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments3)[:, 3], axis=0))
    experiments4 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 1, frac, iid,
                    local_bs, "sparsetopk", 0.01, 1, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments4.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments4)[:, 3], axis=0))
    # Legend entries match the learning rates loaded above.
    plot_graphs([list(range(epochs))] * 4, all_experiments, "", "Epoch", "Accuracy", ["0.002", "0.005", "0.008", "0.01"])
def plot_xi_values():
    dataset = "fmnist"
    model = "mlp"
    epochs = 20
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10
    topk = 0.001
    topk_d = 0.001
    all_experiments = []
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.06, 1, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 4], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 0, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 4], axis=0))
    batches = len(all_experiments[0])
    plot_graphs([list(range(batches))] * 2, all_experiments, "", "Batch", r"$\xi$", ["bidirectional", "unidirectional"])
def plot_compression_values():
    dataset = "fmnist"
    model = "mlp"
    epochs = 20
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10
    topk = 0.001
    topk_d = 0.001
    all_experiments = []
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.06, 1, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(1 - np.mean(np.array(experiments2)[:, 5], axis=0))
    batches = len(all_experiments[0])
    plot_graphs([list(range(batches))] * 1, all_experiments, "Percentage of Gradient Compressed Downstream", "Batch", "", ["downstream"])
def plot_compression_values_upstream():
    dataset = "fmnist"
    model = "mlp"
    epochs = 20
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10
    topk = 0.001
    topk_d = 0.001
    all_experiments = []
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 0, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(1 - np.mean(np.array(experiments2)[:, 5], axis=0))
    batches = len(all_experiments[0])
    plot_graphs([list(range(batches))] * 1, all_experiments, "Percentage of Gradient Compressed Upstream", "Batch", "", ["upstream"])
def plot_delta():
    dataset = "fmnist"
    model = "mlp"
    epochs = 20
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10
    topk = 0.001
    topk_d = 0.001
    all_experiments = []
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 0, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 6], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.06, 1, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 6], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.06, 1, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 7], axis=0))
    batches = len(all_experiments[0])
    plot_graphs([list(range(batches))] * 3, all_experiments, "", "Batch", r"$1 - \gamma$", ["unidirectional upstream", "bidirectional upstream", "bidirectional downstream"])
def plot_cnn():
    dataset = "fmnist"
    model = "cnn"
    epochs = 20
    num_users = 100
    frac = "1.0"
    iid = 1
    local_bs = 10
    topk = 0.001
    topk_d = 0.001
    all_experiments = []
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/bidir_fix/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 1, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    experiments2 = []
    for number in [1]:
        file_name2 = '../save/inter_batch_communication/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_TOPK[{}]_TOPKD[{}]_NUM[{}].pkl' \
            .format(dataset, model, epochs, 20, frac, iid,
                    local_bs, "sparsetopk", 0.01, 0, topk, topk_d, number)
        with open(file_name2, 'rb') as pickle_file:
            experiments2.append(pickle.load(pickle_file))
    all_experiments.append(np.mean(np.array(experiments2)[:, 3], axis=0))
    plot_graphs([list(range(epochs))] * 2, all_experiments, "CNN; 20 Users", "Epoch", "Accuracy", ["bidirectional", "unidirectional"])
def plot_files(dataset, model, epochs, num_users, frac, iid, local_bs, optimizer, lrs, bidirectional, numbers, folder):
    avg_experiments = []
    for lr in lrs:
        experiments = []
        for number in numbers:
            file_name = '../{}/{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl' \
                .format(folder, dataset, model, epochs, num_users, frac, iid,
                        local_bs, optimizer, lr, bidirectional, number)
            with open(file_name, 'rb') as pickle_file:
                experiments.append(pickle.load(pickle_file))
        avg_experiments.append(np.mean(np.array(experiments)[:, 3], axis=0))
    plot_graphs([list(range(epochs))] * len(lrs), avg_experiments, "Percentage of Gradient Compressed Upstream", "Epoch", "Accuracy", lrs)
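# A minimal, self-contained sketch (with hypothetical values, not results from a
# real run) of how the result-file naming template used throughout this module
# expands; the bracketed fields make it easy to cross-check a run's
# hyperparameters against the pickle files on disk.
EXAMPLE_TEMPLATE = ('{}_{}_EPOCH[{}]_USERS[{}]_C[{}]_iid[{}]_B[{}]'
                    '_OPT[{}]_LR[{}]_DIR[{}]_NUM[{}].pkl')
EXAMPLE_NAME = EXAMPLE_TEMPLATE.format("fmnist", "mlp", 100, 20, "1.0", 1, 10,
                                       "sgd", 0.01, 0, 1)
# EXAMPLE_NAME == 'fmnist_mlp_EPOCH[100]_USERS[20]_C[1.0]_iid[1]_B[10]_OPT[sgd]_LR[0.01]_DIR[0]_NUM[1].pkl'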
if __name__ == "__main__":
    bidirectional_sparsetopk_different_sparsity()
    # compare_fmnist()
    # train_learning_rate()
    # plot_users_100_dir_1_sparsetopk_batch_64()
    # plot_users_100_sgd_batch_64()
    # plot_users_100_sgd()
    # plot_users_100_dir_1_sparsetopk()
    # compare_results()
4aa0381cf910ffef846ad8394c86f68340f78aa6 | 25,853 | py | Python | tests/schema/product/mutation/product_type/snapshots/snap_test_update_product_type.py | simonsobs/acondbs | 6ca11c2889d827ecdb2b54d0cf3b94b8cdd281e6 | ["MIT"] | null | null | null | tests/schema/product/mutation/product_type/snapshots/snap_test_update_product_type.py | simonsobs/acondbs | 6ca11c2889d827ecdb2b54d0cf3b94b8cdd281e6 | ["MIT"] | 24 | 2020-04-02T19:29:07.000Z | 2022-03-08T03:05:43.000Z | tests/schema/product/mutation/product_type/snapshots/snap_test_update_product_type.py | simonsobs/acondbs | 6ca11c2889d827ecdb2b54d0cf3b94b8cdd281e6 | ["MIT"] | 1 | 2020-04-08T15:48:28.000Z | 2020-04-08T15:48:28.000Z |
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['test_schema_error[error-nonexistent] 1'] = {
    'data': {
        'allProductTypes': {
            'edges': [
                {
                    'node': {
                        'fields': {
                            'edges': [
                                {
                                    'node': {
                                        'field': {
                                            'name': 'contact',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'produced_by',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'date_produced',
                                            'type_': 'DATE'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                }
                            ]
                        },
                        'icon': 'mdi-spotlight-beam',
                        'indefArticle': 'a',
                        'name': 'beam',
                        'order': 1,
                        'plural': 'beams',
                        'products': {
                            'edges': [
                            ]
                        },
                        'singular': 'beam',
                        'typeId': '2'
                    }
                },
                {
                    'node': {
                        'fields': {
                            'edges': [
                                {
                                    'node': {
                                        'field': {
                                            'name': 'contact',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'map'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'produced_by',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'map'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'date_produced',
                                            'type_': 'DATE'
                                        },
                                        'type_': {
                                            'name': 'map'
                                        }
                                    }
                                }
                            ]
                        },
                        'icon': 'mdi-map',
                        'indefArticle': 'a',
                        'name': 'map',
                        'order': 2,
                        'plural': 'maps',
                        'products': {
                            'edges': [
                                {
                                    'node': {
                                        'name': 'map1'
                                    }
                                },
                                {
                                    'node': {
                                        'name': 'map2'
                                    }
                                },
                                {
                                    'node': {
                                        'name': 'map3'
                                    }
                                }
                            ]
                        },
                        'singular': 'map',
                        'typeId': '1'
                    }
                }
            ],
            'totalCount': 2
        }
    }
}
snapshots['test_schema_success[empty-fields] 1'] = {
    'data': {
        'updateProductType': {
            'ok': True,
            'productType': {
                'fields': {
                    'edges': [
                    ]
                },
                'icon': 'mdi-compass',
                'indefArticle': 'a',
                'name': 'compass',
                'order': 5,
                'plural': 'compasses',
                'products': {
                    'edges': [
                        {
                            'node': {
                                'name': 'map1'
                            }
                        },
                        {
                            'node': {
                                'name': 'map2'
                            }
                        },
                        {
                            'node': {
                                'name': 'map3'
                            }
                        }
                    ]
                },
                'singular': 'compass',
                'typeId': '1'
            }
        }
    }
}
snapshots['test_schema_success[empty-fields] 2'] = {
    'data': {
        'allProductTypes': {
            'edges': [
                {
                    'node': {
                        'fields': {
                            'edges': [
                                {
                                    'node': {
                                        'field': {
                                            'name': 'contact',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'produced_by',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'date_produced',
                                            'type_': 'DATE'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                }
                            ]
                        },
                        'icon': 'mdi-spotlight-beam',
                        'indefArticle': 'a',
                        'name': 'beam',
                        'order': 1,
                        'plural': 'beams',
                        'products': {
                            'edges': [
                            ]
                        },
                        'singular': 'beam',
                        'typeId': '2'
                    }
                },
                {
                    'node': {
                        'fields': {
                            'edges': [
                            ]
                        },
                        'icon': 'mdi-compass',
                        'indefArticle': 'a',
                        'name': 'compass',
                        'order': 5,
                        'plural': 'compasses',
                        'products': {
                            'edges': [
                                {
                                    'node': {
                                        'name': 'map1'
                                    }
                                },
                                {
                                    'node': {
                                        'name': 'map2'
                                    }
                                },
                                {
                                    'node': {
                                        'name': 'map3'
                                    }
                                }
                            ]
                        },
                        'singular': 'compass',
                        'typeId': '1'
                    }
                }
            ],
            'totalCount': 2
        }
    }
}
snapshots['test_schema_success[fields-unchanged] 1'] = {
    'data': {
        'updateProductType': {
            'ok': True,
            'productType': {
                'fields': {
                    'edges': [
                        {
                            'node': {
                                'field': {
                                    'name': 'contact',
                                    'type_': 'UNICODE_TEXT'
                                },
                                'type_': {
                                    'name': 'compass'
                                }
                            }
                        },
                        {
                            'node': {
                                'field': {
                                    'name': 'produced_by',
                                    'type_': 'UNICODE_TEXT'
                                },
                                'type_': {
                                    'name': 'compass'
                                }
                            }
                        },
                        {
                            'node': {
                                'field': {
                                    'name': 'date_produced',
                                    'type_': 'DATE'
                                },
                                'type_': {
                                    'name': 'compass'
                                }
                            }
                        }
                    ]
                },
                'icon': 'mdi-compass',
                'indefArticle': 'a',
                'name': 'compass',
                'order': 5,
                'plural': 'compasses',
                'products': {
                    'edges': [
                        {
                            'node': {
                                'name': 'map1'
                            }
                        },
                        {
                            'node': {
                                'name': 'map2'
                            }
                        },
                        {
                            'node': {
                                'name': 'map3'
                            }
                        }
                    ]
                },
                'singular': 'compass',
                'typeId': '1'
            }
        }
    }
}
snapshots['test_schema_success[fields-unchanged] 2'] = {
    'data': {
        'allProductTypes': {
            'edges': [
                {
                    'node': {
                        'fields': {
                            'edges': [
                                {
                                    'node': {
                                        'field': {
                                            'name': 'contact',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'produced_by',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'date_produced',
                                            'type_': 'DATE'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                }
                            ]
                        },
                        'icon': 'mdi-spotlight-beam',
                        'indefArticle': 'a',
                        'name': 'beam',
                        'order': 1,
                        'plural': 'beams',
                        'products': {
                            'edges': [
                            ]
                        },
                        'singular': 'beam',
                        'typeId': '2'
                    }
                },
                {
                    'node': {
                        'fields': {
                            'edges': [
                                {
                                    'node': {
                                        'field': {
                                            'name': 'contact',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'compass'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'produced_by',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'compass'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'date_produced',
                                            'type_': 'DATE'
                                        },
                                        'type_': {
                                            'name': 'compass'
                                        }
                                    }
                                }
                            ]
                        },
                        'icon': 'mdi-compass',
                        'indefArticle': 'a',
                        'name': 'compass',
                        'order': 5,
                        'plural': 'compasses',
                        'products': {
                            'edges': [
                                {
                                    'node': {
                                        'name': 'map1'
                                    }
                                },
                                {
                                    'node': {
                                        'name': 'map2'
                                    }
                                },
                                {
                                    'node': {
                                        'name': 'map3'
                                    }
                                }
                            ]
                        },
                        'singular': 'compass',
                        'typeId': '1'
                    }
                }
            ],
            'totalCount': 2
        }
    }
}
snapshots['test_schema_success[update] 1'] = {
    'data': {
        'updateProductType': {
            'ok': True,
            'productType': {
                'fields': {
                    'edges': [
                        {
                            'node': {
                                'field': {
                                    'name': 'contact',
                                    'type_': 'UNICODE_TEXT'
                                },
                                'type_': {
                                    'name': 'compass'
                                }
                            }
                        },
                        {
                            'node': {
                                'field': {
                                    'name': 'field_four',
                                    'type_': 'UNICODE_TEXT'
                                },
                                'type_': {
                                    'name': 'compass'
                                }
                            }
                        },
                        {
                            'node': {
                                'field': {
                                    'name': 'field_five',
                                    'type_': 'UNICODE_TEXT'
                                },
                                'type_': {
                                    'name': 'compass'
                                }
                            }
                        }
                    ]
                },
                'icon': 'mdi-compass',
                'indefArticle': 'a',
                'name': 'compass',
                'order': 5,
                'plural': 'compasses',
                'products': {
                    'edges': [
                        {
                            'node': {
                                'name': 'map1'
                            }
                        },
                        {
                            'node': {
                                'name': 'map2'
                            }
                        },
                        {
                            'node': {
                                'name': 'map3'
                            }
                        }
                    ]
                },
                'singular': 'compass',
                'typeId': '1'
            }
        }
    }
}
snapshots['test_schema_success[update] 2'] = {
    'data': {
        'allProductTypes': {
            'edges': [
                {
                    'node': {
                        'fields': {
                            'edges': [
                                {
                                    'node': {
                                        'field': {
                                            'name': 'contact',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'produced_by',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'date_produced',
                                            'type_': 'DATE'
                                        },
                                        'type_': {
                                            'name': 'beam'
                                        }
                                    }
                                }
                            ]
                        },
                        'icon': 'mdi-spotlight-beam',
                        'indefArticle': 'a',
                        'name': 'beam',
                        'order': 1,
                        'plural': 'beams',
                        'products': {
                            'edges': [
                            ]
                        },
                        'singular': 'beam',
                        'typeId': '2'
                    }
                },
                {
                    'node': {
                        'fields': {
                            'edges': [
                                {
                                    'node': {
                                        'field': {
                                            'name': 'contact',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'compass'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'field_four',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'compass'
                                        }
                                    }
                                },
                                {
                                    'node': {
                                        'field': {
                                            'name': 'field_five',
                                            'type_': 'UNICODE_TEXT'
                                        },
                                        'type_': {
                                            'name': 'compass'
                                        }
                                    }
                                }
                            ]
                        },
                        'icon': 'mdi-compass',
                        'indefArticle': 'a',
                        'name': 'compass',
                        'order': 5,
                        'plural': 'compasses',
                        'products': {
                            'edges': [
                                {
                                    'node': {
                                        'name': 'map1'
                                    }
                                },
                                {
                                    'node': {
                                        'name': 'map2'
                                    }
                                },
                                {
                                    'node': {
                                        'name': 'map3'
                                    }
                                }
                            ]
                        },
                        'singular': 'compass',
                        'typeId': '1'
                    }
                }
            ],
            'totalCount': 2
        }
    }
}
439910d88e6f79eb85280101a360f0a9d8d2d654 | 89,758 | py | Python | sdk/lusid/api/portfolios_api.py | mneedham/lusid-sdk-python | edabec16b357ba3fc48a53f3faacb4f94b18843e | ["MIT"] | null | null | null | sdk/lusid/api/portfolios_api.py | mneedham/lusid-sdk-python | edabec16b357ba3fc48a53f3faacb4f94b18843e | ["MIT"] | null | null | null | sdk/lusid/api/portfolios_api.py | mneedham/lusid-sdk-python | edabec16b357ba3fc48a53f3faacb4f94b18843e | ["MIT"] | null | null | null |
# coding: utf-8
"""
LUSID API
FINBOURNE Technology # noqa: E501
The version of the OpenAPI document: 0.11.2808
Contact: info@finbourne.com
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from lusid.api_client import ApiClient
from lusid.exceptions import (
    ApiTypeError,
    ApiValueError
)
class PortfoliosApi(object):
    """NOTE: This class is auto generated by OpenAPI Generator
    Ref: https://openapi-generator.tech

    Do not edit the class manually.
    """

    def __init__(self, api_client=None):
        if api_client is None:
            api_client = ApiClient()
        self.api_client = api_client
    def delete_portfolio(self, scope, code, **kwargs):  # noqa: E501
        """Delete portfolio  # noqa: E501

        Delete a single portfolio. The deletion of the portfolio will be valid from the portfolio's creation datetime. This means that the portfolio will no longer exist at any effective datetime from the asAt datetime of deletion.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_portfolio(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio. (required)
        :param str code: The code of the portfolio. (required)
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: DeletedEntityResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.delete_portfolio_with_http_info(scope, code, **kwargs)  # noqa: E501
    def delete_portfolio_with_http_info(self, scope, code, **kwargs):  # noqa: E501
        """Delete portfolio  # noqa: E501

        Delete a single portfolio. The deletion of the portfolio will be valid from the portfolio's creation datetime. This means that the portfolio will no longer exist at any effective datetime from the asAt datetime of deletion.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_portfolio_with_http_info(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio. (required)
        :param str code: The code of the portfolio. (required)
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(DeletedEntityResponse, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['scope', 'code']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_portfolio" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']

        if ('scope' in local_var_params and
                len(local_var_params['scope']) > 64):
            raise ApiValueError("Invalid value for parameter `scope` when calling `delete_portfolio`, length must be less than or equal to `64`")  # noqa: E501
        if ('scope' in local_var_params and
                len(local_var_params['scope']) < 1):
            raise ApiValueError("Invalid value for parameter `scope` when calling `delete_portfolio`, length must be greater than or equal to `1`")  # noqa: E501
        if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `scope` when calling `delete_portfolio`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) > 64):
            raise ApiValueError("Invalid value for parameter `code` when calling `delete_portfolio`, length must be less than or equal to `64`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) < 1):
            raise ApiValueError("Invalid value for parameter `code` when calling `delete_portfolio`, length must be greater than or equal to `1`")  # noqa: E501
        if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `code` when calling `delete_portfolio`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'scope' in local_var_params:
            path_params['scope'] = local_var_params['scope']  # noqa: E501
        if 'code' in local_var_params:
            path_params['code'] = local_var_params['code']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/portfolios/{scope}/{code}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DeletedEntityResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
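
    # Usage sketch (editor's illustration, not generated code): assumes an
    # authenticated PortfoliosApi instance named `api`; the scope and code
    # values below are hypothetical.
    #
    #     data, status, headers = api.delete_portfolio_with_http_info(
    #         'Examples', 'UK-Equities')
    #     # `data` is a DeletedEntityResponse; `status` is the HTTP status code.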
    def delete_portfolio_properties(self, scope, code, property_keys, **kwargs):  # noqa: E501
        """Delete portfolio properties  # noqa: E501

        Delete one or more properties from a single portfolio. If the properties are time variant then an effective date time from which the properties will be deleted must be specified. If the properties are perpetual then it is invalid to specify an effective date time for deletion.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_portfolio_properties(scope, code, property_keys, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to delete properties from. (required)
        :param str code: The code of the portfolio to delete properties from. Together with the scope this uniquely identifies the portfolio. (required)
        :param list[str] property_keys: The property keys of the properties to delete. These take the format {domain}/{scope}/{code} e.g. \"Portfolio/Manager/Id\". Each property must be from the \"Portfolio\" domain. (required)
        :param str effective_at: The effective datetime or cut label at which to delete the properties.
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: DeletedEntityResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.delete_portfolio_properties_with_http_info(scope, code, property_keys, **kwargs)  # noqa: E501
    def delete_portfolio_properties_with_http_info(self, scope, code, property_keys, **kwargs):  # noqa: E501
        """Delete portfolio properties  # noqa: E501

        Delete one or more properties from a single portfolio. If the properties are time variant then an effective date time from which the properties will be deleted must be specified. If the properties are perpetual then it is invalid to specify an effective date time for deletion.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_portfolio_properties_with_http_info(scope, code, property_keys, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to delete properties from. (required)
        :param str code: The code of the portfolio to delete properties from. Together with the scope this uniquely identifies the portfolio. (required)
        :param list[str] property_keys: The property keys of the properties to delete. These take the format {domain}/{scope}/{code} e.g. \"Portfolio/Manager/Id\". Each property must be from the \"Portfolio\" domain. (required)
        :param str effective_at: The effective datetime or cut label at which to delete the properties.
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(DeletedEntityResponse, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['scope', 'code', 'property_keys', 'effective_at']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_portfolio_properties" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']

        # verify the required parameter 'property_keys' is set
        if ('property_keys' not in local_var_params or
                local_var_params['property_keys'] is None):
            raise ApiValueError("Missing the required parameter `property_keys` when calling `delete_portfolio_properties`")  # noqa: E501

        if ('scope' in local_var_params and
                len(local_var_params['scope']) > 64):
            raise ApiValueError("Invalid value for parameter `scope` when calling `delete_portfolio_properties`, length must be less than or equal to `64`")  # noqa: E501
        if ('scope' in local_var_params and
                len(local_var_params['scope']) < 1):
            raise ApiValueError("Invalid value for parameter `scope` when calling `delete_portfolio_properties`, length must be greater than or equal to `1`")  # noqa: E501
        if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `scope` when calling `delete_portfolio_properties`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) > 64):
            raise ApiValueError("Invalid value for parameter `code` when calling `delete_portfolio_properties`, length must be less than or equal to `64`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) < 1):
            raise ApiValueError("Invalid value for parameter `code` when calling `delete_portfolio_properties`, length must be greater than or equal to `1`")  # noqa: E501
        if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `code` when calling `delete_portfolio_properties`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'scope' in local_var_params:
            path_params['scope'] = local_var_params['scope']  # noqa: E501
        if 'code' in local_var_params:
            path_params['code'] = local_var_params['code']  # noqa: E501

        query_params = []
        if 'effective_at' in local_var_params:
            query_params.append(('effectiveAt', local_var_params['effective_at']))  # noqa: E501
        if 'property_keys' in local_var_params:
            query_params.append(('propertyKeys', local_var_params['property_keys']))  # noqa: E501
            collection_formats['propertyKeys'] = 'multi'  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/portfolios/{scope}/{code}/properties', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DeletedEntityResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
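
    # Usage sketch (editor's illustration): property keys take the
    # {domain}/{scope}/{code} form and are serialised as repeated
    # `propertyKeys` query parameters (collection format 'multi').
    # The identifiers below are hypothetical.
    #
    #     response = api.delete_portfolio_properties(
    #         'Examples', 'UK-Equities', ['Portfolio/Manager/Id'],
    #         effective_at='2020-01-01T00:00:00.0000000+00:00')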
    def get_portfolio(self, scope, code, **kwargs):  # noqa: E501
        """Get portfolio  # noqa: E501

        Retrieve the definition of a single portfolio.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_portfolio(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to retrieve the definition for. (required)
        :param str code: The code of the portfolio to retrieve the definition for. Together with the scope this uniquely identifies the portfolio. (required)
        :param str effective_at: The effective datetime or cut label at which to retrieve the portfolio definition. Defaults to the current LUSID system datetime if not specified.
        :param datetime as_at: The asAt datetime at which to retrieve the portfolio definition. Defaults to returning the latest version of the portfolio definition if not specified.
        :param list[str] property_keys: A list of property keys from the \"Portfolio\" domain to decorate onto the portfolio. These take the format {domain}/{scope}/{code} e.g. \"Portfolio/Manager/Id\".
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: Portfolio
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.get_portfolio_with_http_info(scope, code, **kwargs)  # noqa: E501
    def get_portfolio_with_http_info(self, scope, code, **kwargs):  # noqa: E501
        """Get portfolio  # noqa: E501

        Retrieve the definition of a single portfolio.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_portfolio_with_http_info(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to retrieve the definition for. (required)
        :param str code: The code of the portfolio to retrieve the definition for. Together with the scope this uniquely identifies the portfolio. (required)
        :param str effective_at: The effective datetime or cut label at which to retrieve the portfolio definition. Defaults to the current LUSID system datetime if not specified.
        :param datetime as_at: The asAt datetime at which to retrieve the portfolio definition. Defaults to returning the latest version of the portfolio definition if not specified.
        :param list[str] property_keys: A list of property keys from the \"Portfolio\" domain to decorate onto the portfolio. These take the format {domain}/{scope}/{code} e.g. \"Portfolio/Manager/Id\".
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(Portfolio, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['scope', 'code', 'effective_at', 'as_at', 'property_keys']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_portfolio" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']

        if ('scope' in local_var_params and
                len(local_var_params['scope']) > 64):
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio`, length must be less than or equal to `64`")  # noqa: E501
        if ('scope' in local_var_params and
                len(local_var_params['scope']) < 1):
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio`, length must be greater than or equal to `1`")  # noqa: E501
        if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) > 64):
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio`, length must be less than or equal to `64`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) < 1):
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio`, length must be greater than or equal to `1`")  # noqa: E501
        if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'scope' in local_var_params:
            path_params['scope'] = local_var_params['scope']  # noqa: E501
        if 'code' in local_var_params:
            path_params['code'] = local_var_params['code']  # noqa: E501

        query_params = []
        if 'effective_at' in local_var_params:
            query_params.append(('effectiveAt', local_var_params['effective_at']))  # noqa: E501
        if 'as_at' in local_var_params:
            query_params.append(('asAt', local_var_params['as_at']))  # noqa: E501
        if 'property_keys' in local_var_params:
            query_params.append(('propertyKeys', local_var_params['property_keys']))  # noqa: E501
            collection_formats['propertyKeys'] = 'multi'  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/portfolios/{scope}/{code}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Portfolio',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
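
    # Usage sketch (editor's illustration; identifiers are hypothetical):
    # retrieve a definition as of a given effective datetime, decorated with
    # a property from the "Portfolio" domain.
    #
    #     portfolio = api.get_portfolio(
    #         'Examples', 'UK-Equities',
    #         effective_at='2020-01-01T00:00:00.0000000+00:00',
    #         property_keys=['Portfolio/Manager/Id'])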
    def get_portfolio_commands(self, scope, code, **kwargs):  # noqa: E501
        """[EARLY ACCESS] Get portfolio commands  # noqa: E501

        Gets all the commands that modified a single portfolio, including any input transactions.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_portfolio_commands(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to retrieve the commands for. (required)
        :param str code: The code of the portfolio to retrieve the commands for. Together with the scope this uniquely identifies the portfolio. (required)
        :param datetime from_as_at: The lower bound asAt datetime (inclusive) from which to retrieve commands. There is no lower bound if this is not specified.
        :param datetime to_as_at: The upper bound asAt datetime (inclusive) up to which to retrieve commands. There is no upper bound if this is not specified.
        :param str filter: Expression to filter the result set. For example, to filter on the User ID, use \"userId.id eq 'string'\". Read more about filtering results from LUSID here: https://support.lusid.com/filtering-results-from-lusid.
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: ResourceListOfProcessedCommand
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.get_portfolio_commands_with_http_info(scope, code, **kwargs)  # noqa: E501
    def get_portfolio_commands_with_http_info(self, scope, code, **kwargs):  # noqa: E501
        """[EARLY ACCESS] Get portfolio commands  # noqa: E501

        Gets all the commands that modified a single portfolio, including any input transactions.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_portfolio_commands_with_http_info(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to retrieve the commands for. (required)
        :param str code: The code of the portfolio to retrieve the commands for. Together with the scope this uniquely identifies the portfolio. (required)
        :param datetime from_as_at: The lower bound asAt datetime (inclusive) from which to retrieve commands. There is no lower bound if this is not specified.
        :param datetime to_as_at: The upper bound asAt datetime (inclusive) up to which to retrieve commands. There is no upper bound if this is not specified.
        :param str filter: Expression to filter the result set. For example, to filter on the User ID, use \"userId.id eq 'string'\". Read more about filtering results from LUSID here: https://support.lusid.com/filtering-results-from-lusid.
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(ResourceListOfProcessedCommand, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['scope', 'code', 'from_as_at', 'to_as_at', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_portfolio_commands" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']

        if ('scope' in local_var_params and
                len(local_var_params['scope']) > 64):
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio_commands`, length must be less than or equal to `64`")  # noqa: E501
        if ('scope' in local_var_params and
                len(local_var_params['scope']) < 1):
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio_commands`, length must be greater than or equal to `1`")  # noqa: E501
        if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio_commands`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) > 64):
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio_commands`, length must be less than or equal to `64`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) < 1):
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio_commands`, length must be greater than or equal to `1`")  # noqa: E501
        if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio_commands`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'scope' in local_var_params:
            path_params['scope'] = local_var_params['scope']  # noqa: E501
        if 'code' in local_var_params:
            path_params['code'] = local_var_params['code']  # noqa: E501

        query_params = []
        if 'from_as_at' in local_var_params:
            query_params.append(('fromAsAt', local_var_params['from_as_at']))  # noqa: E501
        if 'to_as_at' in local_var_params:
            query_params.append(('toAsAt', local_var_params['to_as_at']))  # noqa: E501
        if 'filter' in local_var_params:
            query_params.append(('filter', local_var_params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/portfolios/{scope}/{code}/commands', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResourceListOfProcessedCommand',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
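
    # Usage sketch (editor's illustration; values are hypothetical): retrieve
    # the commands applied between two asAt bounds, filtered by user.
    #
    #     commands = api.get_portfolio_commands(
    #         'Examples', 'UK-Equities',
    #         filter="userId.id eq 'some-user'")
    #     # The ResourceListOfProcessedCommand holds the matching commands.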
    def get_portfolio_properties(self, scope, code, **kwargs):  # noqa: E501
        """Get portfolio properties  # noqa: E501

        List all the properties of a single portfolio.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_portfolio_properties(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to list the properties for. (required)
        :param str code: The code of the portfolio to list the properties for. Together with the scope this uniquely identifies the portfolio. (required)
        :param str effective_at: The effective datetime or cut label at which to list the portfolio's properties. Defaults to the current LUSID system datetime if not specified.
        :param datetime as_at: The asAt datetime at which to list the portfolio's properties. Defaults to returning the latest version of each property if not specified.
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: PortfolioProperties
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.get_portfolio_properties_with_http_info(scope, code, **kwargs)  # noqa: E501
    def get_portfolio_properties_with_http_info(self, scope, code, **kwargs):  # noqa: E501
        """Get portfolio properties  # noqa: E501

        List all the properties of a single portfolio.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_portfolio_properties_with_http_info(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to list the properties for. (required)
        :param str code: The code of the portfolio to list the properties for. Together with the scope this uniquely identifies the portfolio. (required)
        :param str effective_at: The effective datetime or cut label at which to list the portfolio's properties. Defaults to the current LUSID system datetime if not specified.
        :param datetime as_at: The asAt datetime at which to list the portfolio's properties. Defaults to returning the latest version of each property if not specified.
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(PortfolioProperties, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['scope', 'code', 'effective_at', 'as_at']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_portfolio_properties" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']

        if ('scope' in local_var_params and
                len(local_var_params['scope']) > 64):
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio_properties`, length must be less than or equal to `64`")  # noqa: E501
        if ('scope' in local_var_params and
                len(local_var_params['scope']) < 1):
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio_properties`, length must be greater than or equal to `1`")  # noqa: E501
        if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio_properties`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) > 64):
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio_properties`, length must be less than or equal to `64`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) < 1):
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio_properties`, length must be greater than or equal to `1`")  # noqa: E501
        if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio_properties`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'scope' in local_var_params:
            path_params['scope'] = local_var_params['scope']  # noqa: E501
        if 'code' in local_var_params:
            path_params['code'] = local_var_params['code']  # noqa: E501

        query_params = []
        if 'effective_at' in local_var_params:
            query_params.append(('effectiveAt', local_var_params['effective_at']))  # noqa: E501
        if 'as_at' in local_var_params:
            query_params.append(('asAt', local_var_params['as_at']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/portfolios/{scope}/{code}/properties', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='PortfolioProperties',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
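
    # Usage sketch (editor's illustration; identifiers are hypothetical):
    #
    #     props = api.get_portfolio_properties('Examples', 'UK-Equities')
    #     # The returned PortfolioProperties keys each property by its
    #     # {domain}/{scope}/{code} property key.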
    def get_portfolio_relations(self, scope, code, **kwargs):  # noqa: E501
        """[DEPRECATED] Get Relations for Portfolio  # noqa: E501

        Get relations for the specified Portfolio.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_portfolio_relations(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio. (required)
        :param str code: The code of the portfolio. Together with the scope this uniquely identifies the portfolio. (required)
        :param str effective_at: The effective datetime or cut label at which to retrieve relations. Defaults to the current LUSID system datetime if not specified.
        :param datetime as_at: The asAt datetime at which to retrieve relations. Defaults to returning the latest LUSID AsAt time if not specified.
        :param str filter: Expression to filter the relations. Users should provide null or empty string for this field until further notice.
        :param list[str] identifier_types: Identifier types (as property keys) used for referencing Persons or Legal Entities. These take the format {domain}/{scope}/{code} e.g. \"Person/CompanyDetails/Role\". They must be from the \"Person\" or \"LegalEntity\" domain. Only the identifier types stated will be used to look up relevant entities in relations. If not applicable, provide an empty array.
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: ResourceListOfRelation
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.get_portfolio_relations_with_http_info(scope, code, **kwargs)  # noqa: E501

    def get_portfolio_relations_with_http_info(self, scope, code, **kwargs):  # noqa: E501
        """[DEPRECATED] Get Relations for Portfolio  # noqa: E501

        Get relations for the specified Portfolio  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True

        >>> thread = api.get_portfolio_relations_with_http_info(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio. (required)
        :param str code: The code of the portfolio. Together with the scope this uniquely identifies the portfolio. (required)
        :param str effective_at: The effective datetime or cut label at which to retrieve relations. Defaults to the current LUSID system datetime if not specified.
        :param datetime as_at: The asAt datetime at which to retrieve relations. Defaults to return the latest LUSID AsAt time if not specified.
        :param str filter: Expression to filter the relations. Users should provide null or empty string for this field until further notice.
        :param list[str] identifier_types: Identifier types (as property keys) used for referencing Persons or Legal Entities. These take the format {domain}/{scope}/{code} e.g. \"Person/CompanyDetails/Role\". They must be from the \"Person\" or \"LegalEntity\" domain. Only identifier types stated will be used to look up relevant entities in relations. If not applicable, provide an empty array.
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(ResourceListOfRelation, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['scope', 'code', 'effective_at', 'as_at', 'filter', 'identifier_types']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_portfolio_relations" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']

        if ('scope' in local_var_params and
                len(local_var_params['scope']) > 64):
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio_relations`, length must be less than or equal to `64`")  # noqa: E501
        if ('scope' in local_var_params and
                len(local_var_params['scope']) < 1):
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio_relations`, length must be greater than or equal to `1`")  # noqa: E501
        if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_portfolio_relations`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) > 64):
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio_relations`, length must be less than or equal to `64`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) < 1):
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio_relations`, length must be greater than or equal to `1`")  # noqa: E501
        if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `code` when calling `get_portfolio_relations`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'scope' in local_var_params:
            path_params['scope'] = local_var_params['scope']  # noqa: E501
        if 'code' in local_var_params:
            path_params['code'] = local_var_params['code']  # noqa: E501

        query_params = []
        if 'effective_at' in local_var_params:
            query_params.append(('effectiveAt', local_var_params['effective_at']))  # noqa: E501
        if 'as_at' in local_var_params:
            query_params.append(('asAt', local_var_params['as_at']))  # noqa: E501
        if 'filter' in local_var_params:
            query_params.append(('filter', local_var_params['filter']))  # noqa: E501
        if 'identifier_types' in local_var_params:
            query_params.append(('identifierTypes', local_var_params['identifier_types']))  # noqa: E501
            collection_formats['identifierTypes'] = 'multi'  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/portfolios/{scope}/{code}/relations', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResourceListOfRelation',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
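
    # Editor's note, an illustrative usage sketch (not generated code): `api`
    # is assumed to be a configured instance of this API class, and the scope
    # and code values are placeholders.
    #
    #     # synchronous call returns the deserialized result directly
    #     relations = api.get_portfolio_relations('MyScope', 'MyPortfolio')
    #
    #     # asynchronous call returns a thread-like object; .get() blocks
    #     thread = api.get_portfolio_relations('MyScope', 'MyPortfolio',
    #                                          async_req=True)
    #     relations = thread.get()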

    def list_portfolios(self, **kwargs):  # noqa: E501
        """List portfolios  # noqa: E501

        List all the portfolios matching the specified criteria.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True

        >>> thread = api.list_portfolios(async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str effective_at: The effective datetime or cut label at which to list the portfolios. Defaults to the current LUSID system datetime if not specified.
        :param datetime as_at: The asAt datetime at which to list the portfolios. Defaults to return the latest version of each portfolio if not specified.
        :param str page: The pagination token to use to continue listing portfolios from a previous call to list portfolios. This value is returned from the previous call. If a pagination token is provided the filter, effectiveAt and asAt fields must not have changed since the original request. Also, if set, a start value cannot be provided.
        :param int start: When paginating, skip this number of results.
        :param int limit: When paginating, limit the number of returned results to this many. Defaults to 65,535 if not specified.
        :param str filter: Expression to filter the result set. For example, to filter on the Type, use \"type eq 'Transaction'\" Read more about filtering results from LUSID here https://support.lusid.com/filtering-results-from-lusid.
        :param str query: Expression specifying the criteria that the returned portfolios must meet e.g. to see which portfolios have holdings in the instruments with a Lusid Instrument Id (LUID) of 'LUID_PPA8HI6M' or a Figi of 'BBG000BLNNH6' you would specify \"instrument.identifiers in (('LusidInstrumentId', 'LUID_PPA8HI6M'), ('Figi', 'BBG000BLNNH6'))\".
        :param list[str] property_keys: A list of property keys from the \"Portfolio\" domain to decorate onto each portfolio. These take the format {domain}/{scope}/{code} e.g. \"Portfolio/Manager/Id\".
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: ResourceListOfPortfolio
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.list_portfolios_with_http_info(**kwargs)  # noqa: E501

    def list_portfolios_with_http_info(self, **kwargs):  # noqa: E501
        """List portfolios  # noqa: E501

        List all the portfolios matching the specified criteria.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True

        >>> thread = api.list_portfolios_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str effective_at: The effective datetime or cut label at which to list the portfolios. Defaults to the current LUSID system datetime if not specified.
        :param datetime as_at: The asAt datetime at which to list the portfolios. Defaults to return the latest version of each portfolio if not specified.
        :param str page: The pagination token to use to continue listing portfolios from a previous call to list portfolios. This value is returned from the previous call. If a pagination token is provided the filter, effectiveAt and asAt fields must not have changed since the original request. Also, if set, a start value cannot be provided.
        :param int start: When paginating, skip this number of results.
        :param int limit: When paginating, limit the number of returned results to this many. Defaults to 65,535 if not specified.
        :param str filter: Expression to filter the result set. For example, to filter on the Type, use \"type eq 'Transaction'\" Read more about filtering results from LUSID here https://support.lusid.com/filtering-results-from-lusid.
        :param str query: Expression specifying the criteria that the returned portfolios must meet e.g. to see which portfolios have holdings in the instruments with a Lusid Instrument Id (LUID) of 'LUID_PPA8HI6M' or a Figi of 'BBG000BLNNH6' you would specify \"instrument.identifiers in (('LusidInstrumentId', 'LUID_PPA8HI6M'), ('Figi', 'BBG000BLNNH6'))\".
        :param list[str] property_keys: A list of property keys from the \"Portfolio\" domain to decorate onto each portfolio. These take the format {domain}/{scope}/{code} e.g. \"Portfolio/Manager/Id\".
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(ResourceListOfPortfolio, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['effective_at', 'as_at', 'page', 'start', 'limit', 'filter', 'query', 'property_keys']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method list_portfolios" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']

        if 'limit' in local_var_params and local_var_params['limit'] > 5000:  # noqa: E501
            raise ApiValueError("Invalid value for parameter `limit` when calling `list_portfolios`, must be a value less than or equal to `5000`")  # noqa: E501
        if 'limit' in local_var_params and local_var_params['limit'] < 1:  # noqa: E501
            raise ApiValueError("Invalid value for parameter `limit` when calling `list_portfolios`, must be a value greater than or equal to `1`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'effective_at' in local_var_params:
            query_params.append(('effectiveAt', local_var_params['effective_at']))  # noqa: E501
        if 'as_at' in local_var_params:
            query_params.append(('asAt', local_var_params['as_at']))  # noqa: E501
        if 'page' in local_var_params:
            query_params.append(('page', local_var_params['page']))  # noqa: E501
        if 'start' in local_var_params:
            query_params.append(('start', local_var_params['start']))  # noqa: E501
        if 'limit' in local_var_params:
            query_params.append(('limit', local_var_params['limit']))  # noqa: E501
        if 'filter' in local_var_params:
            query_params.append(('filter', local_var_params['filter']))  # noqa: E501
        if 'query' in local_var_params:
            query_params.append(('query', local_var_params['query']))  # noqa: E501
        if 'property_keys' in local_var_params:
            query_params.append(('propertyKeys', local_var_params['property_keys']))  # noqa: E501
            collection_formats['propertyKeys'] = 'multi'  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/portfolios', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResourceListOfPortfolio',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
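
    # Editor's note, an illustrative pagination sketch (not generated code):
    # `api` is assumed to be a configured instance of this API class, and the
    # assumption that the response exposes the returned portfolios as `values`
    # and a continuation token as `next_page` should be checked against the
    # ResourceListOfPortfolio model before use.
    #
    #     page_token = None
    #     while True:
    #         result = api.list_portfolios(limit=500, page=page_token)
    #         for portfolio in result.values:
    #             pass  # process each portfolio here
    #         page_token = result.next_page
    #         if page_token is None:
    #             break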

    def list_portfolios_for_scope(self, scope, **kwargs):  # noqa: E501
        """List portfolios for scope  # noqa: E501

        List all the portfolios in a single scope.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True

        >>> thread = api.list_portfolios_for_scope(scope, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolios. (required)
        :param str effective_at: The effective datetime or cut label at which to list the portfolios. Defaults to the current LUSID system datetime if not specified.
        :param datetime as_at: The asAt datetime at which to list the portfolios. Defaults to return the latest version of each portfolio if not specified.
        :param str page: The pagination token to use to continue listing portfolios from a previous call to list portfolios. This value is returned from the previous call. If a pagination token is provided the filter, effectiveAt and asAt fields must not have changed since the original request. Also, if set, a start value cannot be provided.
        :param int start: When paginating, skip this number of results.
        :param int limit: When paginating, limit the number of returned results to this many. Defaults to 65,535 if not specified.
        :param str filter: Expression to filter the result set. For example, to filter on the Type, use \"type eq 'Transaction'\" Read more about filtering results from LUSID here https://support.lusid.com/filtering-results-from-lusid.
        :param list[str] property_keys: A list of property keys from the \"Portfolio\" domain to decorate onto each portfolio. These take the format {domain}/{scope}/{code} e.g. \"Portfolio/Manager/Id\".
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: ResourceListOfPortfolio
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.list_portfolios_for_scope_with_http_info(scope, **kwargs)  # noqa: E501

    def list_portfolios_for_scope_with_http_info(self, scope, **kwargs):  # noqa: E501
        """List portfolios for scope  # noqa: E501

        List all the portfolios in a single scope.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True

        >>> thread = api.list_portfolios_for_scope_with_http_info(scope, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolios. (required)
        :param str effective_at: The effective datetime or cut label at which to list the portfolios. Defaults to the current LUSID system datetime if not specified.
        :param datetime as_at: The asAt datetime at which to list the portfolios. Defaults to return the latest version of each portfolio if not specified.
        :param str page: The pagination token to use to continue listing portfolios from a previous call to list portfolios. This value is returned from the previous call. If a pagination token is provided the filter, effectiveAt and asAt fields must not have changed since the original request. Also, if set, a start value cannot be provided.
        :param int start: When paginating, skip this number of results.
        :param int limit: When paginating, limit the number of returned results to this many. Defaults to 65,535 if not specified.
        :param str filter: Expression to filter the result set. For example, to filter on the Type, use \"type eq 'Transaction'\" Read more about filtering results from LUSID here https://support.lusid.com/filtering-results-from-lusid.
        :param list[str] property_keys: A list of property keys from the \"Portfolio\" domain to decorate onto each portfolio. These take the format {domain}/{scope}/{code} e.g. \"Portfolio/Manager/Id\".
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(ResourceListOfPortfolio, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['scope', 'effective_at', 'as_at', 'page', 'start', 'limit', 'filter', 'property_keys']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method list_portfolios_for_scope" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']

        if ('scope' in local_var_params and
                len(local_var_params['scope']) > 64):
            raise ApiValueError("Invalid value for parameter `scope` when calling `list_portfolios_for_scope`, length must be less than or equal to `64`")  # noqa: E501
        if ('scope' in local_var_params and
                len(local_var_params['scope']) < 1):
            raise ApiValueError("Invalid value for parameter `scope` when calling `list_portfolios_for_scope`, length must be greater than or equal to `1`")  # noqa: E501
        if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `scope` when calling `list_portfolios_for_scope`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501
        if 'limit' in local_var_params and local_var_params['limit'] > 5000:  # noqa: E501
            raise ApiValueError("Invalid value for parameter `limit` when calling `list_portfolios_for_scope`, must be a value less than or equal to `5000`")  # noqa: E501
        if 'limit' in local_var_params and local_var_params['limit'] < 1:  # noqa: E501
            raise ApiValueError("Invalid value for parameter `limit` when calling `list_portfolios_for_scope`, must be a value greater than or equal to `1`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'scope' in local_var_params:
            path_params['scope'] = local_var_params['scope']  # noqa: E501

        query_params = []
        if 'effective_at' in local_var_params:
            query_params.append(('effectiveAt', local_var_params['effective_at']))  # noqa: E501
        if 'as_at' in local_var_params:
            query_params.append(('asAt', local_var_params['as_at']))  # noqa: E501
        if 'page' in local_var_params:
            query_params.append(('page', local_var_params['page']))  # noqa: E501
        if 'start' in local_var_params:
            query_params.append(('start', local_var_params['start']))  # noqa: E501
        if 'limit' in local_var_params:
            query_params.append(('limit', local_var_params['limit']))  # noqa: E501
        if 'filter' in local_var_params:
            query_params.append(('filter', local_var_params['filter']))  # noqa: E501
        if 'property_keys' in local_var_params:
            query_params.append(('propertyKeys', local_var_params['property_keys']))  # noqa: E501
            collection_formats['propertyKeys'] = 'multi'  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/portfolios/{scope}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResourceListOfPortfolio',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def update_portfolio(self, scope, code, update_portfolio_request, **kwargs):  # noqa: E501
        """Update portfolio  # noqa: E501

        Update the definition of a single portfolio. Not all elements within a portfolio definition are modifiable due to the potential implications for data already stored against the portfolio.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True

        >>> thread = api.update_portfolio(scope, code, update_portfolio_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to update the definition for. (required)
        :param str code: The code of the portfolio to update the definition for. Together with the scope this uniquely identifies the portfolio. (required)
        :param UpdatePortfolioRequest update_portfolio_request: The updated portfolio definition. (required)
        :param str effective_at: The effective datetime or cut label at which to update the definition. Defaults to the current LUSID system datetime if not specified.
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: Portfolio
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.update_portfolio_with_http_info(scope, code, update_portfolio_request, **kwargs)  # noqa: E501

    def update_portfolio_with_http_info(self, scope, code, update_portfolio_request, **kwargs):  # noqa: E501
        """Update portfolio  # noqa: E501

        Update the definition of a single portfolio. Not all elements within a portfolio definition are modifiable due to the potential implications for data already stored against the portfolio.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True

        >>> thread = api.update_portfolio_with_http_info(scope, code, update_portfolio_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to update the definition for. (required)
        :param str code: The code of the portfolio to update the definition for. Together with the scope this uniquely identifies the portfolio. (required)
        :param UpdatePortfolioRequest update_portfolio_request: The updated portfolio definition. (required)
        :param str effective_at: The effective datetime or cut label at which to update the definition. Defaults to the current LUSID system datetime if not specified.
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(Portfolio, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['scope', 'code', 'update_portfolio_request', 'effective_at']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_portfolio" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'update_portfolio_request' is set
        if ('update_portfolio_request' not in local_var_params or
                local_var_params['update_portfolio_request'] is None):
            raise ApiValueError("Missing the required parameter `update_portfolio_request` when calling `update_portfolio`")  # noqa: E501

        if ('scope' in local_var_params and
                len(local_var_params['scope']) > 64):
            raise ApiValueError("Invalid value for parameter `scope` when calling `update_portfolio`, length must be less than or equal to `64`")  # noqa: E501
        if ('scope' in local_var_params and
                len(local_var_params['scope']) < 1):
            raise ApiValueError("Invalid value for parameter `scope` when calling `update_portfolio`, length must be greater than or equal to `1`")  # noqa: E501
        if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `scope` when calling `update_portfolio`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) > 64):
            raise ApiValueError("Invalid value for parameter `code` when calling `update_portfolio`, length must be less than or equal to `64`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) < 1):
            raise ApiValueError("Invalid value for parameter `code` when calling `update_portfolio`, length must be greater than or equal to `1`")  # noqa: E501
        if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `code` when calling `update_portfolio`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'scope' in local_var_params:
            path_params['scope'] = local_var_params['scope']  # noqa: E501
        if 'code' in local_var_params:
            path_params['code'] = local_var_params['code']  # noqa: E501

        query_params = []
        if 'effective_at' in local_var_params:
            query_params.append(('effectiveAt', local_var_params['effective_at']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'update_portfolio_request' in local_var_params:
            body_params = local_var_params['update_portfolio_request']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json-patch+json', 'application/json', 'text/json', 'application/*+json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/portfolios/{scope}/{code}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Portfolio',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
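
    # Editor's note, an illustrative usage sketch (not generated code): `api`
    # is assumed to be a configured instance of this API class, and the
    # UpdatePortfolioRequest field shown is an assumption to be checked
    # against the model definition.
    #
    #     request = lusid.UpdatePortfolioRequest(display_name='Renamed Portfolio')
    #     portfolio = api.update_portfolio('MyScope', 'MyPortfolio', request)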

    def upsert_portfolio_properties(self, scope, code, request_body, **kwargs):  # noqa: E501
        """Upsert portfolio properties  # noqa: E501

        Update or insert one or more properties onto a single portfolio. A property will be updated if it already exists and inserted if it does not. All properties must be of the domain 'Portfolio'. Properties have an <i>effectiveFrom</i> datetime for which the property is valid, and an <i>effectiveUntil</i> datetime until which the property is valid. Not supplying an <i>effectiveUntil</i> datetime results in the property being valid indefinitely, or until the next <i>effectiveFrom</i> datetime of the property.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True

        >>> thread = api.upsert_portfolio_properties(scope, code, request_body, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the portfolio to update or insert the properties onto. (required)
        :param str code: The code of the portfolio to update or insert the properties onto. Together with the scope this uniquely identifies the portfolio. (required)
        :param dict(str, ModelProperty) request_body: The properties to be updated or inserted onto the portfolio. Each property in the request must be keyed by its unique property key. This has the format {domain}/{scope}/{code} e.g. \"Portfolio/Manager/Id\". (required)
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: PortfolioProperties
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.upsert_portfolio_properties_with_http_info(scope, code, request_body, **kwargs)  # noqa: E501
def upsert_portfolio_properties_with_http_info(self, scope, code, request_body, **kwargs): # noqa: E501
"""Upsert portfolio properties # noqa: E501
Update or insert one or more properties onto a single portfolio. A property will be updated if it already exists and inserted if it does not. All properties must be of the domain 'Portfolio'. Properties have an <i>effectiveFrom</i> datetime from which the property is valid, and an <i>effectiveUntil</i> datetime until which the property is valid. Not supplying an <i>effectiveUntil</i> datetime results in the property being valid indefinitely, or until the next <i>effectiveFrom</i> datetime of the property. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.upsert_portfolio_properties_with_http_info(scope, code, request_body, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str scope: The scope of the portfolio to update or insert the properties onto. (required)
:param str code: The code of the portfolio to update or insert the properties onto. Together with the scope this uniquely identifies the portfolio. (required)
:param dict(str, ModelProperty) request_body: The properties to be updated or inserted onto the portfolio. Each property in the request must be keyed by its unique property key. This has the format {domain}/{scope}/{code} e.g. \"Portfolio/Manager/Id\". (required)
:param _return_http_data_only: return response data without the HTTP
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(PortfolioProperties, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['scope', 'code', 'request_body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method upsert_portfolio_properties" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'scope' is set
if ('scope' not in local_var_params or
local_var_params['scope'] is None):
raise ApiValueError("Missing the required parameter `scope` when calling `upsert_portfolio_properties`")  # noqa: E501
# verify the required parameter 'code' is set
if ('code' not in local_var_params or
local_var_params['code'] is None):
raise ApiValueError("Missing the required parameter `code` when calling `upsert_portfolio_properties`")  # noqa: E501
# verify the required parameter 'request_body' is set
if ('request_body' not in local_var_params or
local_var_params['request_body'] is None):
raise ApiValueError("Missing the required parameter `request_body` when calling `upsert_portfolio_properties`")  # noqa: E501
if ('scope' in local_var_params and
len(local_var_params['scope']) > 64):
raise ApiValueError("Invalid value for parameter `scope` when calling `upsert_portfolio_properties`, length must be less than or equal to `64`") # noqa: E501
if ('scope' in local_var_params and
len(local_var_params['scope']) < 1):
raise ApiValueError("Invalid value for parameter `scope` when calling `upsert_portfolio_properties`, length must be greater than or equal to `1`") # noqa: E501
if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']): # noqa: E501
raise ApiValueError("Invalid value for parameter `scope` when calling `upsert_portfolio_properties`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`") # noqa: E501
if ('code' in local_var_params and
len(local_var_params['code']) > 64):
raise ApiValueError("Invalid value for parameter `code` when calling `upsert_portfolio_properties`, length must be less than or equal to `64`") # noqa: E501
if ('code' in local_var_params and
len(local_var_params['code']) < 1):
raise ApiValueError("Invalid value for parameter `code` when calling `upsert_portfolio_properties`, length must be greater than or equal to `1`") # noqa: E501
if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']): # noqa: E501
raise ApiValueError("Invalid value for parameter `code` when calling `upsert_portfolio_properties`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'scope' in local_var_params:
path_params['scope'] = local_var_params['scope'] # noqa: E501
if 'code' in local_var_params:
path_params['code'] = local_var_params['code'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'request_body' in local_var_params:
body_params = local_var_params['request_body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['text/plain', 'application/json', 'text/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json-patch+json', 'application/json', 'text/json', 'application/*+json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
# set the LUSID header
header_params['X-LUSID-SDK-Language'] = 'Python'
header_params['X-LUSID-SDK-Version'] = '0.11.2808'
return self.api_client.call_api(
'/api/portfolios/{scope}/{code}/properties', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PortfolioProperties', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
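# The validation pattern used above (required-parameter check, then length
# and character-set constraints on `scope` and `code`) can be sketched in
# isolation. `validate_identifier` below is a hypothetical helper written
# for illustration only; it is not part of the generated SDK.

```python
import re

# Constraints mirrored from the generated checks above: identifiers must
# be 1-64 characters drawn from letters, digits, hyphen and underscore.
IDENTIFIER_PATTERN = re.compile(r'^[a-zA-Z0-9\-_]+$')


def validate_identifier(name, value):
    """Raise ValueError if `value` violates the scope/code constraints."""
    if value is None:
        raise ValueError("Missing the required parameter `%s`" % name)
    if len(value) > 64:
        raise ValueError("`%s` length must be less than or equal to 64" % name)
    if len(value) < 1:
        raise ValueError("`%s` length must be greater than or equal to 1" % name)
    if not IDENTIFIER_PATTERN.search(value):
        raise ValueError("`%s` must match ^[a-zA-Z0-9\\-_]+$" % name)
    return value


validate_identifier('scope', 'Finbourne-Examples')  # passes
try:
    validate_identifier('code', 'bad code!')  # space and '!' are rejected
except ValueError as exc:
    print(exc)
```

# A failed check surfaces as an exception before any HTTP request is made,
# which is why malformed scopes or codes never reach the API.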