hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | 
qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
a32dd863098ced1fbbf1c0211af34809b2e487b7 | 187 | py | Python | spira/lne/__init__.py | cloudcalvin/spira | 2dcaef188f2bc8c3839e1b5ff0be027e0cd4908c | [
"MIT"
] | null | null | null | spira/lne/__init__.py | cloudcalvin/spira | 2dcaef188f2bc8c3839e1b5ff0be027e0cd4908c | [
"MIT"
] | 1 | 2021-10-17T10:18:04.000Z | 2021-10-17T10:18:04.000Z | spira/lne/__init__.py | cloudcalvin/spira | 2dcaef188f2bc8c3839e1b5ff0be027e0cd4908c | [
"MIT"
] | null | null | null | from spira.lne.geometry import Geometry
from spira.lne.graph import Graph
from spira.lne.mesh import Mesh
from spira.lne.graph import GraphAbstract
from spira.lne.mesh import MeshAbstract | 37.4 | 41 | 0.84492 | 30 | 187 | 5.266667 | 0.3 | 0.28481 | 0.379747 | 0.21519 | 0.56962 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.101604 | 187 | 5 | 42 | 37.4 | 0.940476 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a339030e45624748168c7a0898d767c1ea041110 | 19,456 | py | Python | tests/dhcpv4/process/test_v4_release.py | isc-projects/forge | dfec8b41003d6b5a229f69ee93616e0e5cc6d71b | [
"0BSD"
] | 22 | 2015-02-27T11:51:05.000Z | 2022-02-28T12:39:29.000Z | tests/dhcpv4/process/test_v4_release.py | isc-projects/forge | dfec8b41003d6b5a229f69ee93616e0e5cc6d71b | [
"0BSD"
] | 16 | 2018-10-30T15:00:12.000Z | 2019-01-11T17:55:13.000Z | tests/dhcpv4/process/test_v4_release.py | isc-projects/forge | dfec8b41003d6b5a229f69ee93616e0e5cc6d71b | [
"0BSD"
] | 11 | 2015-02-27T11:51:36.000Z | 2021-03-30T08:33:54.000Z | """DHCPv4 address release process"""
# pylint: disable=invalid-name,line-too-long
import pytest
import srv_control
import misc
import srv_msg
@pytest.mark.v4
@pytest.mark.release
@pytest.mark.parametrize("backend", ['memfile', 'mysql', 'postgresql'])
def test_v4_release_success(backend):
misc.test_setup()
srv_control.define_temporary_lease_db_backend(backend)
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:11:11:22')
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:11:11:22')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
my_lease = srv_msg.get_all_leases()
srv_msg.check_leases(my_lease, backend=backend)
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
@pytest.mark.v4
@pytest.mark.release
def test_v4_release_success_with_additional_offer():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
misc.test_procedure()
srv_msg.client_save_option('server_id')
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:00')
srv_msg.client_does_include_with_value('client_id', '00010203040506')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_add_saved_option(erase=True)
srv_msg.client_sets_value('Client', 'chaddr', 'default')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
@pytest.mark.v4
@pytest.mark.release
def test_v4_release_fail_with_different_chaddr_client_id():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1F:D0:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00001FD0040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_include_option(61)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
srv_msg.response_check_option_content(61, 'value', '00001FD0040111')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1f:d0:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00001FD0040111')
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_include_option(61)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
srv_msg.response_check_option_content(61, 'value', '00001FD0040111')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1f:d0:11:22:33')
srv_msg.client_does_include_with_value('client_id', '00001FD0112233')
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
# address not released
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:00')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
@pytest.mark.v4
@pytest.mark.release
def test_v4_release_fail_with_same_chaddr_different_client_id():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1F:D0:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00001FD0040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_include_option(61)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
srv_msg.response_check_option_content(61, 'value', '00001FD0040111')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1f:d0:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00001FD0040111')
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_include_option(61)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
srv_msg.response_check_option_content(61, 'value', '00001FD0040111')
misc.test_procedure()
# client id changed!
srv_msg.client_sets_value('Client', 'chaddr', '00:1f:d0:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00001FD0112233')
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
# address not released
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:00')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
@pytest.mark.v4
@pytest.mark.release
def test_v4_release_fail_with_different_chaddr_same_client_id():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1F:D0:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00001FD0040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_include_option(61)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
srv_msg.response_check_option_content(61, 'value', '00001FD0040111')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1f:d0:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00001FD0040111')
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_include_option(61)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
srv_msg.response_check_option_content(61, 'value', '00001FD0040111')
misc.test_procedure()
# chaddr changed!
srv_msg.client_sets_value('Client', 'chaddr', '00:1f:d0:11:11:11')
srv_msg.client_does_include_with_value('client_id', '00001FD0040111')
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
# address not released
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:00')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_include_option(61)
@pytest.mark.v4
@pytest.mark.release
def test_v4_release_only_chaddr_same_chaddr():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1F:D0:00:00:11')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1f:d0:00:00:11')
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
misc.test_procedure()
# client id changed!
srv_msg.client_sets_value('Client', 'chaddr', '00:1f:d0:00:00:11')
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
# address not released
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:00')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
@pytest.mark.v4
@pytest.mark.release
def test_v4_release_fail_only_chaddr_different_chaddr():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1F:D0:00:00:11')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:1f:d0:00:00:11')
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_include_option(1)
srv_msg.response_check_include_option(54)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.response_check_option_content(54, 'value', '$(SRV4_ADDR)')
misc.test_procedure()
# chaddr changed!
srv_msg.client_sets_value('Client', 'chaddr', '00:1f:d0:11:11:11')
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
# address not released
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:00')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
@pytest.mark.v4
@pytest.mark.release
def test_v4_release_leases_expired():
misc.test_setup()
srv_control.set_time('renew-timer', 1)
srv_control.set_time('rebind-timer', 2)
srv_control.set_time('valid-lifetime', 3)
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
srv_msg.forge_sleep(4, 'seconds')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
| 38.67992 | 81 | 0.743575 | 3,016 | 19,456 | 4.380637 | 0.041114 | 0.12307 | 0.118983 | 0.149561 | 0.961853 | 0.956101 | 0.953906 | 0.951408 | 0.951408 | 0.951408 | 0 | 0.087491 | 0.118215 | 19,456 | 502 | 82 | 38.756972 | 0.682618 | 0.012798 | 0 | 0.937343 | 0 | 0 | 0.189506 | 0.010421 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02005 | false | 0.080201 | 0.010025 | 0 | 0.030075 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
a3483dbc59d0540caaf3e390f3baaa4713ef81cb | 35,200 | py | Python | htc-api/api/htc_api.py | synthetichealth/syntheticmass | 1ec6ca5ade493c957061b32a49ee345f7ab8fd46 | [
"Apache-2.0"
] | 34 | 2016-09-30T17:17:58.000Z | 2022-02-04T15:53:09.000Z | htc-api/api/htc_api.py | Sulaco-Chris/syntheticmass | 1ec6ca5ade493c957061b32a49ee345f7ab8fd46 | [
"Apache-2.0"
] | 23 | 2016-07-21T17:51:07.000Z | 2018-11-29T15:09:06.000Z | htc-api/api/htc_api.py | Sulaco-Chris/syntheticmass | 1ec6ca5ade493c957061b32a49ee345f7ab8fd46 | [
"Apache-2.0"
] | 18 | 2016-09-30T17:44:05.000Z | 2021-12-23T07:55:11.000Z | #!flask/bin/python
from flask import Flask, jsonify, request, abort, g, send_from_directory
from flask_cors import CORS
from flask.ext.autodoc import Autodoc
from flask_cache import Cache
import logging
import re
#need simplejson to deal with Postgres Decimal types
import simplejson as json
#NOTE: may need to run on Linux: "ln -s /usr/local/pgsql/lib/libpq.so.5 /usr/lib64/libpq.so.5"
import psycopg2 as pg
import psycopg2.pool as pgp
import sys
import time
import postgis2geojson as p2g
from psycopg2.extras import RealDictCursor
app = Flask(__name__)
# define the cache config, register the cache instance, and bind it to the app
cache = Cache(app,config={'CACHE_TYPE': 'simple'})
CORS(app)
auto = Autodoc(app)
global pool
global log
#Postgres connection management
def setup_pool():
    global pool
    with open('htc_login.txt') as f:
        #each of these is expected to appear on a separate line
        host = f.readline().rstrip()
        port = f.readline().rstrip()
        db = f.readline().rstrip()
        user = f.readline().rstrip()
        pw = f.readline().rstrip()
    pool = pgp.ThreadedConnectionPool(20, 100, host=host, port=port, database=db, user=user, password=pw)
#get current db connection if holding one, otherwise get a new one from the pool
def get_db_con():
    global pool
    max_attempts = 10
    con = getattr(g, '_database', None)
    if con is None:
        #Need to get a connection, use a try loop to handle pool depletions a bit better
        #Otherwise psycopg2.pool throws exception and server returns 500 error to client
        for attempt in range(1, max_attempts):
            try:
                con = g._database = pool.getconn()
                if (attempt > 1):
                    log.debug("connection newly acquired from pool, attempt=%s" % attempt)
                return con
            except:
                #On any errors, add exponentially increasing time delays.
                #This seems to handle at least 30X the pool size in requests without hard errors.
                e = sys.exc_info()[0]
                log.error("exception during connection attempt=%s: %s" % (attempt, e))
                if (attempt == max_attempts):
                    #give up!
                    raise
                time.sleep(attempt**2)
    else:
        log.debug("connection reused from session variable.")
    con.autocommit = True
    return con
#Automatically return db connections
@app.teardown_appcontext
def return_db_con(exception):
    global pool
    con = getattr(g, '_database', None)
    if con is not None:
        pool.putconn(con)
        #log.debug("connection returned to pool.")
#format simple data for return as JSON
def getData(conn, query, params=None):
    "Use this for non-geometry SELECTs, produces plain json based on DB field names"
    with conn.cursor(cursor_factory=RealDictCursor) as cur:
        if (params):
            cur.execute(query, params)
        else:
            cur.execute(query)
        return json.dumps(cur.fetchall(), indent=2)
#removes CR LF characters from string, for safer logging
def sanitize(s):
    return re.sub("[\r\n]+", " ", s)
#Get IP of client making the call, TO BE USED FOR DEBUGGING PURPOSES ONLY!
#Should handle running behind proxy, but could be subject to spoofing
#Reference: http://esd.io/blog/flask-apps-heroku-real-ip-spoofing.html
def get_ip():
    if not request.headers.getlist("X-Forwarded-For"):
        ip = request.remote_addr
    else:
        ip = request.headers.getlist("X-Forwarded-For")[0]
    #be sure to remove any CRLF characters, to limit log entry spoofing
    return sanitize(ip)
#
#API calls
#
#Documentation Index
@app.route('/htc/api/v1')
@cache.cached(timeout=300) # cache this view for 5 minutes
def documentation():
return auto.html(title='MA High Tech Counsel API Documentation')
#All Counties
#
#Request geojson of all counties
@app.route('/htc/api/v1/counties', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_counties_all():
"""Counties in GeoJSON"""
log.debug("entering get_counties_all() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, s.pop_male / s.pop as pct_male, s.pop_female / s.pop as pct_female, s.pop_sm, " \
"chr.hs_graduate as chr_hs_grad, chr.college as chr_college, chr.unemployed as chr_unemployed, chr.diabetes_rate as chr_diabetes, " \
"chr.adult_obesity as chr_adult_obesity, chr.adult_smoking as chr_adult_smoking, opioid.deaths as opioid_deaths, " \
"age.fact_pop_0_4 as pop_0_4,age.fact_pop_5_9 as pop_5_9,age.fact_pop_10_14 as pop_10_14,age.fact_pop_15_19 as pop_15_19, "\
"age.fact_pop_20_24 as pop_20_24,age.fact_pop_25_29 as pop_25_29,age.fact_pop_30_34 as pop_30_34,age.fact_pop_35_39 as pop_35_39, " \
"age.fact_pop_40_44 as pop_40_44,age.fact_pop_45_49 as pop_45_49,age.fact_pop_50_54 as pop_50_54,age.fact_pop_55_59 as pop_55_59, " \
"age.fact_pop_60_64 as pop_60_64,age.fact_pop_65_69 as pop_65_69,age.fact_pop_70_74 as pop_70_74,age.fact_pop_75_79 as pop_75_79, " \
"age.fact_pop_80_84 as pop_80_84,age.fact_pop_85_110 as pop_85_110, " \
"ST_AsGeoJSON(the_geom) AS geometry " \
"FROM synth_ma.county_stats s " \
"JOIN synth_ma.ma_opioid_county opioid ON opioid.countyfp = s.ct_fips AND opioid.year = '2015' " \
"JOIN tiger_cb14_500k.county g ON g.statefp = '25' AND g.countyfp = s.ct_fips " \
"JOIN synth_ma.ma_county_age age ON age.ct_fips = s.ct_fips " \
"JOIN county_health.chr ON chr.statefp = '25' AND chr.release_year = 2016 AND chr.countyfp = s.ct_fips"
data = p2g.getData(con, sql)
log.debug("leaving get_counties_all()")
return data
#Request geojson of all counties (synthetic data)
@app.route('/htc/api/v1/synth/counties', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_synth_counties_all():
"""Counties in GeoJSON synthetic"""
log.debug("entering get_synth_counties_all() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, CASE WHEN s.pop > 0 THEN s.pop_male / s.pop ELSE 0 END AS pct_male, CASE WHEN s.pop > 0 THEN s.pop_female / s.pop ELSE 0 END AS pct_female, s.pop_sm, " \
"ST_AsGeoJSON(s.ct_poly) AS geometry, " \
"dd.rate as pct_diabetes, dhd.rate as pct_heart_disease, doa.rate as pct_opioid_addiction " \
"FROM synth_ma.synth_county_pop_stats s " \
"JOIN synth_ma.synth_county_disease_stats dd ON dd.ct_fips = s.ct_fips AND dd.disease_name = 'diabetes' " \
"JOIN synth_ma.synth_county_disease_stats dhd ON dhd.ct_fips = s.ct_fips AND dhd.disease_name = 'heart_disease' " \
"JOIN synth_ma.synth_county_disease_stats doa ON doa.ct_fips = s.ct_fips AND doa.disease_name = 'opioid_addiction' "
data = p2g.getData(con, sql)
log.debug("leaving get_synth_counties_all()")
return data
#Request list of all counties
@app.route('/htc/api/v1/counties/list', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_counties():
"""Counties list in JSON"""
log.debug("entering get_counties() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT ct_name, ct_fips " \
"FROM synth_ma.county_stats"
data = getData(con, sql)
log.debug("leaving get_counties()")
return data
#Request list of all counties (synthetic)
@app.route('/htc/api/v1/synth/counties/list', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_synth_counties():
"""Counties list in JSON synthetic"""
log.debug("entering get_synth_counties() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT ct_name, ct_fips " \
"FROM synth_ma.synth_county_pop_stats"
data = getData(con, sql)
log.debug("leaving get_synth_counties()")
return data
#Request list of disease names that we have statistics for (synthetic)
@app.route('/htc/api/v1/synth/diseases/list', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300)
def get_synth_diseases():
"""Disease list in JSON synthetic"""
log.debug("entering get_synth_diseases() IP=%s" %get_ip())
con = get_db_con()
sql = "SELECT DISTINCT disease_name FROM synth_ma.synth_county_disease_stats"
data = getData(con, sql)
log.debug("leaving get_synth_diseases()")
return data
#Request geojson of only the geometry of all counties
@app.route('/htc/api/v1/counties/geoms', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_counties_geom():
"""Counties in GeoJSON, geometry only"""
log.debug("entering get_counties_geom() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT countyfp AS ct_fips, ST_AsGeoJSON(the_geom) AS geometry " \
"FROM tiger_cb14_500k.county WHERE statefp='25'"
data = p2g.getData(con, sql)
log.debug("leaving get_counties_geom()")
return data
#Request only the statistics of all counties
@app.route('/htc/api/v1/counties/stats', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_counties_stats():
"""Counties in JSON, statistics only"""
log.debug("entering get_counties_stats() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, s.pop_male / s.pop as pct_male, s.pop_female / s.pop as pct_female, s.pop_sm, " \
"chr.hs_graduate / 100 as chr_hs_grad, chr.college / 100 as chr_college, chr.unemployed / 100 as chr_unemployed, chr.diabetes_rate / 100 as chr_diabetes, " \
"chr.adult_obesity / 100 as chr_adult_obesity, chr.adult_smoking / 100 as chr_adult_smoking, opioid.deaths as opioid_deaths, " \
"age.fact_pop_0_4 as pop_0_4,age.fact_pop_5_9 as pop_5_9,age.fact_pop_10_14 as pop_10_14,age.fact_pop_15_19 as pop_15_19, "\
"age.fact_pop_20_24 as pop_20_24,age.fact_pop_25_29 as pop_25_29,age.fact_pop_30_34 as pop_30_34,age.fact_pop_35_39 as pop_35_39, " \
"age.fact_pop_40_44 as pop_40_44,age.fact_pop_45_49 as pop_45_49,age.fact_pop_50_54 as pop_50_54,age.fact_pop_55_59 as pop_55_59, " \
"age.fact_pop_60_64 as pop_60_64,age.fact_pop_65_69 as pop_65_69,age.fact_pop_70_74 as pop_70_74,age.fact_pop_75_79 as pop_75_79, " \
"age.fact_pop_80_84 as pop_80_84,age.fact_pop_85_110 as pop_85_110 " \
"FROM synth_ma.county_stats s " \
"JOIN synth_ma.ma_opioid_county opioid ON opioid.countyfp = s.ct_fips AND opioid.year = '2015' " \
"JOIN synth_ma.ma_county_age age ON age.ct_fips = s.ct_fips " \
"JOIN county_health.chr ON chr.statefp = '25' AND chr.release_year = 2016 AND chr.countyfp = s.ct_fips"
data = getData(con, sql)
log.debug("leaving get_counties_stats()")
return data
#Request only the statistics of all counties (synthetic)
@app.route('/htc/api/v1/synth/counties/stats', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_synth_counties_stats():
"""Counties in JSON, statistics only synthetic"""
log.debug("entering get_synth_counties_stats() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, CASE WHEN s.pop > 0 THEN s.pop_male / s.pop ELSE 0 END AS pct_male, CASE WHEN s.pop > 0 THEN s.pop_female / s.pop ELSE 0 END AS pct_female, s.pop_sm, " \
"dd.rate as pct_diabetes, dhd.rate as pct_heart_disease, doa.rate as pct_opioid_addiction " \
"FROM synth_ma.synth_county_pop_stats s " \
"JOIN synth_ma.synth_county_disease_stats dd ON dd.ct_fips = s.ct_fips AND dd.disease_name = 'diabetes' " \
"JOIN synth_ma.synth_county_disease_stats dhd ON dhd.ct_fips = s.ct_fips AND dhd.disease_name = 'heart_disease' " \
"JOIN synth_ma.synth_county_disease_stats doa ON doa.ct_fips = s.ct_fips AND doa.disease_name = 'opioid_addiction' "
data = getData(con, sql)
log.debug("leaving get_synth_counties_stats()")
return data
#Single County
#
#Request geojson of single county by name
@app.route('/htc/api/v1/counties/name/<string:ct_name>', methods=['GET'])
@auto.doc()
@cache.memoize(timeout=300) # cache this view for 5 minutes
def get_county_by_name(ct_name):
"""County in GeoJSON, by name"""
log.debug("entering get_county_by_name() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, s.pop_male / s.pop as pct_male, s.pop_female / s.pop as pct_female, s.pop_sm, " \
"chr.hs_graduate as chr_hs_grad, chr.college as chr_college, chr.unemployed as chr_unemployed, chr.diabetes_rate as chr_diabetes, " \
"chr.adult_obesity as chr_adult_obesity, chr.adult_smoking as chr_adult_smoking, opioid.deaths as opioid_deaths, " \
"age.fact_pop_0_4 as pop_0_4,age.fact_pop_5_9 as pop_5_9,age.fact_pop_10_14 as pop_10_14,age.fact_pop_15_19 as pop_15_19, "\
"age.fact_pop_20_24 as pop_20_24,age.fact_pop_25_29 as pop_25_29,age.fact_pop_30_34 as pop_30_34,age.fact_pop_35_39 as pop_35_39, " \
"age.fact_pop_40_44 as pop_40_44,age.fact_pop_45_49 as pop_45_49,age.fact_pop_50_54 as pop_50_54,age.fact_pop_55_59 as pop_55_59, " \
"age.fact_pop_60_64 as pop_60_64,age.fact_pop_65_69 as pop_65_69,age.fact_pop_70_74 as pop_70_74,age.fact_pop_75_79 as pop_75_79, " \
"age.fact_pop_80_84 as pop_80_84,age.fact_pop_85_110 as pop_85_110, " \
"ST_AsGeoJSON(s.ct_poly) AS geometry " \
"FROM synth_ma.county_stats s " \
"JOIN synth_ma.ma_opioid_county opioid ON opioid.countyfp = s.ct_fips AND opioid.year = '2015' " \
"JOIN tiger_cb14_500k.county g ON g.statefp = '25' AND g.countyfp = s.ct_fips " \
"JOIN county_health.chr ON chr.statefp = '25' AND chr.release_year = 2016 AND chr.countyfp = s.ct_fips " \
"JOIN synth_ma.ma_county_age age ON age.ct_fips = s.ct_fips " \
"WHERE ct_name=%s"
sql_params = (ct_name.title(),)
data = p2g.getData(con, sql, sql_params)
log.debug("leaving get_county_by_name()")
return data
#Request geojson of single county by name (synthetic)
@app.route('/htc/api/v1/synth/counties/name/<string:ct_name>', methods=['GET'])
@auto.doc()
@cache.memoize(timeout=300) # cache this view for 5 minutes
def get_synth_county_by_name(ct_name):
"""County in GeoJSON, by name synthetic"""
log.debug("entering get_synth_county_by_name() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, CASE WHEN s.pop > 0 THEN s.pop_male / s.pop ELSE 0 END AS pct_male, CASE WHEN s.pop > 0 THEN s.pop_female / s.pop ELSE 0 END AS pct_female, s.pop_sm, " \
"ST_AsGeoJSON(s.ct_poly) AS geometry, " \
"dd.rate AS pct_diabetes, dhd.rate as pct_heart_disease, doa.rate as pct_opioid_addiction " \
"FROM synth_ma.synth_county_pop_stats s " \
"JOIN synth_ma.synth_county_disease_stats dd ON dd.ct_fips = s.ct_fips AND dd.disease_name = 'diabetes' " \
"JOIN synth_ma.synth_county_disease_stats dhd ON dhd.ct_fips = s.ct_fips AND dhd.disease_name = 'heart_disease' " \
"JOIN synth_ma.synth_county_disease_stats doa ON doa.ct_fips = s.ct_fips AND doa.disease_name = 'opioid_addiction' " \
"WHERE s.ct_name = %s"
sql_params = (ct_name.title(),)
data = p2g.getData(con, sql, sql_params)
log.debug("leaving get_synth_county_by_name()")
return data
#Request geojson of only the geometry of a single county by name
@app.route('/htc/api/v1/counties/name/<string:ct_name>/geom', methods=['GET'])
@auto.doc()
@cache.memoize(timeout=300) # cache this view for 5 minutes
def get_county_by_name_geom(ct_name):
"""County in GeoJSON, by name, geometry only"""
log.debug("entering get_county_by_name_geom() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT countyfp AS ct_fips, ST_AsGeoJSON(the_geom) AS geometry " \
"FROM tiger_cb14_500k.county " \
"WHERE statefp='25' AND name=%s"
sql_params = (ct_name.title(),)
data = p2g.getData(con, sql, sql_params)
log.debug("leaving get_county_by_name_geom()")
return data
#Request only the statistics of a single county by name
@app.route('/htc/api/v1/counties/name/<string:ct_name>/stats', methods=['GET'])
@auto.doc()
@cache.memoize(timeout=300) # cache this view for 5 minutes
def get_county_by_name_stats(ct_name):
"""County in JSON, by name, statistics only"""
log.debug("entering get_county_by_name_stats() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, s.pop_male / s.pop as pct_male, s.pop_female / s.pop as pct_female, s.pop_sm, " \
"chr.hs_graduate as chr_hs_grad, chr.college as chr_college, chr.unemployed as chr_unemployed, chr.diabetes as chr_diabetes, " \
"chr.adult_obesity as chr_adult_obesity, chr.adult_smoking as chr_adult_smoking, opioid.deaths as opioid_deaths, " \
"age.fact_pop_0_4 as pop_0_4,age.fact_pop_5_9 as pop_5_9,age.fact_pop_10_14 as pop_10_14,age.fact_pop_15_19 as pop_15_19, "\
"age.fact_pop_20_24 as pop_20_24,age.fact_pop_25_29 as pop_25_29,age.fact_pop_30_34 as pop_30_34,age.fact_pop_35_39 as pop_35_39, " \
"age.fact_pop_40_44 as pop_40_44,age.fact_pop_45_49 as pop_45_49,age.fact_pop_50_54 as pop_50_54,age.fact_pop_55_59 as pop_55_59, " \
"age.fact_pop_60_64 as pop_60_64,age.fact_pop_65_69 as pop_65_69,age.fact_pop_70_74 as pop_70_74,age.fact_pop_75_79 as pop_75_79, " \
"age.fact_pop_80_84 as pop_80_84,age.fact_pop_85_110 as pop_85_110, " \
"FROM synth_ma.county_stats s " \
"JOIN synth_ma.ma_opioid_county opioid ON opioid.countyfp = s.ct_fips AND opioid.year = '2015' " \
"JOIN county_health.chr ON chr.statefp = '25' AND chr.release_year = 2016 AND chr.countyfp = s.ct_fips " \
"JOIN synth_ma.ma_county_age age ON age.ct_fips = s.ct_fips " \
"WHERE s.ct_name=%s"
sql_params = (ct_name.title(),)
data = getData(con, sql, sql_params)
log.debug("leaving get_county_by_name_stats()")
return data
#Request only the statistics of a single county by name (synthetic)
@app.route('/htc/api/v1/synth/counties/name/<string:ct_name>/stats', methods=['GET'])
@auto.doc()
@cache.memoize(timeout=300) # cache this view for 5 minutes
def get_synth_county_by_name_stats(ct_name):
"""County in JSON, by name, statistics only (synthetic)"""
log.debug("entering get_synth_county_by_name_stats() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, CASE WHEN s.pop > 0 THEN s.pop_male / s.pop ELSE 0 END AS pct_male, CASE WHEN s.pop > 0 THEN s.pop_female / s.pop ELSE 0 END AS pct_female, s.pop_sm, " \
"dd.rate as pct_diabetes, dhd.rate as pct_heart_disease, doa.rate as pct_opioid_addiction " \
"FROM synth_ma.synth_county_pop_stats s " \
"JOIN synth_ma.synth_county_disease_stats dd ON dd.ct_fips = s.ct_fips AND dd.disease_name = 'diabetes' " \
"JOIN synth_ma.synth_county_disease_stats dhd ON dhd.ct_fips = s.ct_fips AND dhd.disease_name = 'heart_disease' " \
"JOIN synth_ma.synth_county_disease_stats doa ON doa.ct_fips = s.ct_fips AND doa.disease_name = 'opioid_addiction' " \
"WHERE s.ct_name = %s "
sql_params = (ct_name.title(),)
data = getData(con, sql, sql_params)
log.debug("leaving get_synth_county_by_name_stats()")
return data
#Request single county by county id
@app.route('/htc/api/v1/counties/id/<string:ct_fips>', methods=['GET'])
@auto.doc()
@cache.memoize(timeout=300) # cache this view for 5 minutes
def get_county_by_id(ct_fips):
"""County in GeoJSON, by id"""
log.debug("entering get_county_by_id() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, s.pop_male / s.pop as pct_male, s.pop_female / s.pop as pct_female, s.pop_sm, " \
"chr.hs_graduate as chr_hs_grad, chr.college as chr_college, chr.unemployed as chr_unemployed, chr.diabetes_rate as chr_diabetes, " \
"chr.adult_obesity as chr_adult_obesity, chr.adult_smoking as chr_adult_smoking, opioid.deaths as opioid_deaths, " \
"age.fact_pop_0_4 as pop_0_4,age.fact_pop_5_9 as pop_5_9,age.fact_pop_10_14 as pop_10_14,age.fact_pop_15_19 as pop_15_19, "\
"age.fact_pop_20_24 as pop_20_24,age.fact_pop_25_29 as pop_25_29,age.fact_pop_30_34 as pop_30_34,age.fact_pop_35_39 as pop_35_39, " \
"age.fact_pop_40_44 as pop_40_44,age.fact_pop_45_49 as pop_45_49,age.fact_pop_50_54 as pop_50_54,age.fact_pop_55_59 as pop_55_59, " \
"age.fact_pop_60_64 as pop_60_64,age.fact_pop_65_69 as pop_65_69,age.fact_pop_70_74 as pop_70_74,age.fact_pop_75_79 as pop_75_79, " \
"age.fact_pop_80_84 as pop_80_84,age.fact_pop_85_110 as pop_85_110, " \
"ST_AsGeoJSON(the_geom) AS geometry " \
"FROM synth_ma.county_stats s " \
"JOIN synth_ma.ma_opioid_county opioid ON opioid.countyfp = s.ct_fips AND opioid.year = '2015' " \
"JOIN tiger_cb14_500k.county g ON g.statefp = '25' AND g.countyfp = s.ct_fips " \
"JOIN county_health.chr ON chr.statefp = '25' AND chr.release_year = 2016 AND chr.countyfp = s.ct_fips " \
"JOIN synth_ma.ma_county_age age ON age.ct_fips = s.ct_fips " \
"WHERE ct_fips=%s"
sql_params = (ct_fips,)
data = p2g.getData(con, sql, sql_params)
log.debug("leaving get_county_by_id()")
return data
#Request single county by county id (synthetic)
@app.route('/htc/api/v1/synth/counties/id/<string:ct_fips>', methods=['GET'])
@auto.doc()
@cache.memoize(timeout=300) # cache this view for 5 minutes
def get_synth_county_by_id(ct_fips):
"""County in GeoJSON, by id (synthetic)"""
log.debug("entering get_synth_county_by_id() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, CASE WHEN s.pop > 0 THEN s.pop_male / s.pop ELSE 0 END AS pct_male, CASE WHEN s.pop > 0 THEN s.pop_female / s.pop ELSE 0 END AS pct_female, s.pop_sm, " \
"dd.rate as pct_diabetes, dhd.rate as pct_heart_disease, doa.rate as pct_opioid_addiction, " \
"ST_AsGeoJSON(s.ct_poly) AS geometry " \
"FROM synth_ma.synth_county_pop_stats s " \
"JOIN synth_ma.synth_county_disease_stats dd ON dd.ct_fips = s.ct_fips AND dd.disease_name = 'diabetes' " \
"JOIN synth_ma.synth_county_disease_stats dhd ON dhd.ct_fips = s.ct_fips AND dhd.disease_name = 'heart_disease' " \
"JOIN synth_ma.synth_county_disease_stats doa ON doa.ct_fips = s.ct_fips AND doa.disease_name = 'opioid_addiction' " \
"WHERE s.ct_fips = %s"
sql_params = (ct_fips,)
data = p2g.getData(con, sql, sql_params)
log.debug("leaving get_synth_county_by_id()")
return data
#Request geojson of only geometry of a single county by county id
@app.route('/htc/api/v1/counties/id/<string:ct_fips>/geom', methods=['GET'])
@auto.doc()
@cache.memoize(timeout=300) # cache this view for 5 minutes
def get_county_by_id_geom(ct_fips):
"""County in GeoJSON, by id, geometry only"""
log.debug("entering get_county_by_id_geom() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT countyfp AS ct_fips, ST_AsGeoJSON(the_geom) AS geometry " \
"FROM tiger_cb14_500k.county " \
"WHERE statefp='25' AND countyfp=%s"
sql_params = (ct_fips,)
data = p2g.getData(con, sql, sql_params)
log.debug("leaving get_county_by_id_geom()")
return data
#Request only the statistics of a single county by county id
@app.route('/htc/api/v1/counties/id/<string:ct_fips>/stats', methods=['GET'])
@auto.doc()
@cache.memoize(timeout=300) # cache this view for 5 minutes
def get_county_by_id_stats(ct_fips):
"""County in JSON, by id, statistics only"""
log.debug("entering get_county_by_id_stats() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, s.pop_male / s.pop as pct_male, s.pop_female / s.pop as pct_female, s.pop_sm, " \
"chr.hs_graduate as chr_hs_grad, chr.college as chr_college, chr.unemployed as chr_unemployed, chr.diabetes_rate as chr_diabetes, " \
"chr.adult_obesity as chr_adult_obesity, chr.adult_smoking as chr_adult_smoking, opioid.deaths as opioid_deaths, " \
"age.fact_pop_0_4 as pop_0_4,age.fact_pop_5_9 as pop_5_9,age.fact_pop_10_14 as pop_10_14,age.fact_pop_15_19 as pop_15_19, "\
"age.fact_pop_20_24 as pop_20_24,age.fact_pop_25_29 as pop_25_29,age.fact_pop_30_34 as pop_30_34,age.fact_pop_35_39 as pop_35_39, " \
"age.fact_pop_40_44 as pop_40_44,age.fact_pop_45_49 as pop_45_49,age.fact_pop_50_54 as pop_50_54,age.fact_pop_55_59 as pop_55_59, " \
"age.fact_pop_60_64 as pop_60_64,age.fact_pop_65_69 as pop_65_69,age.fact_pop_70_74 as pop_70_74,age.fact_pop_75_79 as pop_75_79, " \
"age.fact_pop_80_84 as pop_80_84,age.fact_pop_85_110 as pop_85_110, " \
"FROM synth_ma.county_stats s " \
"JOIN synth_ma.ma_opioid_county opioid ON opioid.countyfp = s.ct_fips AND opioid.year = '2015' " \
"JOIN county_health.chr ON chr.statefp = '25' AND chr.release_year = 2016 AND chr.countyfp = s.ct_fips " \
"JOIN synth_ma.ma_county_age age ON age.ct_fips = s.ct_fips " \
"WHERE ct_fips=%s"
sql_params = (ct_fips,)
data = getData(con, sql, sql_params)
log.debug("leaving get_county_by_id_stats()")
return data
#Request only the statistics of a single county by county id (synthetic)
@app.route('/htc/api/v1/synth/counties/id/<string:ct_fips>/stats', methods=['GET'])
@auto.doc()
@cache.memoize(timeout=300) # cache this view for 5 minutes
def get_synth_county_by_id_stats(ct_fips):
"""County in JSON, by id, statistics only (synthetic)"""
log.debug("entering get_synth_county_by_id_stats() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.sq_mi, s.pop, CASE WHEN s.pop > 0 THEN s.pop_male / s.pop ELSE 0 END AS pct_male, CASE WHEN s.pop > 0 THEN s.pop_female / s.pop ELSE 0 END AS pct_female, s.pop_sm, " \
"dd.rate as pct_diabetes, dhd.rate as pct_heart_disease, doa.rate as pct_opioid_addiction " \
"FROM synth_ma.synth_county_pop_stats s " \
"JOIN synth_ma.synth_county_disease_stats dd ON dd.ct_fips = s.ct_fips AND dd.disease_name = 'diabetes' " \
"JOIN synth_ma.synth_county_disease_stats dhd ON dhd.ct_fips = s.ct_fips AND dhd.disease_name = 'heart_disease' " \
"JOIN synth_ma.synth_county_disease_stats doa ON doa.ct_fips = s.ct_fips AND doa.disease_name = 'opioid_addiction' " \
"WHERE s.ct_fips = %s"
sql_params = (ct_fips,)
data = getData(con, sql, sql_params)
log.debug("leaving get_synth_county_by_id_stats()")
return data
#
# All cousub requests
#
#Request geojson of all cousubs
@app.route('/htc/api/v1/cousubs', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_cousub_all():
"""Cousubs in GeoJSON"""
log.debug("entering get_cousub_all() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.cs_fips, s.cs_name, s.sq_mi, s.pop, s.pop_sm, " \
"CASE WHEN s.pop > 0 THEN s.pop_male / s.pop ELSE 0 END AS pct_male, " \
"CASE WHEN s.pop > 0 THEN s.pop_female / s.pop ELSE 0 END AS pct_female, " \
"op.deaths AS opioid_deaths, pred.pred_diabetes as pct_diabetes, " \
"age.fact_pop_0_4 as pop_0_4,age.fact_pop_5_9 as pop_5_9,age.fact_pop_10_14 as pop_10_14,age.fact_pop_15_19 as pop_15_19, "\
"age.fact_pop_20_24 as pop_20_24,age.fact_pop_25_29 as pop_25_29,age.fact_pop_30_34 as pop_30_34,age.fact_pop_35_39 as pop_35_39, " \
"age.fact_pop_40_44 as pop_40_44,age.fact_pop_45_49 as pop_45_49,age.fact_pop_50_54 as pop_50_54,age.fact_pop_55_59 as pop_55_59, " \
"age.fact_pop_60_64 as pop_60_64,age.fact_pop_65_69 as pop_65_69,age.fact_pop_70_74 as pop_70_74,age.fact_pop_75_79 as pop_75_79, " \
"age.fact_pop_80_84 as pop_80_84,age.fact_pop_85_110 as pop_85_110, " \
"ST_AsGeoJSON(g.the_geom) AS geometry " \
"FROM synth_ma.cousub_stats s " \
"JOIN synth_ma.ma_cousub_pred pred ON pred.state_fips = '25' AND pred.county = s.ct_fips AND pred.county_sub_fips = s.cs_fips AND s.cs_fips != '00000' " \
"JOIN tiger_cb14_500k.cousub g ON g.statefp = '25' AND g.countyfp = s.ct_fips AND g.cousubfp = s.cs_fips AND s.cs_fips != '00000' " \
"JOIN synth_ma.ma_cousub_age age ON age.cs_fips = s.cs_fips " \
"JOIN synth_ma.ma_opioid2 op ON op.cousubfp = s.cs_fips AND s.cs_fips != '00000' AND year = '2015'"
data = p2g.getData(con, sql)
log.debug("leaving get_cousub_all()")
return data
#Request geojson of all cousubs (synthetic)
@app.route('/htc/api/v1/synth/cousubs', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_synth_cousub_all():
"""Cousubs in GeoJSON (synthetic)"""
log.debug("entering get_synth_cousub_all() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.cs_fips, s.cs_name, s.sq_mi, s.pop, s.pop_sm, " \
"CASE WHEN s.pop > 0 THEN s.pop_male / s.pop ELSE 0 END AS pct_male, " \
"CASE WHEN s.pop > 0 THEN s.pop_female / s.pop ELSE 0 END AS pct_female, " \
"dd.rate as pct_diabetes, dhd.rate as pct_heart_disease, doa.rate as pct_opioid_addiction, " \
"ST_AsGeoJSON(s.cs_poly) AS geometry " \
"FROM synth_ma.synth_cousub_pop_stats s " \
"JOIN synth_ma.synth_cousub_disease_stats dd ON dd.cs_fips = s.cs_fips AND dd.disease_name = 'diabetes' " \
"JOIN synth_ma.synth_cousub_disease_stats dhd ON dhd.cs_fips = s.cs_fips AND dhd.disease_name = 'heart_disease' " \
"JOIN synth_ma.synth_cousub_disease_stats doa ON doa.cs_fips = s.cs_fips AND doa.disease_name = 'opioid_addiction' "
data = p2g.getData(con, sql)
log.debug("leaving get_synth_cousub_all()")
return data
#Request geojson of only the geometry of all cousubs
@app.route('/htc/api/v1/cousubs/geoms', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_cousub_geom():
"""Cousubs in GeoJSON, geometry only"""
log.debug("entering get_cousub_geom() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT countyfp AS ct_fips, cousubfp AS cs_fips, ST_AsGeoJSON(the_geom) AS geometry " \
"FROM tiger_cb14_500k.cousub " \
"WHERE statefp='25' AND cousubfp != '00000'"
data = p2g.getData(con, sql)
log.debug("leaving get_cousub_geom()")
return data
#Request only the statistics of all cousubs
@app.route('/htc/api/v1/cousubs/stats', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_cousub_stats():
"""Cousubs in JSON, statistics only"""
log.debug("entering get_cousub_stats() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.cs_fips, s.cs_name, s.sq_mi, s.pop, s.pop_sm, " \
"CASE WHEN s.pop > 0 THEN s.pop_male / s.pop ELSE 0 END AS pct_male, " \
"CASE WHEN s.pop > 0 THEN s.pop_female / s.pop ELSE 0 END AS pct_female, " \
"op.deaths AS opioid_deaths, pred.pred_diabetes as pct_diabetes, " \
"age.fact_pop_0_4 as pop_0_4,age.fact_pop_5_9 as pop_5_9,age.fact_pop_10_14 as pop_10_14,age.fact_pop_15_19 as pop_15_19, "\
"age.fact_pop_20_24 as pop_20_24,age.fact_pop_25_29 as pop_25_29,age.fact_pop_30_34 as pop_30_34,age.fact_pop_35_39 as pop_35_39, " \
"age.fact_pop_40_44 as pop_40_44,age.fact_pop_45_49 as pop_45_49,age.fact_pop_50_54 as pop_50_54,age.fact_pop_55_59 as pop_55_59, " \
"age.fact_pop_60_64 as pop_60_64,age.fact_pop_65_69 as pop_65_69,age.fact_pop_70_74 as pop_70_74,age.fact_pop_75_79 as pop_75_79, " \
"age.fact_pop_80_84 as pop_80_84,age.fact_pop_85_110 as pop_85_110 " \
"FROM synth_ma.cousub_stats s " \
"JOIN synth_ma.ma_cousub_pred pred ON pred.state_fips = '25' AND pred.county = s.ct_fips AND pred.county_sub_fips = s.cs_fips AND s.cs_fips != '00000' " \
"JOIN synth_ma.ma_opioid2 op ON op.cousubfp = s.cs_fips AND s.cs_fips != '00000' AND year = '2015' " \
"JOIN synth_ma.ma_cousub_age age ON age.cs_fips = s.cs_fips " \
"WHERE s.cs_fips != '00000'"
data = getData(con, sql)
log.debug("leaving get_cousub_stats()")
return data
#Request only the statistics of all cousubs (synthetic)
@app.route('/htc/api/v1/synth/cousubs/stats', methods=['GET'])
@auto.doc()
@cache.cached(timeout=300) # cache this view for 5 minutes
def get_synth_cousub_stats():
"""Cousubs in JSON, statistics only (synthetic)"""
log.debug("entering get_synth_cousub_stats() IP=%s" % get_ip())
con = get_db_con()
sql = "SELECT s.ct_fips, s.ct_name, s.cs_fips, s.cs_name, s.sq_mi, s.pop, s.pop_sm, " \
"CASE WHEN s.pop > 0 THEN s.pop_male / s.pop ELSE 0 END AS pct_male, " \
"CASE WHEN s.pop > 0 THEN s.pop_female / s.pop ELSE 0 END AS pct_female, " \
"dd.rate as pct_diabetes, dhd.rate as pct_heart_disease, doa.rate as pct_opioid_addiction " \
"FROM synth_ma.synth_cousub_pop_stats s " \
"JOIN synth_ma.synth_cousub_disease_stats dd ON dd.cs_fips = s.cs_fips AND dd.disease_name = 'diabetes' " \
"JOIN synth_ma.synth_cousub_disease_stats dhd ON dhd.cs_fips = s.cs_fips AND dhd.disease_name = 'heart_disease' " \
"JOIN synth_ma.synth_cousub_disease_stats doa ON doa.cs_fips = s.cs_fips AND doa.disease_name = 'opioid_addiction' "
data = p2g.getData(con, sql)
log.debug("leaving get_synth_cousub_stats()")
return data
#Block level, with filtering
#Example: /htc/api/v1/block_window?minx=-71.26&maxx=-71.22&miny=42.49&maxy=42.51
@app.route('/htc/api/v1/block_window', methods=['GET'])
@auto.doc()
def get_block_window():
"""Blocks in GeoJSON, by window
Example: /htc/api/v1/block_window?minx=-71.26&maxx=-71.22&miny=42.49&maxy=42.51
"""
log.debug("entering get_block_window() IP=%s" % get_ip())
minx = request.args.get('minx')
maxx = request.args.get('maxx')
miny = request.args.get('miny')
maxy = request.args.get('maxy')
if not (minx and maxx and miny and maxy):
abort(404)
con = get_db_con()
sql = "SELECT s.block_id, s.sq_mi, s.pop, s.pop_male / s.pop as pct_male, s.pop_female / s.pop as pct_female, s.pop_sm, " \
"ST_AsGeoJSON(s.blk_poly) AS geometry " \
"FROM synth_ma.blk_stats s " \
"WHERE s.blk_poly && ST_SetSRID(ST_MakeBox2D(ST_Point(%s,%s), ST_Point(%s,%s)), 4269) AND s.pop > 0"
sql_params = (minx, miny, maxx, maxy)
data = p2g.getData(con, sql, sql_params)
log.debug("leaving get_block_window()")
return data
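# Illustrative client-side helper: a minimal sketch of querying the bounding-box
# endpoint above. The base URL/port (8080, matching app.run below) and the
# availability of the 'requests' package are assumptions for illustration only.
def example_fetch_block_window(minx, maxx, miny, maxy, base_url="http://localhost:8080"):
    """Sketch: request blocks for a bounding box from /htc/api/v1/block_window."""
    import requests  # assumed to be installed; not a dependency declared by this module
    params = {'minx': minx, 'maxx': maxx, 'miny': miny, 'maxy': maxy}
    resp = requests.get(base_url + '/htc/api/v1/block_window', params=params)
    resp.raise_for_status()
    return resp.json()  # assumes the endpoint returns a JSON (GeoJSON) body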
##TODO
#return ccda
@app.route('/htc/api/v1/synth/ccda/id/<string:patient_uuid>', methods=['GET'])
@auto.doc()
# this view should not be cached
def get_synth_ccda_by_id(patient_uuid):
"""Synthetic Patient in C-CDA, by id"""
log.debug("entering get_synth_ccda_by_id() IP=%s" % get_ip())
return send_from_directory('/ccda', patient_uuid + '.xml')
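# The view above streams a pre-generated document: a GET to
# /htc/api/v1/synth/ccda/id/<uuid> returns the file /ccda/<uuid>.xml verbatim.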
#Specific requests for less typing
#
#Request geojson of ?
if __name__ == '__main__':
log = app.logger
#Logging levels: CRITICAL: 50, ERROR: 40, WARNING: 30, INFO: 20, DEBUG: 10, NOTSET: 0
log.setLevel(10)
logging.basicConfig(format="%(asctime)-15s %(threadName)s %(message)s")
setup_pool()
log.debug("starting server")
app.run(debug=True, host="0.0.0.0", port=8080, threaded=True)
| 53.82263 | 209 | 0.707642 | 6,314 | 35,200 | 3.65917 | 0.06636 | 0.043629 | 0.062327 | 0.015582 | 0.849723 | 0.839335 | 0.822065 | 0.804146 | 0.781423 | 0.750216 | 0 | 0.053455 | 0.17358 | 35,200 | 653 | 210 | 53.905054 | 0.74077 | 0.094773 | 0 | 0.605882 | 0 | 0.213725 | 0.645601 | 0.215526 | 0 | 0 | 0 | 0.001531 | 0 | 0 | null | null | 0.001961 | 0.02549 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
a356f0a377b3e3763ead0160ea30e255eac32503 | 196 | py | Python | py_grpc_profile/__init__.py | fossabot/py-grpc-profile | c68540891efb06b3bb8d6beb54238bd9e03c40c8 | [
"Apache-2.0"
] | null | null | null | py_grpc_profile/__init__.py | fossabot/py-grpc-profile | c68540891efb06b3bb8d6beb54238bd9e03c40c8 | [
"Apache-2.0"
] | 2 | 2021-03-23T23:49:14.000Z | 2022-02-11T03:38:44.000Z | py_grpc_profile/__init__.py | fossabot/py-grpc-profile | c68540891efb06b3bb8d6beb54238bd9e03c40c8 | [
"Apache-2.0"
] | 1 | 2021-03-23T23:47:17.000Z | 2021-03-23T23:47:17.000Z | try:
import importlib.metadata as importlib_metadata
except ImportError:
import importlib_metadata # type: ignore
__version__: str = importlib_metadata.version(__name__) # type: ignore
| 28 | 71 | 0.785714 | 22 | 196 | 6.5 | 0.545455 | 0.475524 | 0.321678 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153061 | 196 | 6 | 72 | 32.666667 | 0.861446 | 0.127551 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.8 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
6e7130246867f165170005d2a0d238ddd1072843 | 5,099 | py | Python | init_models.py | Zdddzz/smartcontract | e587c72cb76883437cd653a47cf14f038ab72f74 | [
"MIT"
] | null | null | null | init_models.py | Zdddzz/smartcontract | e587c72cb76883437cd653a47cf14f038ab72f74 | [
"MIT"
] | null | null | null | init_models.py | Zdddzz/smartcontract | e587c72cb76883437cd653a47cf14f038ab72f74 | [
"MIT"
] | 1 | 2022-02-28T07:35:16.000Z | 2022-02-28T07:35:16.000Z | # -*- coding: utf-8 -*-
# @Time : 2020-12-18 22:04
# @Author : Di Zhu
import os
import json
import utils
from networks import Transformer, Measurer
def init_transformer():
file_path = os.path.realpath(__file__)
print(file_path)
base_dir = os.path.dirname(file_path)
model_dir = os.path.join(base_dir, 'pretrained_models', 'transformer')
if os.path.exists(os.path.join(model_dir, 'best.pth.tar')):
with open(os.path.join(model_dir, 'best_config.json')) as f:
model_config = json.load(f)
model = Transformer(n_source_vocab=model_config['n_source_vocab'],
n_target_vocab=model_config['n_target_vocab'],
max_len=model_config['max_len'],
d_word_vec=model_config['d_word_vec'],
d_inner=model_config['d_inner'],
n_layers=model_config['n_layers'],
n_head=model_config['n_head'],
dropout=model_config['dropout'])
print("loading: ", os.path.join(model_dir, 'best.pth.tar'))
utils.load_checkpoint(os.path.join(model_dir, 'best.pth.tar'), model)
elif os.path.exists(os.path.join(model_dir, 'last.pth.tar')):
with open(os.path.join(model_dir, 'last_config.json')) as f:
model_config = json.load(f)
model = Transformer(n_source_vocab=model_config['n_source_vocab'],
n_target_vocab=model_config['n_target_vocab'],
max_len=model_config['max_len'],
d_word_vec=model_config['d_word_vec'],
d_inner=model_config['d_inner'],
n_layers=model_config['n_layers'],
n_head=model_config['n_head'],
dropout=model_config['dropout'],
dec_emb_pre_weight_sharing=model_config['dec_emb_pre_weight_sharing'])
print("loading: ", os.path.join(model_dir, 'last.pth.tar'))
utils.load_checkpoint(os.path.join(model_dir, 'last.pth.tar'), model)
else:
raise Exception
return model, model_config
def init_measurer():
file_path = os.path.realpath(__file__)
print(file_path+"111")
base_dir = os.path.dirname(file_path)
model_dir = os.path.join(base_dir, 'pretrained_models', 'measurer')
if os.path.exists(os.path.join(model_dir, 'best.pth.tar')):
with open(os.path.join(model_dir, 'best_config.json')) as f:
model_config = json.load(f)
model = Measurer(n_source_vocab=model_config['n_source_vocab'],
n_target_vocab=model_config['n_target_vocab'],
max_len=model_config['max_len'],
d_word_vec=model_config['d_word_vec'],
d_inner=model_config['d_inner'],
n_layers=model_config['n_layers'],
n_head=model_config['n_head'],
dropout=model_config['dropout'])
print("loading: ", os.path.join(model_dir, 'best.pth.tar'))
utils.load_checkpoint(os.path.join(model_dir, 'best.pth.tar'), model)
elif os.path.exists(os.path.join(model_dir, 'last.pth.tar')):
with open(os.path.join(model_dir, 'last_config.json')) as f:
model_config = json.load(f)
model = Measurer(n_source_vocab=model_config['n_source_vocab'],
n_target_vocab=model_config['n_target_vocab'],
max_len=model_config['max_len'],
d_word_vec=model_config['d_word_vec'],
d_inner=model_config['d_inner'],
n_layers=model_config['n_layers'],
n_head=model_config['n_head'],
dropout=model_config['dropout'])
print("loading: ", os.path.join(model_dir, 'last.pth.tar'))
utils.load_checkpoint(os.path.join(model_dir, 'last.pth.tar'), model)
else:
raise Exception
return model, model_config
def init_opt_encoder(require_grad=True):
measurer, measurer_config = init_measurer()
encoder_config = measurer_config
encoder_config.pop("n_source_vocab")
encoder_config["n_vocab"] = encoder_config.pop("n_target_vocab")
if require_grad:
measurer.target_encoder.grads()
else:
measurer.target_encoder.no_grads()
return measurer.target_encoder, encoder_config
def init_asm_encoder(require_grad=True):
measurer, measurer_config = init_measurer()
encoder_config = measurer_config
encoder_config.pop("n_target_vocab")
encoder_config["n_vocab"] = encoder_config.pop("n_source_vocab")
if require_grad:
measurer.source_encoder.grads()
else:
measurer.source_encoder.no_grads()
return measurer.source_encoder, encoder_config
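# Importing this module loads the pretrained Transformer once; eval() below switches
# it to evaluation mode (e.g. dropout disabled) so it is ready for inference.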
model, _ = init_transformer()
model.eval() | 45.526786 | 102 | 0.592469 | 635 | 5,099 | 4.412598 | 0.127559 | 0.153105 | 0.06424 | 0.085653 | 0.882941 | 0.832263 | 0.820128 | 0.820128 | 0.820128 | 0.763026 | 0 | 0.00443 | 0.291626 | 5,099 | 112 | 103 | 45.526786 | 0.771318 | 0.012748 | 0 | 0.726316 | 0 | 0 | 0.136752 | 0.005168 | 0 | 0 | 0 | 0 | 0 | 1 | 0.042105 | false | 0 | 0.042105 | 0 | 0.126316 | 0.063158 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
42eb061ad5cf49161ad2e1c888993bcd8ae95357 | 217 | py | Python | exception_handling/views.py | bakefayat/django-drug | 5c1eb79dd39fca57f22cc0c5f1cdf4e76f36f43a | [
"MIT"
] | 1 | 2022-01-08T13:31:45.000Z | 2022-01-08T13:31:45.000Z | exception_handling/views.py | bakefayat/django-drug | 5c1eb79dd39fca57f22cc0c5f1cdf4e76f36f43a | [
"MIT"
] | null | null | null | exception_handling/views.py | bakefayat/django-drug | 5c1eb79dd39fca57f22cc0c5f1cdf4e76f36f43a | [
"MIT"
] | null | null | null | from django.shortcuts import render
def handler404(request, exception):
return render(request, "exception_handling/404.html")
def handler500(request):
return render(request, "exception_handling/500.html")
| 21.7 | 57 | 0.774194 | 26 | 217 | 6.384615 | 0.576923 | 0.289157 | 0.228916 | 0.337349 | 0.433735 | 0 | 0 | 0 | 0 | 0 | 0 | 0.063158 | 0.124424 | 217 | 9 | 58 | 24.111111 | 0.810526 | 0 | 0 | 0 | 0 | 0 | 0.248848 | 0.248848 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0.4 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
2814750a49f9120054cd00b7143eaee9b8fa5740 | 45 | py | Python | tensor2struct/models/scan/__init__.py | chenyangh/tensor2struct-public | d3257cba6d76d3c658a58a78f687d986bdc755cf | [
"MIT"
] | 69 | 2021-04-14T06:35:07.000Z | 2022-03-31T18:35:05.000Z | tensor2struct/models/scan/__init__.py | chenyangh/tensor2struct-public | d3257cba6d76d3c658a58a78f687d986bdc755cf | [
"MIT"
] | 11 | 2021-04-16T11:16:04.000Z | 2022-03-22T21:21:29.000Z | tensor2struct/models/scan/__init__.py | chenyangh/tensor2struct-public | d3257cba6d76d3c658a58a78f687d986bdc755cf | [
"MIT"
] | 18 | 2021-04-14T07:19:56.000Z | 2022-03-23T19:26:18.000Z | from . import scan_enc
from . import scan_dec | 22.5 | 22 | 0.8 | 8 | 45 | 4.25 | 0.625 | 0.588235 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155556 | 45 | 2 | 23 | 22.5 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
282215968965f0623d1217f623c296820045bb64 | 37 | py | Python | molsysmt/tools/file_crd/__init__.py | dprada/molsysmt | 83f150bfe3cfa7603566a0ed4aed79d9b0c97f5d | [
"MIT"
] | null | null | null | molsysmt/tools/file_crd/__init__.py | dprada/molsysmt | 83f150bfe3cfa7603566a0ed4aed79d9b0c97f5d | [
"MIT"
] | null | null | null | molsysmt/tools/file_crd/__init__.py | dprada/molsysmt | 83f150bfe3cfa7603566a0ed4aed79d9b0c97f5d | [
"MIT"
] | null | null | null | from .is_file_crd import is_file_crd
| 18.5 | 36 | 0.864865 | 8 | 37 | 3.5 | 0.625 | 0.428571 | 0.642857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108108 | 37 | 1 | 37 | 37 | 0.848485 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
2823b5c4c8aafb83018f62826e8cfab03a922699 | 1,423 | py | Python | .c9/metadata/workspace/unikastaroak/view.py | bipoza/unikastaroak | 4044d3ff3eaa4172275a8f46d9765a3840a51d7b | [
"Apache-2.0"
] | null | null | null | .c9/metadata/workspace/unikastaroak/view.py | bipoza/unikastaroak | 4044d3ff3eaa4172275a8f46d9765a3840a51d7b | [
"Apache-2.0"
] | null | null | null | .c9/metadata/workspace/unikastaroak/view.py | bipoza/unikastaroak | 4044d3ff3eaa4172275a8f46d9765a3840a51d7b | [
"Apache-2.0"
] | null | null | null | {"filter":false,"title":"view.py","tooltip":"/unikastaroak/view.py","undoManager":{"mark":2,"position":2,"stack":[[{"start":{"row":0,"column":0},"end":{"row":12,"column":68},"action":"insert","lines":["def signup(request):"," if request.method == 'POST':"," form = UserCreationForm(request.POST)"," if form.is_valid():"," form.save()"," username = form.cleaned_data.get('username')"," raw_password = form.cleaned_data.get('password1')"," user = authenticate(username=username, password=raw_password)"," login(request, user)"," return redirect('/')"," else:"," form = UserCreationForm()"," return render(request, 'register/register.html', {'form': form})"],"id":1}],[{"start":{"row":0,"column":0},"end":{"row":1,"column":0},"action":"insert","lines":["",""],"id":2}],[{"start":{"row":0,"column":0},"end":{"row":2,"column":45},"action":"insert","lines":["from django.contrib.auth import login, authenticate","from django.contrib.auth.forms import UserCreationForm","from django.shortcuts import render, redirect"],"id":3}]]},"ace":{"folds":[],"scrolltop":0,"scrollleft":0,"selection":{"start":{"row":2,"column":45},"end":{"row":2,"column":45},"isBackwards":true},"options":{"guessTabSize":true,"useWrapMode":false,"wrapToView":true},"firstLineState":0},"timestamp":1510907493939,"hash":"05a2c1e8b8d943311cfbc5d0917b48182c3faa5b"} | 1,423 | 1,423 | 0.629656 | 165 | 1,423 | 5.4 | 0.460606 | 0.035915 | 0.030303 | 0.050505 | 0.10101 | 0.074074 | 0.074074 | 0 | 0 | 0 | 0 | 0.052262 | 0.099086 | 1,423 | 1 | 1,423 | 1,423 | 0.642746 | 0 | 0 | 0 | 0 | 0 | 0.73736 | 0.183287 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 8 |
955b2b60a97ffb22f7057d6ed349ea03d5008d0d | 2,674 | py | Python | tests/transactions/builder/test_htlc_claim.py | ciband/python-crypto | 27acf5659efe1aedc0ce5ae73b1ebf03126cb7f4 | [
"MIT"
] | null | null | null | tests/transactions/builder/test_htlc_claim.py | ciband/python-crypto | 27acf5659efe1aedc0ce5ae73b1ebf03126cb7f4 | [
"MIT"
] | 1 | 2018-08-29T17:18:13.000Z | 2018-08-29T17:18:13.000Z | tests/transactions/builder/test_htlc_claim.py | ciband/python-crypto | 27acf5659efe1aedc0ce5ae73b1ebf03126cb7f4 | [
"MIT"
] | null | null | null | from crypto.configuration.network import set_network
from crypto.constants import TRANSACTION_HTLC_CLAIM, TRANSACTION_TYPE_GROUP
from crypto.networks.devnet import Devnet
from crypto.transactions.builder.htlc_claim import HtlcClaim
set_network(Devnet)
def test_htlc_claim_transaction():
"""Test if timelock transaction gets built
"""
lock_transaction_id = '943c220691e711c39c79d437ce185748a0018940e1a4144293af9d05627d2eb4'
# This should be the hashed secret used for HTLC Lock transaction
unlock_secret = 'my secret that should be 32bytes'
transaction = HtlcClaim(lock_transaction_id, unlock_secret)
transaction.set_type_group(TRANSACTION_TYPE_GROUP.CORE)
transaction.set_nonce(1)
transaction.schnorr_sign('testing')
transaction_dict = transaction.to_dict()
assert transaction_dict['nonce'] == 1
assert transaction_dict['signature']
assert transaction_dict['type'] is TRANSACTION_HTLC_CLAIM
assert transaction_dict['typeGroup'] == 1
assert transaction_dict['typeGroup'] == TRANSACTION_TYPE_GROUP.CORE.value
assert transaction_dict['fee'] == 0
assert transaction_dict['asset']['claim']['lockTransactionId'] == '943c220691e711c39c79d437ce185748a0018940e1a4144293af9d05627d2eb4'
assert transaction_dict['asset']['claim']['unlockSecret'] == 'my secret that should be 32bytes'
transaction.schnorr_verify() # if no exception is raised, it means the transaction is valid
def test_htlc_claim_transaction_custom_fee():
"""Test if timelock transaction gets built with a custom fee
"""
lock_transaction_id = '943c220691e711c39c79d437ce185748a0018940e1a4144293af9d05627d2eb4'
# This should be the hashed secret used for HTLC Lock transaction
unlock_secret = 'my secret that should be 32bytes'
transaction = HtlcClaim(lock_transaction_id, unlock_secret, 5)
transaction.set_type_group(TRANSACTION_TYPE_GROUP.CORE)
transaction.set_nonce(1)
transaction.schnorr_sign('testing')
transaction_dict = transaction.to_dict()
assert transaction_dict['nonce'] == 1
assert transaction_dict['signature']
assert transaction_dict['type'] is TRANSACTION_HTLC_CLAIM
assert transaction_dict['typeGroup'] == 1
assert transaction_dict['typeGroup'] == TRANSACTION_TYPE_GROUP.CORE.value
assert transaction_dict['fee'] == 5
assert transaction_dict['asset']['claim']['lockTransactionId'] == '943c220691e711c39c79d437ce185748a0018940e1a4144293af9d05627d2eb4'
assert transaction_dict['asset']['claim']['unlockSecret'] == 'my secret that should be 32bytes'
transaction.schnorr_verify() # if no exception is raised, it means the transaction is valid
| 46.103448 | 136 | 0.777487 | 303 | 2,674 | 6.633663 | 0.221122 | 0.134328 | 0.167164 | 0.035821 | 0.879602 | 0.852736 | 0.818905 | 0.818905 | 0.818905 | 0.818905 | 0 | 0.090672 | 0.137996 | 2,674 | 57 | 137 | 46.912281 | 0.781345 | 0.133882 | 0 | 0.717949 | 0 | 0 | 0.249348 | 0.111208 | 0 | 0 | 0 | 0 | 0.410256 | 1 | 0.051282 | false | 0 | 0.102564 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
2547b9fc5282f99a351fdf3f249b077df9de78d7 | 120 | py | Python | restdoctor/utils/sentry.py | WinaZar/restdoctor | 2ea2db69228e5425805a2b17160f54cda077aa46 | [
"MIT"
] | 20 | 2020-09-28T17:54:26.000Z | 2022-02-16T21:35:09.000Z | restdoctor/utils/sentry.py | WinaZar/restdoctor | 2ea2db69228e5425805a2b17160f54cda077aa46 | [
"MIT"
] | 32 | 2020-10-04T17:26:31.000Z | 2022-03-29T01:19:14.000Z | restdoctor/utils/sentry.py | pashaandsik/restdoctor | 2465039729b31420518ac0f047dd289d8c84dfa3 | [
"MIT"
] | 19 | 2020-10-01T16:54:14.000Z | 2022-01-18T14:41:53.000Z | try:
from sentry_sdk import capture_exception
except ImportError:
def capture_exception() -> None:
pass
| 20 | 44 | 0.708333 | 14 | 120 | 5.857143 | 0.857143 | 0.390244 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.233333 | 120 | 5 | 45 | 24 | 0.891304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0.2 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
66653592df22b0cd4a105956142548e3395f050b | 33,913 | py | Python | tests/dhcpv6/kea_only/host_reservation/test_host_reservation_options.py | shawnmullaney/forge | aaaef0a0645f73d24666aab6a400f3604e753aac | [
"0BSD"
] | null | null | null | tests/dhcpv6/kea_only/host_reservation/test_host_reservation_options.py | shawnmullaney/forge | aaaef0a0645f73d24666aab6a400f3604e753aac | [
"0BSD"
] | null | null | null | tests/dhcpv6/kea_only/host_reservation/test_host_reservation_options.py | shawnmullaney/forge | aaaef0a0645f73d24666aab6a400f3604e753aac | [
"0BSD"
] | null | null | null | """Host Reservation including options DHCPv6 stored in MySQL database"""
# pylint: disable=invalid-name,line-too-long
import pytest
import srv_control
import misc
import srv_msg
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.reserved_options
def test_v6_host_reservation_mysql_duid_ll_matching_option():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('MySQL')
srv_control.new_db_backend_reservation('MySQL', 'duid', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'MySQL', '1')
srv_control.ipv6_address_db_backend_reservation('3000::100', '$(EMPTY)', 'MySQL', '1')
srv_control.option_db_record_reservation('7',
'10',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'MySQL',
'1')
srv_control.upload_db_reservation('MySQL')
srv_control.config_srv('preference', '0', '123')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::100')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '10')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:21')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '123')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.reserved_options
def test_v6_host_reservation_mysql_duid_ll_matching_option_no_address_1():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('MySQL')
srv_control.new_db_backend_reservation('MySQL', 'duid', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'MySQL', '1')
srv_control.option_db_record_reservation('7',
'10',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'MySQL',
'1')
srv_control.upload_db_reservation('MySQL')
srv_control.config_srv('preference', '0', '123')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', 'NOT ', 'addr', '3000::100')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '10')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:21')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '123')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.reserved_options
def test_v6_host_reservation_mysql_duid_ll_matching_option_no_address_2():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('MySQL')
srv_control.new_db_backend_reservation('MySQL', 'duid', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'MySQL', '1')
srv_control.option_db_record_reservation('7',
'10',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'MySQL',
'1')
srv_control.upload_db_reservation('MySQL')
srv_control.config_srv('preference', '0', '123')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', 'NOT ', 'addr', '3000::100')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '10')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', 'NOT ', '3')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '10')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:21')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '123')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.reserved_options
def test_v6_host_reservation_mysql_duid_ll_matching_option_inforequest():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('MySQL')
srv_control.new_db_backend_reservation('MySQL', 'duid', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'MySQL', '1')
srv_control.option_db_record_reservation('7',
'10',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'MySQL',
'1')
srv_control.upload_db_reservation('MySQL')
srv_control.config_srv('preference', '0', '123')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_send_msg('INFOREQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'REPLY')
srv_msg.response_check_include_option('Response', 'NOT ', '3')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '123')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', 'NOT ', '3')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '10')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:21')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '123')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.reserved_options
def test_v6_host_reservation_mysql_option_multiple():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('MySQL')
srv_control.new_db_backend_reservation('MySQL', 'duid', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'MySQL', '1')
srv_control.ipv6_address_db_backend_reservation('3000::100', '$(EMPTY)', 'MySQL', '1')
srv_control.option_db_record_reservation('7',
'10',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'MySQL',
'1')
srv_control.option_db_record_reservation('21',
'srv1.example.com,srv2.isc.org',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'MySQL',
'1')
srv_control.option_db_record_reservation('23',
'2001:db8::1,2001:db8::2',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'MySQL',
'1')
srv_control.option_db_record_reservation('59',
'http://www.kea-reserved.isc.org',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'MySQL',
'1')
# Add option reservation code 60 value 10 space dhcp6 persistent 1 client class $(EMPTY) subnet id 1 and scope subnet to MySQL record id 1.
srv_control.upload_db_reservation('MySQL')
srv_control.config_srv('preference', '0', '123')
srv_control.config_srv_opt('sip-server-dns', 'srv4.example.com,srv5.isc.org')
# 21
srv_control.config_srv_opt('dns-servers', '2001:db8::4,2001:db8::5')
# 23
srv_control.config_srv_opt('bootfile-url', 'http://www.kea.isc.org')
# 59
srv_control.config_srv_opt('bootfile-param', '000B48656C6C6F20776F726C640003666F6F')
# 60
srv_control.config_srv_opt('new-tzdb-timezone', 'Europe/Zurich')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_requests_option('21')
srv_msg.client_requests_option('23')
srv_msg.client_requests_option('42')
srv_msg.client_requests_option('59')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::100')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '10')
srv_msg.response_check_include_option('Response', None, '59')
srv_msg.response_check_option_content('Response',
'59',
None,
'optdata',
'http://www.kea-reserved.isc.org')
srv_msg.response_check_include_option('Response', None, '21')
srv_msg.response_check_option_content('Response',
'21',
None,
'addr',
'srv1.example.com.,srv2.isc.org.')
srv_msg.response_check_include_option('Response', None, '23')
srv_msg.response_check_option_content('Response',
'23',
None,
'addr',
'2001:db8::1,2001:db8::2')
srv_msg.response_check_include_option('Response', None, '42')
srv_msg.response_check_option_content('Response', '42', None, 'optdata', 'Europe/Zurich')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:21')
srv_msg.client_requests_option('7')
srv_msg.client_requests_option('21')
srv_msg.client_requests_option('42')
srv_msg.client_requests_option('23')
srv_msg.client_requests_option('59')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '123')
srv_msg.response_check_include_option('Response', None, '59')
srv_msg.response_check_option_content('Response',
'59',
None,
'optdata',
'http://www.kea.isc.org')
srv_msg.response_check_include_option('Response', None, '21')
srv_msg.response_check_option_content('Response',
'21',
None,
'addr',
'srv4.example.com.,srv5.isc.org.')
srv_msg.response_check_include_option('Response', None, '23')
srv_msg.response_check_option_content('Response',
'23',
None,
'addr',
'2001:db8::4,2001:db8::5')
srv_msg.response_check_include_option('Response', None, '42')
srv_msg.response_check_option_content('Response', '42', None, 'optdata', 'Europe/Zurich')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.reserved_options
def test_v6_host_reservation_pgsql_hwaddrr_matching_option():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('PostgreSQL')
srv_control.new_db_backend_reservation('PostgreSQL', 'hw-address', 'f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'PostgreSQL', '1')
srv_control.ipv6_address_db_backend_reservation('3000::100', '$(EMPTY)', 'PostgreSQL', '1')
srv_control.option_db_record_reservation('7',
'10',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'PostgreSQL',
'1')
srv_control.upload_db_reservation('PostgreSQL')
srv_control.config_srv_opt('preference', '12')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:01:00:01:52:7b:a8:f0:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
# Response sub-option 5 from option 3 MUST contain address 3000::100.
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '10')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:22')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
# Response sub-option 5 from option 3 MUST contain address 3000::100.
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '12')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.reserved_options
def test_v6_host_reservation_pgsql_hwaddrr_matching_option_no_address():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('PostgreSQL')
srv_control.new_db_backend_reservation('PostgreSQL', 'hw-address', 'f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'PostgreSQL', '1')
srv_control.option_db_record_reservation('7',
'10',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'PostgreSQL',
'1')
srv_control.upload_db_reservation('PostgreSQL')
srv_control.config_srv_opt('preference', '12')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:01:00:01:52:7b:a8:f0:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', 'NOT ', '3')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '10')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:22')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', 'NOT ', '3')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '12')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.reserved_options
def test_v6_host_reservation_pgsql_hwaddrr_matching_option_inforequest():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('PostgreSQL')
srv_control.new_db_backend_reservation('PostgreSQL', 'hw-address', 'f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'PostgreSQL', '1')
srv_control.option_db_record_reservation('7',
'10',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'PostgreSQL',
'1')
srv_control.upload_db_reservation('PostgreSQL')
srv_control.config_srv_opt('preference', '12')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:01:00:01:52:7b:a8:f0:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_send_msg('INFOREQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'REPLY')
srv_msg.response_check_include_option('Response', 'NOT ', '3')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '12')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:01:00:01:52:7b:a8:f0:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', 'NOT ', '3')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '10')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.reserved_options
def test_v6_host_reservation_pgsql_option_multiple():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('PostgreSQL')
srv_control.new_db_backend_reservation('PostgreSQL', 'duid', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'PostgreSQL', '1')
srv_control.ipv6_address_db_backend_reservation('3000::100', '$(EMPTY)', 'PostgreSQL', '1')
srv_control.option_db_record_reservation('7',
'10',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'PostgreSQL',
'1')
srv_control.option_db_record_reservation('21',
'srv1.example.com,srv2.isc.org',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'PostgreSQL',
'1')
srv_control.option_db_record_reservation('23',
'2001:db8::1,2001:db8::2',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'PostgreSQL',
'1')
srv_control.option_db_record_reservation('59',
'http://www.kea-reserved.isc.org',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'PostgreSQL',
'1')
# Add option reservation code 60 value 10 space dhcp6 persistent 1 client class $(EMPTY) subnet id 1 and scope subnet to PostgreSQL record id 1.
srv_control.upload_db_reservation('PostgreSQL')
srv_control.config_srv('preference', '0', '123')
srv_control.config_srv_opt('sip-server-dns', 'srv4.example.com,srv5.isc.org')
# 21
srv_control.config_srv_opt('dns-servers', '2001:db8::4,2001:db8::5')
# 23
srv_control.config_srv_opt('bootfile-url', 'http://www.kea.isc.org')
# 59
srv_control.config_srv_opt('new-tzdb-timezone', 'Europe/Zurich')
# 60 and not reserved
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_requests_option('7')
srv_msg.client_requests_option('21')
srv_msg.client_requests_option('23')
srv_msg.client_requests_option('42')
srv_msg.client_requests_option('59')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::100')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '10')
srv_msg.response_check_include_option('Response', None, '59')
srv_msg.response_check_option_content('Response',
'59',
None,
'optdata',
'http://www.kea-reserved.isc.org')
srv_msg.response_check_include_option('Response', None, '21')
srv_msg.response_check_option_content('Response',
'21',
None,
'addr',
'srv1.example.com.,srv2.isc.org.')
srv_msg.response_check_include_option('Response', None, '23')
srv_msg.response_check_option_content('Response',
'23',
None,
'addr',
'2001:db8::1,2001:db8::2')
srv_msg.response_check_include_option('Response', None, '42')
srv_msg.response_check_option_content('Response', '42', None, 'optdata', 'Europe/Zurich')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:21')
srv_msg.client_requests_option('7')
srv_msg.client_requests_option('21')
srv_msg.client_requests_option('23')
srv_msg.client_requests_option('59')
srv_msg.client_requests_option('42')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_include_option('Response', None, '7')
srv_msg.response_check_option_content('Response', '7', None, 'value', '123')
srv_msg.response_check_include_option('Response', None, '59')
srv_msg.response_check_option_content('Response',
'59',
None,
'optdata',
'http://www.kea.isc.org')
srv_msg.response_check_include_option('Response', None, '21')
srv_msg.response_check_option_content('Response',
'21',
None,
'addr',
'srv4.example.com.,srv5.isc.org.')
srv_msg.response_check_include_option('Response', None, '23')
srv_msg.response_check_option_content('Response',
'23',
None,
'addr',
'2001:db8::4,2001:db8::5')
srv_msg.response_check_include_option('Response', None, '42')
srv_msg.response_check_option_content('Response', '42', None, 'optdata', 'Europe/Zurich')
| 48.865994 | 143 | 0.563442 | 3,913 | 33,913 | 4.534373 | 0.040378 | 0.080482 | 0.086795 | 0.117793 | 0.987995 | 0.987995 | 0.987544 | 0.987544 | 0.987544 | 0.985797 | 0 | 0.053599 | 0.305163 | 33,913 | 693 | 144 | 48.936508 | 0.699372 | 0.016601 | 0 | 0.975124 | 0 | 0.006634 | 0.177427 | 0.037862 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014925 | true | 0.033168 | 0.006634 | 0 | 0.021559 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
6694a6b0ef3fcacd5d7341f70251fe1e9f0331b3 | 34,514 | py | Python | src/azure/servicebus/servicebusservice.py | milinda/azure-sdk-for-python | 7cfe2b4c1828ecd843e6bc0342914fbdbd1c3e8c | [
"Apache-2.0"
] | 2 | 2015-01-28T10:06:11.000Z | 2015-01-28T10:06:21.000Z | src/azure/servicebus/servicebusservice.py | milinda/azure-sdk-for-python | 7cfe2b4c1828ecd843e6bc0342914fbdbd1c3e8c | [
"Apache-2.0"
] | null | null | null | src/azure/servicebus/servicebusservice.py | milinda/azure-sdk-for-python | 7cfe2b4c1828ecd843e6bc0342914fbdbd1c3e8c | [
"Apache-2.0"
] | null | null | null | #-------------------------------------------------------------------------
# Copyright 2011 Microsoft Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#--------------------------------------------------------------------------
import base64
import os
import urllib2
from azure.http.httpclient import _HTTPClient
from azure.http import HTTPError
from azure.servicebus import (_update_service_bus_header, _create_message,
convert_topic_to_xml, _convert_response_to_topic,
convert_queue_to_xml, _convert_response_to_queue,
convert_subscription_to_xml, _convert_response_to_subscription,
convert_rule_to_xml, _convert_response_to_rule,
_convert_xml_to_queue, _convert_xml_to_topic,
_convert_xml_to_subscription, _convert_xml_to_rule,
_service_bus_error_handler, AZURE_SERVICEBUS_NAMESPACE,
AZURE_SERVICEBUS_ACCESS_KEY, AZURE_SERVICEBUS_ISSUER)
from azure.http import HTTPRequest
from azure import (_validate_not_none, Feed,
_convert_response_to_feeds, _str_or_none, _int_or_none,
_get_request_body, _update_request_uri_query,
_dont_fail_on_exist, _dont_fail_not_exist,
WindowsAzureError, _parse_response, _convert_class_to_xml,
_parse_response_for_dict, _parse_response_for_dict_prefix,
_parse_response_for_dict_filter,
_parse_enum_results_list, _update_request_uri_query_local_storage,
_get_table_host, _get_queue_host, _get_blob_host,
_parse_simple_list, SERVICE_BUS_HOST_BASE, xml_escape)
class ServiceBusService:
def create_queue(self, queue_name, queue=None, fail_on_exist=False):
'''
Creates a new queue. Once created, this queue's resource manifest is immutable.
queue: queue object to create.
queue_name: the name of the queue.
fail_on_exist: specify whether to throw an exception when the queue exists.
'''
_validate_not_none('queue_name', queue_name)
request = HTTPRequest()
request.method = 'PUT'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(queue_name) + ''
request.body = _get_request_body(convert_queue_to_xml(queue))
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
if not fail_on_exist:
try:
self._perform_request(request)
return True
except WindowsAzureError as e:
_dont_fail_on_exist(e)
return False
else:
self._perform_request(request)
return True
def delete_queue(self, queue_name, fail_not_exist=False):
'''
Deletes an existing queue. This operation will also remove all associated state
including messages in the queue.
fail_not_exist: specify whether to throw an exception if the queue doesn't exist.
'''
_validate_not_none('queue_name', queue_name)
request = HTTPRequest()
request.method = 'DELETE'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(queue_name) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
if not fail_not_exist:
try:
self._perform_request(request)
return True
except WindowsAzureError as e:
_dont_fail_not_exist(e)
return False
else:
self._perform_request(request)
return True
def get_queue(self, queue_name):
'''
Retrieves an existing queue.
queue_name: name of the queue.
'''
_validate_not_none('queue_name', queue_name)
request = HTTPRequest()
request.method = 'GET'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(queue_name) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _convert_response_to_queue(response)
def list_queues(self):
'''
Enumerates the queues in the service namespace.
'''
request = HTTPRequest()
request.method = 'GET'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/$Resources/Queues'
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _convert_response_to_feeds(response, _convert_xml_to_queue)
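    # Illustrative usage sketch for the queue operations above (assumes an instance
    # whose service_namespace, account_key and issuer attributes are already set,
    # e.g. from the AZURE_SERVICEBUS_* settings imported at the top of this module):
    #     sbs = ServiceBusService()
    #     sbs.create_queue('taskqueue')
    #     queue = sbs.get_queue('taskqueue')
    #     queues = sbs.list_queues()
    #     sbs.delete_queue('taskqueue')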
def create_topic(self, topic_name, topic=None, fail_on_exist=False):
'''
Creates a new topic. Once created, this topic's resource manifest is immutable.
topic_name: name of the topic.
topic: the Topic object to create.
fail_on_exist: specify whether to throw an exception when the topic exists.
'''
_validate_not_none('topic_name', topic_name)
request = HTTPRequest()
request.method = 'PUT'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + ''
request.body = _get_request_body(convert_topic_to_xml(topic))
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
if not fail_on_exist:
try:
self._perform_request(request)
return True
except WindowsAzureError as e:
_dont_fail_on_exist(e)
return False
else:
self._perform_request(request)
return True
def delete_topic(self, topic_name, fail_not_exist=False):
'''
Deletes an existing topic. This operation will also remove all associated state
including associated subscriptions.
topic_name: name of the topic.
fail_not_exist: specify whether to throw an exception when the topic doesn't exist.
'''
_validate_not_none('topic_name', topic_name)
request = HTTPRequest()
request.method = 'DELETE'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
if not fail_not_exist:
try:
self._perform_request(request)
return True
except WindowsAzureError as e:
_dont_fail_not_exist(e)
return False
else:
self._perform_request(request)
return True
def get_topic(self, topic_name):
'''
Retrieves the description for the specified topic.
topic_name: name of the topic.
'''
_validate_not_none('topic_name', topic_name)
request = HTTPRequest()
request.method = 'GET'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _convert_response_to_topic(response)
def list_topics(self):
'''
Retrieves the topics in the service namespace.
'''
request = HTTPRequest()
request.method = 'GET'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/$Resources/Topics'
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _convert_response_to_feeds(response, _convert_xml_to_topic)
def create_rule(self, topic_name, subscription_name, rule_name, rule=None, fail_on_exist=False):
'''
Creates a new rule. Once created, this rule's resource manifest is immutable.
topic_name: the name of the topic
subscription_name: the name of the subscription
rule_name: name of the rule.
fail_on_exist: specify whether to throw an exception when the rule exists.
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
_validate_not_none('rule_name', rule_name)
request = HTTPRequest()
request.method = 'PUT'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + '/rules/' + str(rule_name) + ''
request.body = _get_request_body(convert_rule_to_xml(rule))
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
if not fail_on_exist:
try:
self._perform_request(request)
return True
except WindowsAzureError as e:
_dont_fail_on_exist(e)
return False
else:
self._perform_request(request)
return True
def delete_rule(self, topic_name, subscription_name, rule_name, fail_not_exist=False):
'''
Deletes an existing rule.
topic_name: the name of the topic
subscription_name: the name of the subscription
rule_name: the name of the rule. DEFAULT_RULE_NAME=$Default. Use DEFAULT_RULE_NAME
to delete the default rule for the subscription.
fail_not_exist: specify whether to throw an exception when the rule doesn't exist.
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
_validate_not_none('rule_name', rule_name)
request = HTTPRequest()
request.method = 'DELETE'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + '/rules/' + str(rule_name) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
if not fail_not_exist:
try:
self._perform_request(request)
return True
except WindowsAzureError as e:
_dont_fail_not_exist(e)
return False
else:
self._perform_request(request)
return True
def get_rule(self, topic_name, subscription_name, rule_name):
'''
Retrieves the description for the specified rule.
topic_name: the name of the topic
subscription_name: the name of the subscription
rule_name: name of the rule
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
_validate_not_none('rule_name', rule_name)
request = HTTPRequest()
request.method = 'GET'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + '/rules/' + str(rule_name) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _convert_response_to_rule(response)
def list_rules(self, topic_name, subscription_name):
'''
Retrieves the rules that exist under the specified subscription.
topic_name: the name of the topic
subscription_name: the name of the subscription
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
request = HTTPRequest()
request.method = 'GET'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + '/rules/'
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _convert_response_to_feeds(response, _convert_xml_to_rule)
def create_subscription(self, topic_name, subscription_name, subscription=None, fail_on_exist=False):
'''
Creates a new subscription. Once created, this subscription's resource manifest is
immutable.
topic_name: the name of the topic
subscription_name: the name of the subscription
fail_on_exist: specify whether to throw an exception when the subscription exists.
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
request = HTTPRequest()
request.method = 'PUT'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + ''
request.body = _get_request_body(convert_subscription_to_xml(subscription))
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
if not fail_on_exist:
try:
self._perform_request(request)
return True
except WindowsAzureError as e:
_dont_fail_on_exist(e)
return False
else:
self._perform_request(request)
return True
def delete_subscription(self, topic_name, subscription_name, fail_not_exist=False):
'''
Deletes an existing subscription.
topic_name: the name of the topic
subscription_name: the name of the subscription
fail_not_exist: specify whether to throw an exception when the subscription doesn't exist.
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
request = HTTPRequest()
request.method = 'DELETE'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
if not fail_not_exist:
try:
self._perform_request(request)
return True
except WindowsAzureError as e:
_dont_fail_not_exist(e)
return False
else:
self._perform_request(request)
return True
def get_subscription(self, topic_name, subscription_name):
'''
Gets an existing subscription.
topic_name: the name of the topic
subscription_name: the name of the subscription
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
request = HTTPRequest()
request.method = 'GET'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _convert_response_to_subscription(response)
def list_subscriptions(self, topic_name):
'''
Retrieves the subscriptions in the specified topic.
topic_name: the name of the topic
'''
_validate_not_none('topic_name', topic_name)
request = HTTPRequest()
request.method = 'GET'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/'
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _convert_response_to_feeds(response, _convert_xml_to_subscription)
def send_topic_message(self, topic_name, message=None):
'''
Enqueues a message into the specified topic. The number of messages that may be
present in the topic is limited by the topic's size quota, MaxTopicSizeInBytes.
If this message causes the topic to exceed its quota, a quota-exceeded error is
returned and the message will be rejected.
topic_name: name of the topic.
message: the Message object containing message body and properties.
'''
_validate_not_none('topic_name', topic_name)
request = HTTPRequest()
request.method = 'POST'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/messages'
request.headers = message.add_headers(request)
request.body = _get_request_body(message.body)
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
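# Illustrative usage sketch (not part of this module): sending a message to a
# topic with this service. It assumes a Message class importable from this
# package, wrapping the message body as referenced in the docstring above;
# namespace, key and issuer values are placeholders.
#
#   from azure.servicebus import Message
#   sbs = ServiceBusService('mynamespace', 'mykey', 'owner')
#   sbs.send_topic_message('mytopic', Message(b'hello subscribers'))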
def peek_lock_subscription_message(self, topic_name, subscription_name, timeout='60'):
'''
This operation is used to atomically retrieve and lock a message for processing.
The message is guaranteed not to be delivered to other receivers (on the same
subscription only) during the lock duration period specified in the subscription
description. Once the lock expires, the message will be available to other
receivers. In order to complete processing of the message, the receiver should
issue a delete command with the lock ID received from this operation. To abandon
processing of the message and unlock it for other receivers, an Unlock Message
command should be issued, or the lock duration period can expire.
topic_name: the name of the topic
subscription_name: the name of the subscription
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
request = HTTPRequest()
request.method = 'POST'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + '/messages/head'
request.query = [('timeout', _int_or_none(timeout))]
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _create_message(response, self)
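# Illustrative peek-lock flow (not part of this module). It assumes the returned
# Message exposes the parsed BrokerProperties dict as `broker_properties` with
# the 'SequenceNumber' and 'LockToken' keys described in the docstring above;
# process() stands in for application-specific handling.
#
#   msg = sbs.peek_lock_subscription_message('mytopic', 'mysub', timeout=30)
#   try:
#       process(msg.body)
#       sbs.delete_subscription_message(    # complete: remove from subscription
#           'mytopic', 'mysub',
#           msg.broker_properties['SequenceNumber'],
#           msg.broker_properties['LockToken'])
#   except Exception:
#       sbs.unlock_subscription_message(    # abandon: make visible again
#           'mytopic', 'mysub',
#           msg.broker_properties['SequenceNumber'],
#           msg.broker_properties['LockToken'])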
def unlock_subscription_message(self, topic_name, subscription_name, sequence_number, lock_token):
'''
Unlock a message for processing by other receivers on a given subscription.
This operation deletes the lock object, causing the message to be unlocked.
A message must have first been locked by a receiver before this operation
is called.
topic_name: the name of the topic
subscription_name: the name of the subscription
sequence_number: The sequence number of the message to be unlocked as returned
in BrokerProperties['SequenceNumber'] by the Peek Message operation.
lock_token: The ID of the lock as returned by the Peek Message operation in
BrokerProperties['LockToken']
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
_validate_not_none('sequence_number', sequence_number)
_validate_not_none('lock_token', lock_token)
request = HTTPRequest()
request.method = 'PUT'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + '/messages/' + str(sequence_number) + '/' + str(lock_token) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
def read_delete_subscription_message(self, topic_name, subscription_name, timeout='60'):
'''
Read and delete a message from a subscription as an atomic operation. This
operation should be used when a best-effort guarantee is sufficient for an
application; that is, using this operation it is possible for messages to
be lost if processing fails.
topic_name: the name of the topic
subscription_name: the name of the subscription
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
request = HTTPRequest()
request.method = 'DELETE'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + '/messages/head'
request.query = [('timeout', _int_or_none(timeout))]
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _create_message(response, self)
def delete_subscription_message(self, topic_name, subscription_name, sequence_number, lock_token):
'''
Completes processing on a locked message and deletes it from the subscription.
This operation should only be called after processing of a previously locked
message has succeeded, in order to maintain At-Least-Once delivery assurances.
topic_name: the name of the topic
subscription_name: the name of the subscription
sequence_number: The sequence number of the message to be deleted as returned
in BrokerProperties['SequenceNumber'] by the Peek Message operation.
lock_token: The ID of the lock as returned by the Peek Message operation in
BrokerProperties['LockToken']
'''
_validate_not_none('topic_name', topic_name)
_validate_not_none('subscription_name', subscription_name)
_validate_not_none('sequence_number', sequence_number)
_validate_not_none('lock_token', lock_token)
request = HTTPRequest()
request.method = 'DELETE'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(topic_name) + '/subscriptions/' + str(subscription_name) + '/messages/' + str(sequence_number) + '/' + str(lock_token) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
def send_queue_message(self, queue_name, message=None):
'''
Sends a message into the specified queue. The number of messages that may be
present in the queue is limited by the queue's size quota (MaxSizeInMegabytes).
If this message would cause the queue to exceed its quota, a quota-exceeded
error is returned and the message will be rejected.
queue_name: name of the queue
message: the Message object containing message body and properties.
'''
_validate_not_none('queue_name', queue_name)
request = HTTPRequest()
request.method = 'POST'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(queue_name) + '/messages'
request.headers = message.add_headers(request)
request.body = _get_request_body(message.body)
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
def peek_lock_queue_message(self, queue_name, timeout='60'):
'''
Atomically retrieves and locks a message from a queue for processing. The
message is guaranteed not to be delivered to other receivers on the same
queue during the lock duration period specified in the queue
description. Once the lock expires, the message will be available to other
receivers. In order to complete processing of the message, the receiver
should issue a delete command with the lock ID received from this operation.
To abandon processing of the message and unlock it for other receivers,
an Unlock Message command should be issued, or the lock duration period
can expire.
queue_name: name of the queue
'''
_validate_not_none('queue_name', queue_name)
request = HTTPRequest()
request.method = 'POST'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(queue_name) + '/messages/head'
request.query = [('timeout', _int_or_none(timeout))]
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _create_message(response, self)
def unlock_queue_message(self, queue_name, sequence_number, lock_token):
'''
Unlocks a message for processing by other receivers on a given queue.
This operation deletes the lock object, causing the message to be unlocked.
A message must have first been locked by a receiver before this operation is
called.
queue_name: name of the queue
sequence_number: The sequence number of the message to be unlocked as returned
in BrokerProperties['SequenceNumber'] by the Peek Message operation.
lock_token: The ID of the lock as returned by the Peek Message operation in
BrokerProperties['LockToken']
'''
_validate_not_none('queue_name', queue_name)
_validate_not_none('sequence_number', sequence_number)
_validate_not_none('lock_token', lock_token)
request = HTTPRequest()
request.method = 'PUT'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(queue_name) + '/messages/' + str(sequence_number) + '/' + str(lock_token) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
def read_delete_queue_message(self, queue_name, timeout='60'):
'''
Reads and deletes a message from a queue as an atomic operation. This operation
should be used when a best-effort guarantee is sufficient for an application;
that is, using this operation it is possible for messages to be lost if
processing fails.
queue_name: name of the queue
'''
_validate_not_none('queue_name', queue_name)
request = HTTPRequest()
request.method = 'DELETE'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(queue_name) + '/messages/head'
request.query = [('timeout', _int_or_none(timeout))]
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
return _create_message(response, self)
def delete_queue_message(self, queue_name, sequence_number, lock_token):
'''
Completes processing on a locked message and deletes it from the queue. This
operation should only be called after processing of a previously locked message
has succeeded, in order to maintain At-Least-Once delivery assurances.
queue_name: name of the queue
sequence_number: The sequence number of the message to be deleted as returned
in BrokerProperties['SequenceNumber'] by the Peek Message operation.
lock_token: The ID of the lock as returned by the Peek Message operation in
BrokerProperties['LockToken']
'''
_validate_not_none('queue_name', queue_name)
_validate_not_none('sequence_number', sequence_number)
_validate_not_none('lock_token', lock_token)
request = HTTPRequest()
request.method = 'DELETE'
request.host = self.service_namespace + SERVICE_BUS_HOST_BASE
request.path = '/' + str(queue_name) + '/messages/' + str(sequence_number) + '/' + str(lock_token) + ''
request.path, request.query = _update_request_uri_query(request)
request.headers = _update_service_bus_header(request, self.account_key, self.issuer)
response = self._perform_request(request)
def receive_queue_message(self, queue_name, peek_lock=True, timeout=60):
if peek_lock:
return self.peek_lock_queue_message(queue_name, timeout)
else:
return self.read_delete_queue_message(queue_name, timeout)
def receive_subscription_message(self, topic_name, subscription_name, peek_lock=True, timeout=60):
if peek_lock:
return self.peek_lock_subscription_message(topic_name, subscription_name, timeout)
else:
return self.read_delete_subscription_message(topic_name, subscription_name, timeout)
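# Illustrative sketch (not part of this module): the two convenience wrappers
# above simply select between the destructive and peek-lock receive modes.
# 'myqueue' is a placeholder name.
#
#   msg = sbs.receive_queue_message('myqueue', peek_lock=False)  # read-and-delete
#   msg = sbs.receive_queue_message('myqueue', peek_lock=True)   # must later be
#                                                                 # completed or unlocked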
def __init__(self, service_namespace=None, account_key=None, issuer=None, x_ms_version='2011-06-01'):
self.requestid = None
self.service_namespace = service_namespace
self.account_key = account_key
self.issuer = issuer
# Get the service namespace, account key and issuer. If they were passed to the
# constructor, use them; otherwise read them from environment variables.
if not service_namespace:
if AZURE_SERVICEBUS_NAMESPACE in os.environ:
self.service_namespace = os.environ[AZURE_SERVICEBUS_NAMESPACE]
if not account_key:
if AZURE_SERVICEBUS_ACCESS_KEY in os.environ:
self.account_key = os.environ[AZURE_SERVICEBUS_ACCESS_KEY]
if not issuer:
if AZURE_SERVICEBUS_ISSUER in os.environ:
self.issuer = os.environ[AZURE_SERVICEBUS_ISSUER]
if not self.service_namespace or not self.account_key or not self.issuer:
raise WindowsAzureError('You need to provide a Service Bus namespace, access key and issuer')
self.x_ms_version = x_ms_version
self._httpclient = _HTTPClient(service_instance=self, service_namespace=service_namespace, account_key=account_key, issuer=issuer, x_ms_version=self.x_ms_version)
self._filter = self._httpclient.perform_request
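# Illustrative construction sketch (not part of this module): credentials may be
# passed explicitly or picked up from the environment variables named by the
# module constants used in __init__ above. The values below are placeholders.
#
#   import os
#   os.environ[AZURE_SERVICEBUS_NAMESPACE] = 'mynamespace'
#   os.environ[AZURE_SERVICEBUS_ACCESS_KEY] = 'mykey'
#   os.environ[AZURE_SERVICEBUS_ISSUER] = 'owner'
#   sbs = ServiceBusService()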
def with_filter(self, filter):
'''Returns a new service which will process requests with the
specified filter. Filtering operations can include logging, automatic
retrying, etc... The filter is a lambda which receives the HTTPRequest
and another lambda. The filter can perform any pre-processing on the
request, pass it off to the next lambda, and then perform any post-processing
on the response.'''
res = ServiceBusService(self.service_namespace, self.account_key,
self.issuer, self.x_ms_version)
old_filter = self._filter
def new_filter(request):
return filter(request, old_filter)
res._filter = new_filter
return res
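# Illustrative filter sketch (not part of this module): a filter receives the
# HTTPRequest and the next callable in the chain, and may pre- and post-process.
# It assumes the response object exposes a numeric `status` attribute.
#
#   def logging_filter(request, next_filter):
#       print('-> %s %s' % (request.method, request.path))
#       response = next_filter(request)
#       print('<- %d' % response.status)
#       return response
#
#   logged_sbs = sbs.with_filter(logging_filter)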
def _perform_request(self, request):
try:
resp = self._filter(request)
except HTTPError as e:
return _service_bus_error_handler(e)
if not resp:
return None
return resp
| 48.886686 | 170 | 0.667005 | 4,107 | 34,514 | 5.305089 | 0.078403 | 0.034285 | 0.032357 | 0.039012 | 0.825041 | 0.799018 | 0.785478 | 0.757298 | 0.735313 | 0.71778 | 0 | 0.001211 | 0.258156 | 34,514 | 705 | 171 | 48.956028 | 0.84975 | 0.26285 | 0 | 0.738717 | 0 | 0 | 0.048422 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.07601 | false | 0 | 0.019002 | 0.002375 | 0.204276 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
66e595cbd66e5daecb8f9a0aa0fee7d613e6ad1a | 14,157 | py | Python | xu/src/res/resources.py | sonnts996/XuCompa-Request | f343e7bfd1b4263eb76438c96d347c549cc75ce3 | [
"Apache-2.0"
] | null | null | null | xu/src/res/resources.py | sonnts996/XuCompa-Request | f343e7bfd1b4263eb76438c96d347c549cc75ce3 | [
"Apache-2.0"
] | null | null | null | xu/src/res/resources.py | sonnts996/XuCompa-Request | f343e7bfd1b4263eb76438c96d347c549cc75ce3 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Resource object code
#
# Created by: The Resource Compiler for PyQt5 (Qt v5.15.1)
#
# WARNING! All changes made in this file will be lost!
from PyQt5 import QtCore
qt_resource_data = b"\
\x00\x00\x00\x91\
\x51\
\x4c\x61\x62\x65\x6c\x7b\x0d\x0a\x20\x20\x20\x20\x62\x61\x63\x6b\
\x67\x72\x6f\x75\x6e\x64\x3a\x20\x67\x72\x61\x79\x3b\x0d\x0a\x20\
\x20\x20\x20\x62\x6f\x72\x64\x65\x72\x3a\x20\x6e\x6f\x6e\x65\x3b\
\x0d\x0a\x20\x20\x20\x20\x62\x6f\x72\x64\x65\x72\x2d\x72\x61\x64\
\x69\x75\x73\x3a\x20\x35\x70\x78\x3b\x0d\x0a\x20\x20\x20\x20\x63\
\x6f\x6c\x6f\x72\x3a\x20\x23\x45\x35\x37\x33\x37\x33\x3b\x0d\x0a\
\x20\x20\x20\x20\x66\x6f\x6e\x74\x2d\x73\x69\x7a\x65\x3a\x20\x31\
\x33\x70\x78\x3b\x0d\x0a\x20\x20\x20\x20\x66\x6f\x6e\x74\x2d\x73\
\x74\x79\x6c\x65\x3a\x20\x69\x74\x61\x6c\x69\x63\x3b\x0d\x0a\x7d\
\
\x00\x00\x08\xb1\
\x3c\
\x3f\x78\x6d\x6c\x20\x76\x65\x72\x73\x69\x6f\x6e\x3d\x22\x31\x2e\
\x30\x22\x3f\x3e\x0d\x0a\x3c\x73\x76\x67\x20\x78\x6d\x6c\x6e\x73\
\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\
\x6f\x72\x67\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x20\x76\x65\
\x72\x73\x69\x6f\x6e\x3d\x22\x31\x2e\x31\x22\x20\x77\x69\x64\x74\
\x68\x3d\x22\x35\x31\x32\x22\x20\x68\x65\x69\x67\x68\x74\x3d\x22\
\x35\x31\x32\x22\x20\x78\x3d\x22\x30\x22\x20\x79\x3d\x22\x30\x22\
\x20\x76\x69\x65\x77\x42\x6f\x78\x3d\x22\x30\x20\x30\x20\x35\x30\
\x33\x2e\x31\x31\x38\x20\x35\x30\x33\x2e\x31\x31\x38\x22\x0d\x0a\
\x20\x20\x20\x20\x20\x73\x74\x79\x6c\x65\x3d\x22\x65\x6e\x61\x62\
\x6c\x65\x2d\x62\x61\x63\x6b\x67\x72\x6f\x75\x6e\x64\x3a\x6e\x65\
\x77\x20\x30\x20\x30\x20\x35\x31\x32\x20\x35\x31\x32\x22\x20\x78\
\x6d\x6c\x3a\x73\x70\x61\x63\x65\x3d\x22\x70\x72\x65\x73\x65\x72\
\x76\x65\x22\x20\x63\x6c\x61\x73\x73\x3d\x22\x22\x3e\x3c\x67\x3e\
\x0d\x0a\x3c\x70\x61\x74\x68\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\
\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\
\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x20\x73\x74\x79\x6c\x65\
\x3d\x22\x22\x0d\x0a\x20\x20\x20\x20\x20\x20\x64\x3d\x22\x4d\x33\
\x33\x35\x2e\x31\x35\x31\x2c\x31\x36\x37\x2e\x39\x36\x37\x63\x31\
\x30\x2e\x34\x34\x39\x2c\x31\x30\x2e\x34\x34\x39\x2c\x31\x38\x2e\
\x38\x30\x38\x2c\x32\x32\x2e\x39\x38\x38\x2c\x32\x35\x2e\x30\x37\
\x38\x2c\x33\x35\x2e\x35\x32\x37\x20\x20\x63\x32\x32\x2e\x39\x38\
\x38\x2c\x34\x38\x2e\x30\x36\x35\x2c\x31\x35\x2e\x36\x37\x33\x2c\
\x31\x30\x38\x2e\x36\x36\x39\x2d\x32\x35\x2e\x30\x37\x38\x2c\x31\
\x34\x38\x2e\x33\x37\x35\x4c\x32\x32\x33\x2e\x33\x34\x37\x2c\x34\
\x36\x34\x2e\x37\x31\x38\x63\x2d\x35\x31\x2e\x32\x2c\x35\x31\x2e\
\x32\x2d\x31\x33\x33\x2e\x37\x34\x37\x2c\x35\x31\x2e\x32\x2d\x31\
\x38\x33\x2e\x39\x30\x32\x2c\x30\x20\x20\x63\x2d\x35\x31\x2e\x32\
\x2d\x35\x31\x2e\x32\x2d\x35\x31\x2e\x32\x2d\x31\x33\x33\x2e\x37\
\x34\x37\x2c\x30\x2d\x31\x38\x33\x2e\x39\x30\x32\x6c\x37\x39\x2e\
\x34\x31\x32\x2d\x37\x39\x2e\x34\x31\x32\x63\x2d\x39\x2e\x34\x30\
\x34\x2c\x33\x31\x2e\x33\x34\x37\x2d\x38\x2e\x33\x35\x39\x2c\x36\
\x34\x2e\x37\x38\x34\x2c\x33\x2e\x31\x33\x35\x2c\x39\x35\x2e\x30\
\x38\x36\x6c\x2d\x33\x33\x2e\x34\x33\x37\x2c\x33\x33\x2e\x34\x33\
\x37\x20\x20\x63\x2d\x32\x32\x2e\x39\x38\x38\x2c\x32\x32\x2e\x39\
\x38\x38\x2d\x32\x32\x2e\x39\x38\x38\x2c\x36\x31\x2e\x36\x34\x39\
\x2c\x30\x2c\x38\x35\x2e\x36\x38\x32\x63\x32\x34\x2e\x30\x33\x33\
\x2c\x32\x34\x2e\x30\x33\x33\x2c\x36\x31\x2e\x36\x34\x39\x2c\x32\
\x34\x2e\x30\x33\x33\x2c\x38\x35\x2e\x36\x38\x32\x2c\x30\x6c\x31\
\x31\x31\x2e\x38\x30\x34\x2d\x31\x31\x31\x2e\x38\x30\x34\x20\x20\
\x63\x31\x31\x2e\x34\x39\x34\x2d\x31\x31\x2e\x34\x39\x34\x2c\x31\
\x37\x2e\x37\x36\x33\x2d\x32\x37\x2e\x31\x36\x37\x2c\x31\x37\x2e\
\x37\x36\x33\x2d\x34\x32\x2e\x38\x34\x31\x73\x2d\x36\x2e\x32\x36\
\x39\x2d\x33\x31\x2e\x33\x34\x37\x2d\x31\x37\x2e\x37\x36\x33\x2d\
\x34\x32\x2e\x38\x34\x31\x63\x2d\x31\x31\x2e\x34\x39\x34\x2d\x31\
\x31\x2e\x34\x39\x34\x2d\x32\x37\x2e\x31\x36\x37\x2d\x31\x37\x2e\
\x37\x36\x33\x2d\x34\x32\x2e\x38\x34\x31\x2d\x31\x37\x2e\x37\x36\
\x33\x6c\x35\x36\x2e\x34\x32\x34\x2d\x35\x36\x2e\x34\x32\x34\x20\
\x20\x43\x33\x31\x32\x2e\x31\x36\x33\x2c\x31\x34\x39\x2e\x31\x35\
\x39\x2c\x33\x32\x33\x2e\x36\x35\x37\x2c\x31\x35\x37\x2e\x35\x31\
\x38\x2c\x33\x33\x35\x2e\x31\x35\x31\x2c\x31\x36\x37\x2e\x39\x36\
\x37\x7a\x22\x0d\x0a\x20\x20\x20\x20\x20\x20\x66\x69\x6c\x6c\x3d\
\x22\x23\x35\x63\x62\x61\x66\x66\x22\x20\x64\x61\x74\x61\x2d\x6f\
\x72\x69\x67\x69\x6e\x61\x6c\x3d\x22\x23\x66\x66\x64\x31\x35\x63\
\x22\x20\x63\x6c\x61\x73\x73\x3d\x22\x22\x2f\x3e\x0d\x0a\x3c\x70\
\x61\x74\x68\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\x74\x74\x70\x3a\
\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x32\x30\x30\
\x30\x2f\x73\x76\x67\x22\x20\x73\x74\x79\x6c\x65\x3d\x22\x22\x0d\
\x0a\x20\x20\x20\x20\x20\x20\x64\x3d\x22\x4d\x31\x36\x37\x2e\x39\
\x36\x37\x2c\x33\x33\x35\x2e\x31\x35\x31\x63\x2d\x31\x30\x2e\x34\
\x34\x39\x2d\x31\x30\x2e\x34\x34\x39\x2d\x31\x38\x2e\x38\x30\x38\
\x2d\x32\x32\x2e\x39\x38\x38\x2d\x32\x35\x2e\x30\x37\x38\x2d\x33\
\x35\x2e\x35\x32\x37\x20\x20\x63\x2d\x32\x32\x2e\x39\x38\x38\x2d\
\x34\x38\x2e\x30\x36\x35\x2d\x31\x35\x2e\x36\x37\x33\x2d\x31\x30\
\x38\x2e\x36\x36\x39\x2c\x32\x35\x2e\x30\x37\x38\x2d\x31\x34\x38\
\x2e\x33\x37\x36\x4c\x32\x37\x39\x2e\x37\x37\x31\x2c\x33\x38\x2e\
\x34\x63\x35\x31\x2e\x32\x2d\x35\x31\x2e\x32\x2c\x31\x33\x33\x2e\
\x37\x34\x37\x2d\x35\x31\x2e\x32\x2c\x31\x38\x33\x2e\x39\x30\x32\
\x2c\x30\x63\x35\x31\x2e\x32\x2c\x35\x31\x2e\x32\x2c\x35\x31\x2e\
\x32\x2c\x31\x33\x33\x2e\x37\x34\x37\x2c\x30\x2c\x31\x38\x33\x2e\
\x39\x30\x32\x20\x20\x6c\x2d\x37\x39\x2e\x34\x31\x32\x2c\x37\x39\
\x2e\x34\x31\x32\x63\x39\x2e\x34\x30\x34\x2d\x33\x31\x2e\x33\x34\
\x37\x2c\x38\x2e\x33\x35\x39\x2d\x36\x34\x2e\x37\x38\x34\x2d\x33\
\x2e\x31\x33\x35\x2d\x39\x35\x2e\x30\x38\x36\x6c\x33\x33\x2e\x34\
\x33\x37\x2d\x33\x33\x2e\x34\x33\x37\x63\x32\x32\x2e\x39\x38\x38\
\x2d\x32\x32\x2e\x39\x38\x38\x2c\x32\x32\x2e\x39\x38\x38\x2d\x36\
\x31\x2e\x36\x34\x39\x2c\x30\x2d\x38\x35\x2e\x36\x38\x32\x20\x20\
\x63\x2d\x32\x34\x2e\x30\x33\x33\x2d\x32\x34\x2e\x30\x33\x33\x2d\
\x36\x31\x2e\x36\x34\x39\x2d\x32\x34\x2e\x30\x33\x33\x2d\x38\x35\
\x2e\x36\x38\x32\x2c\x30\x4c\x32\x31\x38\x2e\x31\x32\x32\x2c\x32\
\x30\x30\x2e\x33\x35\x39\x63\x2d\x31\x31\x2e\x34\x39\x34\x2c\x31\
\x31\x2e\x34\x39\x34\x2d\x31\x37\x2e\x37\x36\x33\x2c\x32\x37\x2e\
\x31\x36\x37\x2d\x31\x37\x2e\x37\x36\x33\x2c\x34\x32\x2e\x38\x34\
\x31\x73\x36\x2e\x32\x36\x39\x2c\x33\x31\x2e\x33\x34\x37\x2c\x31\
\x37\x2e\x37\x36\x33\x2c\x34\x32\x2e\x38\x34\x31\x20\x20\x63\x31\
\x31\x2e\x34\x39\x34\x2c\x31\x31\x2e\x34\x39\x34\x2c\x32\x37\x2e\
\x31\x36\x37\x2c\x31\x37\x2e\x37\x36\x33\x2c\x34\x32\x2e\x38\x34\
\x31\x2c\x31\x37\x2e\x37\x36\x33\x6c\x2d\x35\x36\x2e\x34\x32\x34\
\x2c\x35\x36\x2e\x34\x32\x34\x43\x31\x39\x30\x2e\x39\x35\x35\x2c\
\x33\x35\x33\x2e\x39\x35\x39\x2c\x31\x37\x39\x2e\x34\x36\x31\x2c\
\x33\x34\x35\x2e\x36\x2c\x31\x36\x37\x2e\x39\x36\x37\x2c\x33\x33\
\x35\x2e\x31\x35\x31\x7a\x22\x0d\x0a\x20\x20\x20\x20\x20\x20\x66\
\x69\x6c\x6c\x3d\x22\x23\x35\x61\x34\x65\x65\x39\x22\x20\x64\x61\
\x74\x61\x2d\x6f\x72\x69\x67\x69\x6e\x61\x6c\x3d\x22\x23\x66\x66\
\x37\x30\x35\x38\x22\x20\x63\x6c\x61\x73\x73\x3d\x22\x22\x2f\x3e\
\x0d\x0a\x3c\x67\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\x74\x74\x70\
\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x32\x30\
\x30\x30\x2f\x73\x76\x67\x22\x3e\x0d\x0a\x3c\x2f\x67\x3e\x0d\x0a\
\x3c\x67\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\x74\x74\x70\x3a\x2f\
\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x32\x30\x30\x30\
\x2f\x73\x76\x67\x22\x3e\x0d\x0a\x3c\x2f\x67\x3e\x0d\x0a\x3c\x67\
\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\
\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x32\x30\x30\x30\x2f\x73\
\x76\x67\x22\x3e\x0d\x0a\x3c\x2f\x67\x3e\x0d\x0a\x3c\x67\x20\x78\
\x6d\x6c\x6e\x73\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\
\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\
\x22\x3e\x0d\x0a\x3c\x2f\x67\x3e\x0d\x0a\x3c\x67\x20\x78\x6d\x6c\
\x6e\x73\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\
\x33\x2e\x6f\x72\x67\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x3e\
\x0d\x0a\x3c\x2f\x67\x3e\x0d\x0a\x3c\x67\x20\x78\x6d\x6c\x6e\x73\
\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\
\x6f\x72\x67\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x3e\x0d\x0a\
\x3c\x2f\x67\x3e\x0d\x0a\x3c\x67\x20\x78\x6d\x6c\x6e\x73\x3d\x22\
\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\
\x67\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x3e\x0d\x0a\x3c\x2f\
\x67\x3e\x0d\x0a\x3c\x67\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\x74\
\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\
\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x3e\x0d\x0a\x3c\x2f\x67\x3e\
\x0d\x0a\x3c\x67\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\x74\x74\x70\
\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x32\x30\
\x30\x30\x2f\x73\x76\x67\x22\x3e\x0d\x0a\x3c\x2f\x67\x3e\x0d\x0a\
\x3c\x67\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\x74\x74\x70\x3a\x2f\
\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x32\x30\x30\x30\
\x2f\x73\x76\x67\x22\x3e\x0d\x0a\x3c\x2f\x67\x3e\x0d\x0a\x3c\x67\
\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\
\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x32\x30\x30\x30\x2f\x73\
\x76\x67\x22\x3e\x0d\x0a\x3c\x2f\x67\x3e\x0d\x0a\x3c\x67\x20\x78\
\x6d\x6c\x6e\x73\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\
\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\
\x22\x3e\x0d\x0a\x3c\x2f\x67\x3e\x0d\x0a\x3c\x67\x20\x78\x6d\x6c\
\x6e\x73\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\
\x33\x2e\x6f\x72\x67\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x3e\
\x0d\x0a\x3c\x2f\x67\x3e\x0d\x0a\x3c\x67\x20\x78\x6d\x6c\x6e\x73\
\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\
\x6f\x72\x67\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x3e\x0d\x0a\
\x3c\x2f\x67\x3e\x0d\x0a\x3c\x67\x20\x78\x6d\x6c\x6e\x73\x3d\x22\
\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\
\x67\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x3e\x0d\x0a\x3c\x2f\
\x67\x3e\x0d\x0a\x3c\x2f\x67\x3e\x3c\x2f\x73\x76\x67\x3e\x0d\x0a\
\
\x00\x00\x02\x04\
\x3c\
\x73\x76\x67\x20\x66\x69\x6c\x6c\x3d\x22\x6f\x72\x61\x6e\x67\x65\
\x22\x20\x77\x69\x64\x74\x68\x3d\x22\x32\x34\x22\x20\x68\x65\x69\
\x67\x68\x74\x3d\x22\x32\x34\x22\x20\x78\x6d\x6c\x6e\x73\x3d\x22\
\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\
\x67\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x3e\x0d\x0a\x0d\x0a\
\x20\x3c\x67\x3e\x0d\x0a\x20\x20\x3c\x74\x69\x74\x6c\x65\x3e\x62\
\x61\x63\x6b\x67\x72\x6f\x75\x6e\x64\x3c\x2f\x74\x69\x74\x6c\x65\
\x3e\x0d\x0a\x20\x20\x3c\x72\x65\x63\x74\x20\x66\x69\x6c\x6c\x3d\
\x22\x6e\x6f\x6e\x65\x22\x20\x69\x64\x3d\x22\x63\x61\x6e\x76\x61\
\x73\x5f\x62\x61\x63\x6b\x67\x72\x6f\x75\x6e\x64\x22\x20\x68\x65\
\x69\x67\x68\x74\x3d\x22\x36\x30\x32\x22\x20\x77\x69\x64\x74\x68\
\x3d\x22\x38\x30\x32\x22\x20\x79\x3d\x22\x2d\x31\x22\x20\x78\x3d\
\x22\x2d\x31\x22\x2f\x3e\x0d\x0a\x20\x3c\x2f\x67\x3e\x0d\x0a\x20\
\x3c\x67\x3e\x0d\x0a\x20\x20\x3c\x74\x69\x74\x6c\x65\x3e\x4c\x61\
\x79\x65\x72\x20\x31\x3c\x2f\x74\x69\x74\x6c\x65\x3e\x0d\x0a\x20\
\x20\x3c\x70\x61\x74\x68\x20\x69\x64\x3d\x22\x73\x76\x67\x5f\x31\
\x22\x20\x66\x69\x6c\x6c\x3d\x22\x6e\x6f\x6e\x65\x22\x20\x64\x3d\
\x22\x6d\x30\x2c\x30\x6c\x32\x34\x2c\x30\x6c\x30\x2c\x32\x34\x6c\
\x2d\x32\x34\x2c\x30\x6c\x30\x2c\x2d\x32\x34\x7a\x22\x2f\x3e\x0d\
\x0a\x20\x20\x3c\x70\x61\x74\x68\x20\x69\x64\x3d\x22\x73\x76\x67\
\x5f\x32\x22\x20\x64\x3d\x22\x6d\x31\x36\x2e\x33\x30\x39\x32\x35\
\x2c\x35\x2e\x38\x34\x63\x2d\x30\x2e\x33\x36\x2c\x2d\x30\x2e\x35\
\x31\x20\x2d\x30\x2e\x39\x36\x2c\x2d\x30\x2e\x38\x34\x20\x2d\x31\
\x2e\x36\x33\x2c\x2d\x30\x2e\x38\x34\x6c\x2d\x31\x31\x2c\x30\x2e\
\x30\x31\x63\x2d\x31\x2e\x31\x2c\x30\x20\x2d\x32\x2c\x30\x2e\x38\
\x39\x20\x2d\x32\x2c\x31\x2e\x39\x39\x6c\x30\x2c\x31\x30\x63\x30\
\x2c\x31\x2e\x31\x20\x30\x2e\x39\x2c\x31\x2e\x39\x39\x20\x32\x2c\
\x31\x2e\x39\x39\x6c\x31\x31\x2c\x30\x2e\x30\x31\x63\x30\x2e\x36\
\x37\x2c\x30\x20\x31\x2e\x32\x37\x2c\x2d\x30\x2e\x33\x33\x20\x31\
\x2e\x36\x33\x2c\x2d\x30\x2e\x38\x34\x6c\x34\x2e\x33\x37\x2c\x2d\
\x36\x2e\x31\x36\x6c\x2d\x34\x2e\x33\x37\x2c\x2d\x36\x2e\x31\x36\
\x7a\x22\x2f\x3e\x0d\x0a\x20\x3c\x2f\x67\x3e\x0d\x0a\x3c\x2f\x73\
\x76\x67\x3e\
"
qt_resource_name = b"\
\x00\x04\
\x00\x06\xfa\x5e\
\x00\x69\
\x00\x63\x00\x6f\x00\x6e\
\x00\x0b\
\x06\x29\xf3\x3e\
\x00\x77\
\x00\x69\x00\x6e\x00\x64\x00\x6f\x00\x77\x00\x5f\x00\x69\x00\x63\x00\x6f\x00\x6e\
\x00\x08\
\x0c\x43\x45\xe3\
\x00\x74\
\x00\x65\x00\x6d\x00\x70\x00\x2e\x00\x63\x00\x73\x00\x73\
\x00\x08\
\x00\x4e\x54\xa7\
\x00\x6c\
\x00\x69\x00\x6e\x00\x6b\x00\x2e\x00\x73\x00\x76\x00\x67\
\x00\x09\
\x08\xbf\xbe\x27\
\x00\x6c\
\x00\x61\x00\x62\x00\x65\x00\x6c\x00\x2e\x00\x73\x00\x76\x00\x67\
"
qt_resource_struct_v1 = b"\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x03\x00\x00\x00\x01\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x05\
\x00\x00\x00\x0e\x00\x02\x00\x00\x00\x01\x00\x00\x00\x04\
\x00\x00\x00\x2a\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\
\x00\x00\x00\x40\x00\x00\x00\x00\x00\x01\x00\x00\x00\x95\
\x00\x00\x00\x56\x00\x00\x00\x00\x00\x01\x00\x00\x09\x4a\
"
qt_resource_struct_v2 = b"\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x03\x00\x00\x00\x01\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x05\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x0e\x00\x02\x00\x00\x00\x01\x00\x00\x00\x04\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x2a\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\
\x00\x00\x01\x75\x87\xc3\x2c\x41\
\x00\x00\x00\x40\x00\x00\x00\x00\x00\x01\x00\x00\x00\x95\
\x00\x00\x01\x75\x87\xc3\x2c\x41\
\x00\x00\x00\x56\x00\x00\x00\x00\x00\x01\x00\x00\x09\x4a\
\x00\x00\x01\x75\x87\xc3\x2c\x41\
"
qt_version = [int(v) for v in QtCore.qVersion().split('.')]
if qt_version < [5, 8, 0]:
rcc_version = 1
qt_resource_struct = qt_resource_struct_v1
else:
rcc_version = 2
qt_resource_struct = qt_resource_struct_v2
def qInitResources():
QtCore.qRegisterResourceData(rcc_version, qt_resource_struct, qt_resource_name, qt_resource_data)
def qCleanupResources():
QtCore.qUnregisterResourceData(rcc_version, qt_resource_struct, qt_resource_name, qt_resource_data)
qInitResources()
| 53.422642 | 103 | 0.725577 | 3,341 | 14,157 | 3.063155 | 0.042502 | 0.073285 | 0.074751 | 0.059801 | 0.825484 | 0.789427 | 0.746238 | 0.668556 | 0.602599 | 0.565663 | 0 | 0.393915 | 0.024864 | 14,157 | 264 | 104 | 53.625 | 0.34741 | 0.010737 | 0 | 0.266129 | 0 | 0.790323 | 0.000071 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0.008065 | false | 0 | 0.004032 | 0 | 0.012097 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
06dd1476a33249a0a9fe8e45e44b9e852fe77da3 | 163 | py | Python | examples/minmax.py | quynhanh-ngx/pytago | de976ad8d85702ae665e97978bc4a75d282c857f | [
"MIT"
] | 206 | 2021-06-24T16:16:13.000Z | 2022-03-31T07:44:17.000Z | examples/minmax.py | quynhanh-ngx/pytago | de976ad8d85702ae665e97978bc4a75d282c857f | [
"MIT"
] | 13 | 2021-06-24T17:51:36.000Z | 2022-02-23T10:07:17.000Z | examples/minmax.py | quynhanh-ngx/pytago | de976ad8d85702ae665e97978bc4a75d282c857f | [
"MIT"
] | 14 | 2021-06-26T02:19:45.000Z | 2022-03-30T03:02:49.000Z | def main():
print(max([1, 2, 3]))
print(min([1, 2, 3]))
print(max("a", "b", "c"))
print(min("a", "b", "c"))
if __name__ == '__main__':
main()
| 18.111111 | 29 | 0.447853 | 26 | 163 | 2.5 | 0.5 | 0.246154 | 0.092308 | 0.246154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04878 | 0.245399 | 163 | 8 | 30 | 20.375 | 0.479675 | 0 | 0 | 0 | 0 | 0 | 0.08589 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | true | 0 | 0 | 0 | 0.142857 | 0.571429 | 1 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
663da0ba8202908c876e4fdd2a1adaac8b36c7d1 | 2,520 | py | Python | python/clime.py | PROBIC/l1-inconsistency | 4c95bf6e96dfec0c7d947c04e908653c09ca7e0d | [
"MIT"
] | 1 | 2019-05-04T21:26:42.000Z | 2019-05-04T21:26:42.000Z | python/clime.py | PROBIC/l1-inconsistency | 4c95bf6e96dfec0c7d947c04e908653c09ca7e0d | [
"MIT"
] | null | null | null | python/clime.py | PROBIC/l1-inconsistency | 4c95bf6e96dfec0c7d947c04e908653c09ca7e0d | [
"MIT"
] | null | null | null | import rpy2.robjects as ro
import rpy2.robjects.numpy2ri
rpy2.robjects.numpy2ri.activate()
from rpy2.robjects.packages import importr
import default_params as depa
from running_utils import to_cov
import numpy as np
eps = 1e-6
def clime(data, alpha = depa.default_alpha):
C = to_cov(data)
d = {'fastclime.lambda': 'fastclime_lamb'}
fastclime = importr('fastclime', robject_translations = d)
out1 = fastclime.fastclime(C)
O = fastclime.fastclime_lamb(out1[4], out1[5], alpha)
return np.array(O[0])
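# custom_clime searches for a CLIME regularization parameter whose estimated
# precision matrix has approximately TPC nonzero entries: it doubles `yla` until
# the nonzero count drops to at most TPC (upper bracket), halves `ala` until the
# count rises to at least TPC (lower bracket), then bisects between the brackets
# a fixed number of times and returns the last estimate.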
def custom_clime(data, TPC):
C = to_cov(data)
d = {'fastclime.lambda': 'fastclime_lamb'}
fastclime = importr('fastclime', robject_translations = d)
out1 = fastclime.fastclime(C)
yla = depa.default_alpha
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], yla)[0])
while (sum(sum(1 for i in l if abs(i) > 0) for l in P) > TPC):
yla *= 2
if (yla*eps > 1):
break
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], yla)[0])  # re-estimate so the loop condition sees the new lambda
ala = yla/2
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], ala)[0])
while (sum(sum(1 for i in l if abs(i) > 0) for l in P) < TPC):
ala /= 2
if ala < eps:
break
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], ala)[0])
for _ in range(8):
kes = (ala + yla)/2
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], kes)[0])
TC = sum(sum(1 for i in l if abs(i) > 0) for l in P)
if (TC == TPC):
break
if (TC > TPC):
ala = kes
else:
yla = kes
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], kes)[0])
return P
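# clime_precision below duplicates the bracketing-and-bisection logic of
# custom_clime above.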
def clime_precision(data, TPC):
C = to_cov(data)
d = {'fastclime.lambda': 'fastclime_lamb'}
fastclime = importr('fastclime', robject_translations = d)
out1 = fastclime.fastclime(C)
yla = depa.default_alpha
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], yla)[0])
while (sum(sum(1 for i in l if abs(i) > 0) for l in P) > TPC):
yla *= 2
if (yla*eps > 1):
break
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], yla)[0])  # re-estimate so the loop condition sees the new lambda
ala = yla/2
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], ala)[0])
while (sum(sum(1 for i in l if abs(i) > 0) for l in P) < TPC):
ala /= 2
if ala < eps:
break
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], ala)[0])
for _ in range(8):
kes = (ala + yla)/2
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], kes)[0])
TC = sum(sum(1 for i in l if abs(i) > 0) for l in P)
if (TC == TPC):
break
if (TC > TPC):
ala = kes
else:
yla = kes
P = np.array(fastclime.fastclime_lamb(out1[4], out1[5], kes)[0])
return P | 31.5 | 66 | 0.654762 | 451 | 2,520 | 3.587583 | 0.137472 | 0.128554 | 0.176761 | 0.2089 | 0.832509 | 0.832509 | 0.832509 | 0.812732 | 0.812732 | 0.812732 | 0 | 0.048356 | 0.179365 | 2,520 | 80 | 67 | 31.5 | 0.734043 | 0 | 0 | 0.831169 | 0 | 0 | 0.04641 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038961 | false | 0 | 0.116883 | 0 | 0.194805 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
66510df685296b1871aa92339b59312e7eb3bae8 | 14,622 | py | Python | tests/test_tcp.py | benoitc/pyuv | 51a2f8687e3b6cd54af5ce81aabfc00b7fe40a18 | [
"MIT"
] | 1 | 2020-01-21T11:10:38.000Z | 2020-01-21T11:10:38.000Z | tests/test_tcp.py | benoitc/pyuv | 51a2f8687e3b6cd54af5ce81aabfc00b7fe40a18 | [
"MIT"
] | null | null | null | tests/test_tcp.py | benoitc/pyuv | 51a2f8687e3b6cd54af5ce81aabfc00b7fe40a18 | [
"MIT"
] | null | null | null |
import socket
import sys
from common import unittest2
import common
import pyuv
try:
memoryview
except NameError:
# Fix for Python 2.6
memoryview = str
TEST_PORT = 1234
class TCPErrorTest(unittest2.TestCase):
def on_client_connect_error(self, client, error):
self.assertNotEqual(error, None)
client.close()
def test_client1(self):
loop = pyuv.Loop.default_loop()
client = pyuv.TCP(loop)
client.connect(("127.0.0.1", TEST_PORT), self.on_client_connect_error)
loop.run()
def test_client2(self):
loop = pyuv.Loop.default_loop()
client = pyuv.TCP(loop)
self.assertFalse(client.readable)
self.assertFalse(client.writable)
client.close()
loop.run()
def test_open(self):
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
loop = pyuv.Loop.default_loop()
client = pyuv.TCP(loop)
client.open(sock.fileno())
client.connect(("127.0.0.1", TEST_PORT), self.on_client_connect_error)
loop.run()
class TCPTest(unittest2.TestCase):
def setUp(self):
self.loop = pyuv.Loop.default_loop()
self.server = None
self.client = None
self.client_connections = []
def on_connection(self, server, error):
self.assertEqual(error, None)
client = pyuv.TCP(pyuv.Loop.default_loop())
server.accept(client)
self.client_connections.append(client)
client.start_read(self.on_client_connection_read)
client.write(b"PING"+common.linesep)
def on_client_connection_read(self, client, data, error):
if data is None:
client.close()
self.client_connections.remove(client)
self.server.close()
return
def on_client_connection(self, client, error):
self.assertEqual(error, None)
client.start_read(self.on_client_read)
self.assertTrue(client.readable)
self.assertTrue(client.writable)
def on_client_read(self, client, data, error):
self.assertNotEqual(data, None)
self.assertEquals(data, b"PING"+common.linesep)
client.close()
def test_tcp1(self):
self.server = pyuv.TCP(self.loop)
self.server.bind(("0.0.0.0", TEST_PORT))
self.server.listen(self.on_connection)
self.client = pyuv.TCP(self.loop)
self.client.connect(("127.0.0.1", TEST_PORT), self.on_client_connection)
self.loop.run()
class TCPTestUnicode(unittest2.TestCase):
def setUp(self):
self.loop = pyuv.Loop.default_loop()
self.server = None
self.client = None
self.client_connections = []
def on_connection(self, server, error):
self.assertEqual(error, None)
client = pyuv.TCP(pyuv.Loop.default_loop())
server.accept(client)
self.client_connections.append(client)
client.start_read(self.on_client_connection_read)
if sys.version_info >= (3, 0):
data = 'PING'
else:
data = unicode('PING')
client.write(data)
def on_client_connection_read(self, client, data, error):
if data is None:
client.close()
self.client_connections.remove(client)
self.server.close()
return
def on_client_connection(self, client, error):
self.assertEqual(error, None)
client.start_read(self.on_client_read)
def on_client_read(self, client, data, error):
self.assertNotEqual(data, None)
self.assertEqual(data, b"PING")
client.close()
def test_tcp_unicode(self):
self.server = pyuv.TCP(self.loop)
self.server.bind(("0.0.0.0", TEST_PORT))
self.server.listen(self.on_connection)
self.client = pyuv.TCP(self.loop)
self.client.connect(("127.0.0.1", TEST_PORT), self.on_client_connection)
self.loop.run()
class TCPTestMemoryview(unittest2.TestCase):
def setUp(self):
self.loop = pyuv.Loop.default_loop()
self.server = None
self.client = None
self.client_connections = []
def on_connection(self, server, error):
self.assertEqual(error, None)
client = pyuv.TCP(pyuv.Loop.default_loop())
server.accept(client)
self.client_connections.append(client)
client.start_read(self.on_client_connection_read)
data = memoryview(b"PING")
client.write(data)
def on_client_connection_read(self, client, data, error):
if data is None:
client.close()
self.client_connections.remove(client)
self.server.close()
return
def on_client_connection(self, client, error):
self.assertEqual(error, None)
client.start_read(self.on_client_read)
def on_client_read(self, client, data, error):
self.assertNotEqual(data, None)
self.assertEqual(data, b"PING")
client.close()
def test_tcp_memoryview(self):
self.server = pyuv.TCP(self.loop)
self.server.bind(("0.0.0.0", TEST_PORT))
self.server.listen(self.on_connection)
self.client = pyuv.TCP(self.loop)
self.client.connect(("127.0.0.1", TEST_PORT), self.on_client_connection)
self.loop.run()
class TCPTestNull(unittest2.TestCase):
def setUp(self):
self.loop = pyuv.Loop.default_loop()
self.server = None
self.client = None
self.client_connections = []
def on_connection(self, server, error):
self.assertEqual(error, None)
client = pyuv.TCP(pyuv.Loop.default_loop())
server.accept(client)
self.client_connections.append(client)
client.start_read(self.on_client_connection_read)
client.write(b"PIN\x00G"+common.linesep)
def on_client_connection_read(self, client, data, error):
if data is None:
client.close()
self.client_connections.remove(client)
self.server.close()
return
def on_client_connection(self, client, error):
self.assertEqual(error, None)
client.start_read(self.on_client_read)
def on_client_read(self, client, data, error):
self.assertNotEqual(data, None)
self.assertEquals(data, b"PIN\x00G"+common.linesep)
client.close()
def test_tcp_null(self):
self.server = pyuv.TCP(self.loop)
self.server.bind(("0.0.0.0", TEST_PORT))
self.server.listen(self.on_connection)
self.client = pyuv.TCP(self.loop)
self.client.connect(("127.0.0.1", TEST_PORT), self.on_client_connection)
self.loop.run()
class TCPTestList(unittest2.TestCase):
def setUp(self):
self.loop = pyuv.Loop.default_loop()
self.server = None
self.client = None
self.client_connections = []
def on_connection(self, server, error):
client = pyuv.TCP(pyuv.Loop.default_loop())
server.accept(client)
self.client_connections.append(client)
client.start_read(self.on_client_connection_read)
client.writelines([b"PING1", b"PING2", b"PING3", b"PING4", b"PING5", b"PING6", b"PING7", b"PING8", b"PING9", b"PING10", b"PING11", b"PING12"])
def on_client_connection_read(self, client, data, error):
if data is None:
client.close()
self.client_connections.remove(client)
self.server.close()
return
def on_client_connection(self, client, error):
self.assertEquals(error, None)
client.start_read(self.on_client_read)
def on_client_read(self, client, data, error):
self.assertNotEqual(data, None)
self.assertEquals(data, b"PING1PING2PING3PING4PING5PING6PING7PING8PING9PING10PING11PING12")
client.close()
def test_tcp_list(self):
self.server = pyuv.TCP(self.loop)
self.server.bind(("0.0.0.0", TEST_PORT))
self.server.listen(self.on_connection)
self.client = pyuv.TCP(self.loop)
self.client.connect(("127.0.0.1", TEST_PORT), self.on_client_connection)
self.loop.run()
class TCPTestListUnicode(unittest2.TestCase):
def setUp(self):
self.loop = pyuv.Loop.default_loop()
self.server = None
self.client = None
self.client_connections = []
def on_connection(self, server, error):
client = pyuv.TCP(pyuv.Loop.default_loop())
server.accept(client)
self.client_connections.append(client)
client.start_read(self.on_client_connection_read)
if sys.version_info >= (3, 0):
data = 'PING'
else:
data = unicode('PING')
client.writelines([data for x in range(100)])
def on_client_connection_read(self, client, data, error):
if data is None:
client.close()
self.client_connections.remove(client)
self.server.close()
return
def on_client_connection(self, client, error):
self.assertEquals(error, None)
client.start_read(self.on_client_read)
def on_client_read(self, client, data, error):
self.assertEquals(data, b"PING"*100)
client.close()
def test_tcp_list_unicode(self):
self.server = pyuv.TCP(self.loop)
self.server.bind(("0.0.0.0", TEST_PORT))
self.server.listen(self.on_connection)
self.client = pyuv.TCP(self.loop)
self.client.connect(("127.0.0.1", TEST_PORT), self.on_client_connection)
self.loop.run()
class TCPTestListNull(unittest2.TestCase):
def setUp(self):
self.loop = pyuv.Loop.default_loop()
self.server = None
self.client = None
self.client_connections = []
def on_connection(self, server, error):
client = pyuv.TCP(pyuv.Loop.default_loop())
server.accept(client)
self.client_connections.append(client)
client.start_read(self.on_client_connection_read)
client.writelines([b"PING1", b"PING2", b"PING3", b"PING4", b"PING5", b"PING\x00\xFF6"])
def on_client_connection_read(self, client, data, error):
if data is None:
client.close()
self.client_connections.remove(client)
self.server.close()
return
def on_client_connection(self, client, error):
self.assertEquals(error, None)
client.start_read(self.on_client_read)
def on_client_read(self, client, data, error):
self.assertNotEqual(data, None)
self.assertEquals(data, b"PING1PING2PING3PING4PING5PING\x00\xFF6")
client.close()
def test_tcp_list_null(self):
self.server = pyuv.TCP(self.loop)
self.server.bind(("0.0.0.0", TEST_PORT))
self.server.listen(self.on_connection)
self.client = pyuv.TCP(self.loop)
self.client.connect(("127.0.0.1", TEST_PORT), self.on_client_connection)
self.loop.run()
class TCPTestInvalidData(unittest2.TestCase):
def setUp(self):
self.loop = pyuv.Loop.default_loop()
self.server = None
self.client = None
self.client_connections = []
def on_connection(self, server, error):
client = pyuv.TCP(pyuv.Loop.default_loop())
server.accept(client)
self.client_connections.append(client)
client.start_read(self.on_client_connection_read)
self.assertRaises(TypeError, client.write, 1234)
self.assertRaises(TypeError, client.write, object())
self.assertRaises(TypeError, client.writelines, 1234)
self.assertRaises(TypeError, client.writelines, object())
client.close()
self.client_connections.remove(client)
self.server.close()
def on_client_connection_read(self, client, data, error):
client.close()
self.client_connections.remove(client)
self.server.close()
self.fail('Expected write to fail, but the read callback received: %r' % data)
return
def on_client_connection(self, client, error):
self.assertEquals(error, None)
client.start_read(self.on_client_read)
def on_client_read(self, client, data, error):
self.assertEqual(data, None)
client.close()
def test_invalid_data(self):
self.server = pyuv.TCP(self.loop)
self.server.bind(("0.0.0.0", TEST_PORT))
self.server.listen(self.on_connection)
self.client = pyuv.TCP(self.loop)
self.client.connect(("127.0.0.1", TEST_PORT), self.on_client_connection)
self.loop.run()
class TCPShutdownTest(unittest2.TestCase):
def setUp(self):
self.loop = pyuv.Loop.default_loop()
self.server = None
self.client = None
self.client_connections = []
def on_connection(self, server, error):
client = pyuv.TCP(pyuv.Loop.default_loop())
server.accept(client)
self.client_connections.append(client)
client.start_read(self.on_client_connection_read)
client.write(b"PING"+common.linesep)
def on_client_connection_read(self, client, data, error):
if data is None:
client.close(self.on_close)
self.client_connections.remove(client)
self.server.close(self.on_close)
return
def on_client_connection(self, client, error):
self.assertEquals(error, None)
client.start_read(self.on_client_read)
def on_close(self, handle):
self.close_cb_called += 1
def on_client_shutdown(self, client, error):
self.shutdown_cb_called += 1
client.close(self.on_close)
def on_client_read(self, client, data, error):
self.assertNotEqual(data, None)
self.assertEquals(data, b"PING"+common.linesep)
client.shutdown(self.on_client_shutdown)
def test_tcp_shutdown(self):
self.shutdown_cb_called = 0
self.close_cb_called = 0
self.server = pyuv.TCP(self.loop)
self.server.bind(("0.0.0.0", TEST_PORT))
self.server.listen(self.on_connection)
self.client = pyuv.TCP(self.loop)
self.client.connect(("127.0.0.1", TEST_PORT), self.on_client_connection)
self.loop.run()
self.assertEqual(self.shutdown_cb_called, 1)
self.assertEqual(self.close_cb_called, 3)
class TCPFlagsTest(unittest2.TestCase):
def test_tcp_flags(self):
loop = pyuv.Loop.default_loop()
tcp = pyuv.TCP(loop)
tcp.nodelay(True)
tcp.keepalive(True, 60)
tcp.simultaneous_accepts(True)
tcp.close()
loop.run()
self.assertTrue(True)
if __name__ == '__main__':
unittest2.main(verbosity=2)
| 32.136264 | 150 | 0.640336 | 1,848 | 14,622 | 4.904762 | 0.072511 | 0.092674 | 0.071492 | 0.046117 | 0.861099 | 0.826898 | 0.8109 | 0.8109 | 0.8109 | 0.806046 | 0 | 0.018157 | 0.242922 | 14,622 | 454 | 151 | 32.207048 | 0.800632 | 0.001231 | 0 | 0.792244 | 0 | 0 | 0.031438 | 0.006918 | 0 | 0 | 0 | 0 | 0.113573 | 1 | 0.168975 | false | 0 | 0.01385 | 0 | 0.238227 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b0e0a49d4b56a873f6100fa3d1123c9f072de438 | 49,917 | py | Python | main.py | swatibajaj54/some-hidden-messages-in-dna | b3962852ccb2abef18398e1a81d0822fca52a9a2 | [
"MIT"
] | null | null | null | main.py | swatibajaj54/some-hidden-messages-in-dna | b3962852ccb2abef18398e1a81d0822fca52a9a2 | [
"MIT"
] | null | null | null | main.py | swatibajaj54/some-hidden-messages-in-dna | b3962852ccb2abef18398e1a81d0822fca52a9a2 | [
"MIT"
] | null | null | null | from ori_finder import get_skew_diag_data, count, get_hamming_mismatch, get_pattern_match, minimum_skew_value, get_approximate_pattern_match, frequent_words_with_mismatches, get_approximate_pattern_count
from neighbors import neighbors
from motif_finder import probability_matrix, consensus_motif, score_matrix, gibbs_sampler, many_runs_gibbs_sampler, greedy_motif_search, greedy_motif_search_with_pseudocounts, median_string, motif_enumeration1, profile_most_probable_k_mer, entropy_matrix, distance_between_pattern_and_string, best_score_from_random_motif
from ori_finder import frequent_words_with_mismatches_and_rc
from genome_assembly import string_composition, string_spelled_by_genome_path, overlap_graph_problem, de_bruijn_graph, de_bruijn_graph_from_text, eulerian_cycle
from sequencing_antibiotics import protein_translation, peptide_encoding_seq
if __name__ == '__main__':
# rna_sequence = 'AUGGCCAUGGCGCCCAGAACUGAGAUCAAUAGUACCCGUAUUAACGGGUGA'
# print(protein_translation(rna_sequence))
dna_seq = 'GCCATGGCCATGGCCCCCAGAACTGAGATCAATAGTACCCGTATTAACGGGTGA'
peptide_seq = 'MA'
print(peptide_encoding_seq(dna_seq, peptide_seq))
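# Based on the function name and the surrounding imports (its definition lives in
# sequencing_antibiotics.py), peptide_encoding_seq is expected to return the
# substrings of dna_seq that encode the peptide 'MA', reading either the strand
# or its reverse complement; this description is an assumption, not taken from
# the function's source.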
# text = 'AAGATTCTCTAAGA'
# k = 4
# print(de_bruijn_graph_from_text(text, k))
# patterns = ['GAGG', 'CAGG', 'GGGG', 'GGGA', 'CAGG', 'AGGG', 'GGAG']
# patterns = ['ATGCG', 'GCATG', 'CATGC', 'AGGCA', 'GGCAT', 'GGCAC']
# print(overlap_graph_problem(patterns))
# res = ''
# with open('data.txt') as f:
# patterns = f.readline()
# # genome_path = ['ACCGA', 'CCGAA', 'CGAAG', 'GAAGC', 'AAGCT']
# patterns = patterns.split()
# res = overlap_graph_problem(patterns)
# f = open("genome_assembly.txt", "w")
# for key in res:
# f.write(key + ": " + res[key])
# f.write("\n")
# f.close()
# content_list = open("dataset.txt", 'r').read().splitlines()
# dict = {}
#
# for item in content_list:
# res_arr = item.split(": ")
# if len(res_arr) != 2:
# print("file format invalid")
# break
# val = (res_arr[1]).split(" ")
# int_vals = []
# for character in val:
# int_vals.append(int(character))
# dict[int(res_arr[0])] = int_vals
#
#
# f = open("euliar_path.txt", "w")
# res = eulerian_cycle(dict)
# res_str = ' '.join(map(str, res))
# print(res_str)
# f.write(res_str)
# f.close()
# # genome_path = ['ACCGA', 'CCGAA', 'CGAAG', 'GAAGC', 'AAGCT']
# res = de_bruijn_graph_from_text(text, k)
# f = open("result_de_bruijn_graph.txt", "w")
# for key in res:
# f.write(key + ": " + res[key])
# f.write("\n")
# f.close()
# text = "CCTTTGATCTCTTTACTATTTCGTTCAAATTCAATTCGCCTGACATCGACGGTTGACTTGCAATTGCCGCATCCCGAGGCTGACCTCCGCGCTGAGAGTGCGAATCTGTCGGAATTCCGCGGCCTTAGCAGCTCTCGCAAATACGCAGGCAACAGGACCCCGCTGAGGCCCAGGCCTGTTAATTTGTAGGCTGAGCTACCACGATTGAAAAGCAAAGATGACCTCATACTTTTATTAATGTATTTATCGACAAAATCTGTGTAATCTACCATTACGTGGCGCGTTCCGTAAGCTGTCTGCTCTAATATTTGCGGAAGGTAAGTTAAAGCGGAAAGTACTGCTAGGGTAGCAACTCAGTGATCAGCTGATTGTCAAACGAAGTTGCTAGGCTCCGGAAGCCCGAGTCCTTATATCAAGCACTAACAAACGAAACGCGTAGTAGCAATCCTGTCAGAAATTCTGCCCGCTAAGCCCGCGCTCACATTACATGCGGCTTTCTATCAACCCTGGGCTTACGTCTTCGCGAGCTCAGCCCTATGATCGTCTGTATAGGAAAGGACTAGCTGAAGCTGGGCACCACTGTGAGAATCTGGCACCGTATATCCGCATTGGCATATCCGTCATATCCTCATATATTTATCAAGAATAACAAATGCATCACCAACTTTGACGTTCTCCTGGTGAGCGCTATCTCCGTAGTGCCCAGGGAGGGGATGCAGAATCGAGACTCTCGTCCTGAGGCTAAATAAGGGACACAGAACATCGGAGTCCTTCATCGTGCGTCTTTCTTGGAGCCTCGGCATATCAGGGTACGAGGACCGATCAAGGCTGTTGAGGCCGCCACGATTTGACAGATTGCTACTGTCGTGGTCACTAAGATAGCATCGGAAGGCGCGGGGCTTGTCTTCATGCTTGGGGCTTAACCTGGTACAAGTGAGCATTACCCAGAACTCGTGTGTCGTATTCATATCCGCAACAAGTCGCCGAACCCTCTATATGTCACAGACGAGACTGCTGATGGTCACTTTGTGCGCTAGGTAAAGACGGAGTTGTAGGCATTCTGGGAGAGCTTTCGCCCCGGACTGGATCCTTGTGATTTTCGAGAGTATAGGTATCCCTGGCTTCACTGGGGGAGCGCACTTGTAGTTTGACTGGCAGGCTAGCATACTTTACGCAAATTTTTTCCTCTTGACTACCTAGTCGAGTGCCTATCGTGCTATTCGGGGTCAGATTCATAATGGGTTCGCTAATAGTAGATAAACTATGTGAAGTCAGTAATAACGGTCAATGGCCGTGTGCCAGCTCCTCGGGATGGCAACTTCACACTAGACCTAAAAGCTACTCTGAGAGGAGGGAGGTGTTGGTTAAGGGACTACCTCTACTGACTCCACTTGCCCGAATCCTCGGAGTTTGTTATTGTTACTATGGGTGGGAACCCCGCAAGTCGCGAGCCCAGGTGAAGCGGTAGAAAGTGGATGTCTAAGGTCGCATGTGGTAATTCACGGACACGATACGTAAACACTACGTCACCAGAGTTAGTGAGGTCTATCTCACCTATGGAGGAAGTACGACGCTAGTCCCACCAGGACGGGTGGATCTATCAACCAACTAGGGTAAAGATCACTTTTTGTCAGCCCTTGCTCAGTTGAGACTCCAGATTGATAAGGAGCTTGCGCGGCTTATGGTACGAAGTAGGCACTGTAAGGTGTAGATACGTTGTTATCGGCTTCTATGTGCGTCCATACCACAAGGCAGCTATGGAGAACAGATCGAGGAATACATATCTGTACAGGCGTATCTGCACCGGAGCGTCCGTGCTCTGCCTGGCGCGCAATGGTTATGTTTCTAGCTCCGGAGGGCGCTCTAGTTGGGACAATTCCAGTCAGGCGGACGGATTTGACCTCCACTCCGGTAAATGAATTCAGGTAACATGTAACAAAAGGAATGGATCTAAGACACGGCCTGTGACTCCCCTCTCATCGTTATAGACGTTTAAATATATTGCTGGAAGCGGGATCGAATTGGTCGCAGGTGTGCTCTGTAGATCACCGCTTTCCGAACGGTACCAGCTGTACGTAAAGCTTATAGGTCGACCTCGATGCGCTTCGTCCAGGAGGGGCCTTTGCTGAATACGGCGCCTTTATTTGAGCCTCAAGCCTTACATCGGTCGTGTGACTCCCGGGTAGCTTGATGATCATATCTAGTCGGGGAGATGGTTAGCCGACTCTTCTGTTGGACTCGGGCAATAAGAATCGGTGCTTCCGAAGCCGTGGGCAAATTCCTTCGGTGTAAAGCTACAAAGTTGGACTGCTACGCCCGCGATCTAGACCCCCAGTGACTCGTGCCGGTGGTGCATATGACGTTACCCTTTTCCCGCACGCATGCCCATGTTTGTTTCAAGGCCATGAACGCCCAGGAGTGGCGTGCAACTTTTAAGGCGATAACCAGCCAGACTCTTGATATCCCTATTTTGTCACAGTGCATAGAGGTGGTAGACTTTAGCAAACATGATAACACAAGGCATGCTTAACGATCGGGTAGTATGCTATGCTAACGCCGCCATCCGTGTTGTACGACATCTACTAACACCCTACGAGATAGGGGCCCTATTTCTCGGCTCGACTTTATTAACACTCACCGCGACTTCCTGACGCCATCGCCGGGCACGTCAACGGGTCAGAATACGTGGCTAGTGCTAAAAATCCGTCACCCAATCGGTTTGGGCGGCGGGAACGTGGCCGCCAGAGACCCGCTCAGAGTTGAGACACAGGTGGGCGATTTGACCACCAGAAGATAAACATCTTTCCCACCTAGCTTGTAGTCCGTGCAAATGAACCAACATCCATGAATCGGACTGCAGGGTTAACCCGTCAATTTCAACTTTTATGGCCTAAACCAAGTCATGAGTGCCTAGACATTTGCCTAGTGGGAGCCCGAAGGTAGCGAATTACGGCGAGGTCTAGCAGCGCAGCTGAGTCCTCTGGCGTGCCTGGGACACTGCTTTGACGCAATGCAAGGTGGAAGGCCGGGTATTGGTTTCTATATTGCCGTTGACTACAATTACCTGGCAAGCTCGAGCACCATAGTCAAGTTCCCTTAATTTACTCTTCGGATTGAAGCCACTCGTTCCTGGCTATGACATCGGGACCATTGTAGATACCTTGTCTTCGCTTGGAATACCTCTCCCACATGGCTACCCTTCTAAGTAACATATACAAAACCAGCAGTACACCTTTGGGTACCTTACTGAAAGTACCGTCCGACCACAGGGGCTTTTCCGTCAAGGAGGTTACGAGCTAACGAAGTTTAGTCCTCGTCTTATAAATGTAATCGGGTACTTGGTCTTTGGGAACCGGTGTTATGCCCATCCCATAGCAAGCCCGGAACGCGTAGCTGAATGGTTTCGCTAGAGCCCAACCGGTGCCGAGTGTTGCCTGCGTGCCATATAACGGGCGTAACATTGTTGACCTGTGCCGTCAGGCTCAAAGTGAGTAGATAAAAATTTCCAGTGTATGACGGGGCATGCTATTTGCTCGTCCCGCGGG
CAAGAATAGTCTTCGCGCAAACAGAAATTCGAATCCTGAAAGGCTTACAGGGTTACCTGAGGAATGTACCCGGGCCTCTAAATACCGCCATCGGCAAACCGCCCATCTCTGGTTCGCGTAATCTGGCGTGTCCCGACATGTTCACGCCTAAACCCAGGTTTCAGAGGGCTAGACTGCCTGATCTACGACGGGGATGACTGGAACAGATACCTTGATTGATCAAGTTACGGTGAGTACGTCATCATCTCATATGAAAGAGGGAGTATCCACAACTTTACATTAAGGCCTGCGGGCCTCGGTAAAGAACAATCTGCATCGATCCGGTAGTCAGAATTCTACATCTGCGCCTGGTAACAGCACCTCGACTTTAACTTTCATTCAATTATGAATTCTTTAGCATATGTGCTTATGATGCATACTTGCTAACTCTGTAGCAAGGAAACAGGGCACGAACTCTATTATTCGGTATGTCACGGACTGAAGCCAGGCTCAGGATGGTTGACGAGTCGAACTTCTAAGCGCCTTTTTTTATGCGTTCGCTTGGGTTTCATGGGCGTGTACTTAATGTTGGCCTGCCTTTCGAGAACAGCGGTCACAGCCGCTAGTATACAAAAACTCACGCTACGTACGATAAGTTCTGACAACACCGAAGTGGGTCGGAAGTAGTTCTTCTAGAGATCGATATAAGTCCGATTCGTGCTAAAAGTAGTTAACACCGCAGCGACGATCGTATGGCTCGAGAGTGGGCTGGCTTGAATCTGACGTCGATGAGGCCGGACTCCCATAATGGAACTAACGGATGAGTGAATGTGTCTAACGCGAGCTAGGCACGGTGCGGCACGCTCCGAGGGCCGTATAAAATGAAAGAAAAGTAGTCCTACAGCGATTGGCGTTAGATACTATAGGTAAATAAGCATTCGCCCGTATTTCATTATCTCTACGGCGTTGAAGAATAACGATACGGAGCAAGCCGCAGGCTAGCTTTTCGTGCTGTCCGTTGGGCAAGCATAAAACGCCTCAATGGCTACTCTGTACGTTTCAAGTGCTCGATTGCCGCACACCAGAAGCAGGTAACGAGTCATTTGTTCCCAAAGAGTGTTCGCATCGGTTTTCTTTATTACGACAAAATATGCCCAGATAGCCAGCAGGGAGATTGGGTAGTAGGTCGCAAGGATACGTAGTCAGGATTCAACGAAGACTAAGAACCTCGTTCGCAGCAACGAATTCCCTAGACCGGCATGGGAATGGTGGAATAGTTACTACAAATCCCATCAGATGGGTAGCCTTCCTACCGGGTGGCATATTCGCAGGTTTGATTCCGTGTCTACAGGGGGGTAAACACGTGGAAAAGGGCTCGGCCGTTACACTGCTAGTTAAGCCTGTAAGGGCCGCATCTAAAGATCACGGCGAGCTGTCGAATCCGGTCTCACGTTACCGTCTTTATCTCACTTACTG"
# k = 100
# res = (string_composition(text, k))
# f = open("result.txt", "w")
# for item in res:
#     f.write(item)
#     f.write(" ")
# f.close()
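# The commented-out driver above assumes a string_composition(text, k) helper
# defined earlier in this file. As a hedged illustration only (the _sketch
# name below is hypothetical, not the original implementation), string
# composition is simply every k-mer of text taken left to right:
def string_composition_sketch(text, k):
    """Return all k-length substrings of text in order of appearance."""
    return [text[i:i + k] for i in range(len(text) - k + 1)]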
# dna = ["CTCGATGAGTAGGAAAGTAGTTTCACTGGGCGAACCACCCCGGCGCTAATCCTAGTGCCC" ,"GCAATCCTACCCGAGGCCACATATCAGTAGGAACTAGAACCACCACGGGTGGCTAGTTTC", "GGTGTTGAACCACGGGGTTAGTTTCATCTATTGTAGGAATCGGCTTCAAATCCTACACAG"]
# k = 7
# # genome1 = 'CAGAAAGGAAGGTCCCCATACACCGACGCACCAGTTTA'
# # genome2 = 'CACGCCGTATGCATAAACGAGCCGCACGAACCAGAGAG'
# print(best_score_from_random_motif(dna, k))
# motifs = ["ATGGC", "GGTCA", "GGTCG", "GGGGC"]
# open and read the file after the appending:
# dna = ['CGCCCCTCTCGGGGGTGTTCAGTAACCGGCCA', 'GGGCGAGGTATGTGTAAGTGCCAAGGTGCCAG', 'TAGTACCGAGACCGAAAGAAGTATACAGGCGT', 'TAGATCAAGTTTCAGGTGCACGTCGGTGAACC', 'AATCCACCAGCTCCACGTGCAATGTTGGCCTA']
# dna = [
# "GTAGGTCAAACCGGGTGTACATACCCGCTCAATCGCCCAGCACTTCGGGCAGATCACCGGGTTTCCCCGGTATCACCAATACTGCCACCAAACACAGCAGGCGGGAAGGGGCGAAAGTCCCTTATCCGACAATAAAACTTCGCTTGTTCGACGCCCGGTTCACCCGATATGCACGGCGCCCAGCCATTCGTGACCGACGTCCCCAGCCCCAAGGCCGAACGACCCTAGGAGCCACGAGCAATTCACAGCG",
# "CTCGCTGGTCATAATGCATCCTTGCGATTGGGGTGTAACCTCCACAGACAAATCCGACCACCGCGCCGGAGCCCTCCGATAGGGCCTGAAGTCCCGGCCACTGGGGTCATTCGTCCGCCAGGGGCGAAAGGTCTGCCACATTGGGGACTTCCGGCCCTAACTGAGCCGGCGCCATCGAAGCCAGGGTTAGCGCATGGCTGTCCGGGCGGGGCGCGGAGTTCGCCGGAGTCCGAGACCCCGGATCGTGTCG",
# "GCGCCCCGCCCGGACAGCCATGCGCTAACCCTGGCTTCGATGGCGCCGGCTCAGTTAGGGCCGGAAGTCCCCAATGTGGCAGACCTTTCGCCCCTGGCGGACGAATGACCCCAGTGGCCGGGACTTCAGGCCCTATCGGAGGGCTCCGGCGCGGTGGTCGGATTTGTCTGTGGAGGTTACACCCCAATCGCAAGGATGCATTATGACCAGCGAGCTGAGCCTGGTCGCCACTGGAAAGGGGAGCAACATC",
# "GTACATGTCCAGAGCGAGCCTCAGCTTCTGCGCAGCGACGGAAACTGCCACACTCAAAGCCTACTGGGCGCACGTGTGGCAACGAGTCGATCCACACGAAATGCCGCCGTTGGGCCGCGGACTAGCCGAATTTTCCGGGTGGTGACACAGCCCACATTTGGCATGGGACTTTCGGCCCTGTCCGCGTCCGTGTCGGCCAGACAAGCTTTGGGCATTGGCCACAATCGGGCCACAATCGAAAGCCGAGCAG",
# "ACGGGACTGGCGACCGAGGTCAGCGATCCCGAGCAGGTTGCCCGCTACCAGCGGCTGCTACACCCGTGGGTGAACATGGCGATGGACACCGTGGTCGCGATCGAACCCGAGATCGTCACCGGCATCCGCATCGTTGCTGACTCGCGTACGCCGTAGCCGATTGGCCGCGGGCGGCCCGCACGCATCCGCACTATCTGATAAATTCTTCAACTCGTCAACCGATGTAACGCTGAAGCTCTCAGGAGACGCG",
# "GCATCCGCGTCCGGGCAACCAGGGACGTTTCGGCCACGTCGCAAGCGACAGCCCAGGGTGCCGGATGCCCAGCCGCTCGCGGTGGCGGGCCGTCCAGAGCCGTCGGCAGCCAGTGACCAGCTGGGGTATCCGCATGGGGTCGCCCAGCGGGTCCCGAGGGGACTTTTGGCCACCGGCGCTGGTGGCCTACTGCCCTCCCGCCGTTGCGCCGGGTGCGTGCACGATTGAAGTCCCCAAGGAAGGGACGCTC",
# "ACGATAGATCAACCGGACCACCGAGGGTTGATTATTGAGGTGCGCTCATCCGATGGTTCGCCGCCGTATGTGGTGCGCTGGCTCGAGACCGACCATGTGGCGACGGTGATTCCGGGTCCGGATGCGGTCGTGGTCACTGCGGAGGAGCAGAATGCGGCCGACGAGCGGGCGCAGCATCGGTTCGGCGCGGTTCAGTCGGCGATCCTCCATGCCAGGGGAACGTAGGCGATTCGCTCAAGCGACGAAGTCG",
# "CGATCACCTCAATCCGATGACCTATGAAGCGCGCGGGCCCGGCCGCCATCGGCCCGTCGATCCGAGTGCGCACGGCCGAAGTGAGCCACCACCGTAGCGCCGCCGAGTTCGCTTCCGCGGACGCAAGCCCGGGATTTGCGGAGTAGCGTACAAGGCGGGATGCGCAATCCGGTCGCTGAGGGACTGTCGTCAGTCGGCCCAGGGACCTAATTCCATATTTGGAGCCATGAGATGGCGACACACTCGGCAG",
# "GCAAACCGTGAGGCTAGGGAAGCGAGGAGCACATGGCCGCCGACCCGCAATGTACACGCTGCAAGCAAACCATCGAACCCGGATGGCTATACATCACCGCCCATCGCCGCGGTCAAGCCGGGATCGTCGATGACGGCGCAGTACTGATTCACGTGCCCGGTGAATGCCCGCACCCCGGGGAGCACGTTCCGCGAAGCTAGCCGGTTGGCATGCTAGGCAACGCAGCGTCAAGTGAGAGGGCAAAACCAAA",
# "CCGCTGGCGACGCTGTTCGCCGGCAGCGTGCGTGACGACTTCGAGCTGCCCGACTACACCTGGTGACCACCGCCGACGGGCACCTCTCCGCCAGGTAGGCACGGTTTGTCGCCGGCAATGTGACCTTTGGGCGCGGTCTTGAGGACCTTCGGCCCCACCCACGAGGCCGCCGCCGGCCGATCGTATGACGTGCAATGTACGCCATAGGGTGCGTGTTACGGCGATTACCTGAAGGCGGCGGTGGTCCGGA",
# "TCAGCACCATGACCGCCTGGCCACCAATCGCCCGTAACAAGCGGGACGTCCGCGACGACGCGTGCGCTAGCGCCGTGGCGGTGACAACGACCAGATATGGTCCGAGCACGCGGGCGAACCTCGTGTTCTGGCCTCGGCCAGTTGTGTAGAGCTCATCGCTGTCATCGAGCGATATCCGACCACTGATCCAAGTCGGGGGCTCTGGGGACCGAAGTCCCCGGGCTCGGAGCTATCGGACCTCACGATCACC",
# "ATGGCGATCCGCTCGATACTCCGCTGGCAGCCCGAGCGGGGGGTGTCCCCTACCCTGGCTGCCCCGGCTGACGGTATTCACCGGTATCCCATGATGCACTCAATCCCGATGCCTGCACCTTCTGTAAGCTCGCGACGGCTAACGGGGTTATCGGATGGCCCGGGCTCGAAGGAGGTTGGCACGTGTGGATGTTCCTGCCCGGTCGCCACGCGGCGGCGGCGATACCCCAGAGTCGTCAGGTGTTTAGCCG",
# "GGGTCAGGTATATTTATCGCACACTTGGGCACATGACACACAAGCGCCAGAATCCCGGACCGAACCGAGCACCGTGGGTGGGCAGCCTCCATACAGCGATGACCTGATCGATCATCGGCCAGGGCGCCGGGCTTCCAACCGTGGCCGTCTCAGTACCCAGCCTCATTGACCCTTCGACGCATCCACTGCGCGTAAGTCGGCTCAACCCTTTCAAACCGCTGGATTACCGACCGCAGAAAGGGGGCAGGAC",
# "TGTCCGCTGTCGAGTTTCGCGACATGGCCGAGACCTTTGAAGACGAGGAGCACCGGCGCTTTGGCGAGGCCGGTTTTCAATCGGTGGTCGACAAGGTCGCCGATATCGAAAAAAGCCTTGGCATCTACGACCTGAGCCAGTTCACCCCCAGCTAAAGACACTAATGCCCTTGGGTTAGGGACCATCGCCTCCTGACGCGATCGCGACAGCTGGCTAACGTCGGTAGTACACCCATGCAGAGGGGACGCCA",
# "TCCGGTTATCGTCGCCCGAATCCCCCAAGATCCGGCAGTGCCGGCCTGAGGGCCTGTGCGATCTGCTCGGGTGGTGCCCACCCGCGCGGAAAGCCCCGTCCGAACCGTGATTGGGCAACGTCGGGCCGGGCCAGCAGCGCTGGACCGTAGGTCCCTGCAGTGGATGACTTACGGCCCTGATCCACACCGGCGACCGTTAGGCAGGGTTGAGCCAACCGTCGGTTGAGCGTCTGGCTGCGAGGTGAGGTGA",
# "TTCGGCGACGGTTGACGTGATGGCCGACAGAATCGTTGCTAGAGCCGGGGGCAACTCCGACGCCACCGCCGAGATCGCGGCGGCCTTGGCCGCCCGGCAAGCGGACTGGGACACCGGGCACCGGATCGACACGGCGGGGCCGCGTGAGCGCTCCGTGGGACAGGCCTACCACATCTGGCGCAGCGCGATCTGAGAACGCCGAAAGGAAAACCGATGCCAACCATCACTGTCAGCAGCACATCGTCGCTGT",
# "CGTCCATGCCGCGGCGTGCACTCGCGTGGCCTTGTCGACCACGTTGTCGAGGCCGACGATGACAGCGTCGTGGTAGCGCCGGTCGATGGTGGGCGGCGGTCCGGTCACATTGACGACGATGCCACAGCGCACGCTTGGATTCGGGCGTCGAATCCAATCATGGTCGGTTGGCCGTCCGATTGGGGACTAAAGCCTCATGACCGGTGACTGTCCCGGTTCATAGCGTTGCTCGAGGCGAAGGGAGAATGAG",
# "CACCACGTGGACCACGGTCAGCGGAATGTTCCTCATCGCCGCATCGGTGGCACCCCAACAGGCGGCGGCATCCGATTCGAGCGAACCATCTACCCCGACGACAACTCCGTGCTGCTTGCGGGGTTTAGACATCTCATTCTCCCTTCGCCTCGAGCAACGCTATGAACCGGGACAGTCACCGGTCATGAGGCTTTAGTCCCCAATCGGACGGCCAACCGACCATGATTGGATTCGACGCCCGAATCCAAGC",
# "TTCCCGCGGATCAGATCTTGACCACCGGGAGTGTCGATGAACTTCTCGCGCTCTTGAAATGACGGGCTATCGTAAGTTTATGGCCTGGGGGAGCGTGAATCCCGCTGGCGGTCGGGTGAACCGCCCCGGTTTTCTTGCACCCCGCGTCGACGTGCCAGTGACGAACTTGACGAATAAGGCCTTTGGTCCTTTCCGGTAGGGGTCTTTGGATAGGCGCGATCCTCGGCATCGGGCCGGTAGCTTGCCGTTT",
# "GGCCAACTGCACCGCGCTCTTGATGACATCGGTGGTCACCATGGTGTCCGGCATGATCAACCTCCGCTGTTCGATATCACCCCGATCTTTCTGAACGGCGGTTGGCAGACAACAGGGTCAATGGTCCCCAAGTGGATCACCGACGGGCGCGGACAAATGGCCCGCGCTTCGGGGACTTCTGTCCCTAGCCCTGGCCACGATGGGCTGGTCGGATCAAAGGCATCCGTTTCCATCGATTAGGAGGCATCAA",
# "AAACTCGGGGAAGAGGGACCGCGGGTGGCGCTGAACGGGAAGGGTGGTGGCCATTTGATGCCTCCTAATCGATGGAAACGGATGCCTTTGATCCGACCAGCCCATCGTGGCCAGGGCTAGGGACAGAAGTCCCCGAAGCGCGGGCCATTTGTCCGCGCCCGTCGGTGATCCACTTGGGGACCATTGACCCTGTTGTCTGCCAACCGCCGTTCAGAAAGATCGGGGTGATATCGAACAGCGGAGGTTGATC",
# "CATCCATCGTGCCGCGCGTCGGCGAGTCCTGTTGATGCGCCACACATTCCGCAGGCATCGTGAACGCTTGACAGCCGTCGCCATTGTCGCGCACAAACCGCACGTCGATTCGTGATCCATTGAGGACCTAAGCCCGTTGGGCTAGTGACAAACGCCTCCTGAGCAAAACCCTCCTCCCCCGTTACCGTCGTGCGGTAGGGACAAGCCACATCGGCCGAGCGGGCGATCAGCCAACGACAGGAGGACCGCG",
# "GTCGATGCGAACGCAAGCATCCAGGAGATGCTCAACGTCATGGAAGAACATCAGGTCCGCCGTGTTCCGGTCATCTCAGAGCACCGCTTGGTCGGAATCGTCACCGAAGCCGACATCGCCCGACACCTGCCCGAGCACGCCATTGTGCAGTTCGTCAAGGCAATCTGCTCGCCCATGGCCCTCGCCAGCTAGCGACCTCGGATCATCCGCCGGCACGAAAGGCTTTATCCGCCAGCATCGGAGGTACCCA",
# "CCGATCGGCATCACTATCGGTCCTGCGGCCGCCCATAGCGCTATATCCGGCTGGTGAAATCAATTGACAACCTTCGACTTTGAGGTGGCCTACGGCGAGGACAAGCCAGGCAAGCCAGCTGCCTCAACGCGCGCCAGTACGGGTCCATCGACCCGCGGCCCACGGGTCAAACGACCCTAGTGTTCGCTACGACGTGGTCGTACCTTCGGCAGCAGATCAGCAATAGCACCCCGACTCGAGGAGGATCCCG",
# "ACCGTCGATGTGCCCGGTCGCGCCGCGTCCACCTCGGTCATCGACCCCACGATGAGGACGCCATCGGCCGCGACCAAGCCCCGTGAAACTCTGACGGCGTGCTGGCCGGGCTGCGGCACCTGATCACCTTAGGGCACTTGGGCCACCACAACGGGCCGCCGGTCTCGACAGTGGCCACCACCACACAGGTGACTTCCGGCGGGACGTAAGTCCCTAACGCGTCGTTCCGCACGCGGTTAGCTTTGCTGCC",
# "CACCTGTGTGGTGGTGGCCACTGTCGAGACCGGCGGCCCGTTGTGGTGGCCCAAGTGCCCTAAGGTGATCAGGTGCCGCAGCCCGGCCAGCACGCCGTCAGAGTTTCACGGGGCTTGGTCGCGGCCGATGGCGTCCTCATCGTGGGGTCGATGACCGAGGTGGACGCGGCGCGACCGGGCACATCGACGGTCCCCGGGGCTTTGTGGGCCAGTGAAGTGACGAAAGACCCCAGTGGACACGGACTTCGGC",
# "GTGCCACGACGCATAGAAGACATGGCCATGCCACACCCTGATAGCATTGCAGCAAGCTACATGTACTGCTCTACCAGGATCCTTATGGGCAACAGTGGGTTTGAGTTATGAAACCCGTGGGCACATACCCTTCCGCGTCGTACTGGTCAGTCTCGACAGCGAAGAGATCACCGGTTGATCCACCAAGCATGCATTGGCGGGCATCTGCATAAACGGTGACGTATCAGCACAAAACAGCGGAGAGAACAAC",
# "GCGAGTACGCTAGTTCAGGTGGGTGCCGTGCCGAAGGCGGTGTCACTCAACGAACTTCGGTTCTCGCAGGGTCGCCACGGCTGGCGATGTGCGGTAACGCTCGATGTGTGAATTGAGACCTGATTCATGAAAATCGTCGAGGAGACCCCATACCGGTTCCGGATCGAACAAGAGGGCGCGATGCGGGTGCCCGGGATCGTGTTCGCGTCCAGGTCGTTGCTGCCTCGTGACGAAGGCGACATGGCCCTTG",
# "CTTGAGGTCCAGTCCGGCGCAGAACACCGGATCGGCGCCGGTGAGGATGACGACGTCGATGTCGTCGTCGGCCTCGGCGTCGGCCAACGCCGCGAAAAACCGATCCCGTAGCGCCGCCGAGAGCGCGTTGCGGGACTGCGGCCGGTTGAGGGTGAGGGTTCGCACCCGTTCGTCGGTGTCGATCAGCAGGATGTCGTCGGTCATTCGATCACCGTAACAGGACCGTTAGACTGTCCTAATGACCAGAAAA",
# "CGCGCGAGATCTCATCGACAACCCACTTCCCATGCCTCACGACGGTCACCATGTCGCGGGCATATTTACGTGAGGCACCGAGGGTGTTTCGCGGGCATTCTTGGTGAGTCAAGTCGAACGGTTGAGCCATGATCGACGATTCCGTTACCGTGCTGTCAGAAGACGAAAGTTGGCACCGGCTGGGCAGCGTTGCACTCGGTCGGCTAGTTACCACCTTTGCTGATGAGCCTGGGATCTTCCAGTCAATTTC",
# "GGGGGCCCGGACGGCCAGGGTGAGAACCGTTCGCACGGTTTCGGCGTCCGGGAAATGGGTGTTCATGGCTGCCGGGCCTTTCCCATCAACGGCTGCGGTCATCTATTAAGGATCGCGCGTCGACAGCCGCGGTGGCAGAGCACGAAGGCTCGCCAGCGGAGGACCTTTGGCCCTGCCTCAACGGCTCCTCGCAGCGGAGAGTGGTACTGACGGTTGCAGATAATCCGGATCACCGGGGAAGGCGCTGACC",
# "ACTCACGTGCCGATCCACGTCTTCTGCCTTGAGAAACCCGGCGTCAAGTGTCGTTAGGTGATTCATGGTCAGCGCCTTCCCCGGTGATCCGGATTATCTGCAACCGTCAGTACCACTCTCCGCTGCGAGGAGCCGTTGAGGCAGGGCCAAAGGTCCTCCGCTGGCGAGCCTTCGTGCTCTGCCACCGCGGCTGTCGACGCGCGATCCTTAATAGATGACCGCAGCCGTTGATGGGAAAGGCCCGGCAGCC",
# "GTTGGCGCATGTACACCTGAGCCGTCGGCTCGCCCACTGGACCCGGCTCTACCCCGAGGTGCGGGTGGATCGGGCCATCGCCGGCGGCAGTGCGTGCCGTCATCTGGCCGCCAACGCAAAGCCGGGTCAGCTGTTCGTCGCGGACTCACACTCCGCGCACGAATTGTGCGGTGCATACCAGCCCGGATGCGCCGTACTTACGGTACGCAGTGCCAACTTGTAGGGAGCGGATCTTGGGAGTGGTGCCCTG",
# "GGCAGCTGTCGGCAACTGTAAGCCATTTCTGGGACTTTGCTGTGAAAAGCTGGGCGATGGTTGTGGACCTGGACGAGCCACCCGTGCGATAGGTGAGATTCATTCTCGCCCTGACGGGTTGCGTCTGTCATCGGTCGATAAGGACTAACGGCCCTCAGGTGGGGACCAACGCCCCTGGGAGATAGCGGTCCCCGCCAGTAACGTACCGCTGAACCGACGGGATGTATCCGCCCCAGCGAAGGAGACGGCG",
# "GTTGTCGATGAGATCTCGCGCGGAGGTCACCAGCAGGTACGCCAAGGCGTATGTGCAGGCTTTGAAGAAGAGCCGGGGCCGGATTTTCGACCAGGTGGTTGACCTGACGGGCTAGTCACGTGATAACGCGCGGCGCCGGCTTGTCGCAGCGGCCAAGCTATCGCCGGGGCTGGGCCGCAGTGTTGCCAAGCGGCGGCGCAAACCGCGTTCGCTGAAGTACTCCTATGACGCGCTGAAGGTGTTGCAGAGG",
# "TAGAGGGCTCCGACGTGCCGGTGCCAGCCGCCGCGTTCGAAACACAGCCCTAACGACACGCTGCCGAATATGACCCGTGTCGGAAATTAGGGCGACAAGAGTAATGCGGCTCAACATAGCCTTGCTTTACTTAGGCAAACCTGCCTTCAACCAGGAGGTTATTATCATCCTGTGGTAACTAGGAAAGCCTTTCCTGAGTAAGTATTGCCTTCGTTGCATACCGCCCTTTACCTGCGTTAATCTGCATTTT"]
# dna = ["CAAGGTCGTGGGACCGCCCTGATTGGTCAACCACAGATGTTGTCCGCACACTTGGGCGGTAAGGATGCATCACGCCGCAATGGACGCTGGAGGGACAGTGGAAGCAACCATTGTAATCTGTTTAGGTCACCTGCTAGACCCTGTGATGGTCACGTCGCGAAGCGACCCACCCATGAGTCATTGGGTCGAGATGGCTACGGCCTGGAGTATTTATGATCCCGTTGCGGGACATGTATGGAACGGCGAGAATAAGGCTGGCTACCTCGTTTAAGACTTACGACGTGTAACCAGATGTCCCTCGGGGACAAGGTCGTGGGACC", "GCCCTGATTGGTCAACCACAGATGTTGTCCGCACACTTGGGCGGTAAGGATGCATCACGCCGCAATGGACGCTGGAGGGACAGTGGAAGCAACCATTGTAATCTGTTTAGGTCACCTGCTAGACCCTGTGATGGTCACGTCGCGAAGCGACCCACCCATGAGTCATTGGGTCGAGATGGCTACGGCCTGGGCCCTAGGAGGCAAGAGTATTTATGATCCCGTTGCGGGACATGTATGGAACGGCGAGAATAAGGCTGGCTACCTCGTTTAAGACTTACGACGTGTAACCAGATGTCCCTCGGGGACAAGGTCGTGGGACC", "AGTAGAGTCTAACCTGGGGCACAACCCTACGCTACAGAGTCCCGCGTAAGTGCGCGACCGGACTAAAGGCGTACAAAGCGTTTCTCATAAGGTGAAACGGAGTTGCTCCGAAATAAGTACGCGGGCTGAGGCGATTGGCAAGAGGGGTAGCGGTATAATGTTGTGGCCTTTCCACCGACTCGCGACCTCGGTTGGCGCTTCGGATTAGGCCCGCTTATGCAGCGACGGTTTTCTATTTCTTTCTACGAAAGACTGGCGGGAGGAGGCAAAAGGTATGCGGACGTGAGCGCCGCTTCTTTCTTCTGGTTATAAAGATCTAC", "GCCCACGGTTGACGCCTGTATCGTGAGCTAACGGCGACAAAAGGGTCATATCGCGAACTAAACCGCCAAAATATACAGTGTAAGGTTCCAGCCTTGTCGCCTAAGGCATGGGCCGAAAGCCCAGCTGGGACGATGATTCCCACGCATCTGCCATCGACTGCCGTGTAGCCGAGAGTTCGCGCTCGAAGCACTAACTCCTAAAGGTTCATCACTCAGATTTGGGGAGCAGATAATGTGGCTTAAAGCAGGCTTAAGGACCGAACATGGGCTGGTTGGAACCGAACAGGTGTAATTGTAGGATCTATAGTGCCTATCCAGGT", "AGCATTAATCAGTTGCTCGTTACGTTACTCTGCGCATGACGCAACACGTGGTTGGGCACCCTCATCATTCCCTAGTCTAAACTAAAGCTACGGCGCTGTGAGCGCAGGTGAAGCGGAATGCGGCGAATATGCTAGTTCCCTGTCATCGTCAGTAGCCATCTATCCAGTTAATCCCCTGCGTCTCACTCGAACATTATTTCGAGCTGTGGTCTCGACACATAGTGCGACACAACCAATCGGTTTCGAAATCTGGAGGTAAGGAGGCAAGGTCGATAGTTGCGGAAGGGCCAGAGACGTACTACTACCTGGCCCGCAATGCC", "GATACATCGCATAATGATATGTTGCCCACCCGTGCCAGTCGCCCCGAATCCCGTAACTACAGGCCGACGCATCGATGCTAGGTTTAAGTTGTTCCCTTTTTCGACTGTGATGTAGAATCTAGGTCTGCGTACTGTAATGATCTTCTCGTTTCTGTACATGTATCAACACCGATGTCGTCAGGGATCACTAAGGGTGAGTATGCCCTGCCTCGGTCGCACTTAGGCTTAAGGAGGGGGAGTTTGGGCCCAAATTAGGAATCCATTCATACCTCCCCATGTGCCCCATATCGATTTGTATATTACGTTTTTTGGCGCCCGGC", "AAATATTTAGCAATATTCCCCATTTGCCTAGCAAAAAAGTTAGGGCACGGCCTTGAATTACCTACGCATAGCTATACCTGAAATGGAAACCCTCGCTTACCCTTAAGGAGGCATGATCTAGGTGTCTACGCGCAACACGGTCACTATCATTATCGATCGCAGTGTGCTAATAAGCCGGAACAACTGGCCTCGTGCTCGGTACGGAAGGTGATCCGGTCTTACATCTCCGCTAAAGAGTTCCGCGGGTACTCTTAGTAGATCGATATGAGTGTTGCGATCCTGGGTTGTCATAACATTCCTGAGCTTGGTTTCTTGGTAGT", "ATCCCCCATCGACGGGCTTAAGGAGCATACTTACACTATTTAACGTGCATAATGAGGGTTACTGTATTGACCGGGTCAAATCGCAGTCCACACTTACAGCAGATAACCCCAGATGTACCTATCCACAGAACGGGGATGGCAGTGCCGGATTGAAGGAGATATAATAGCCGGGGGATCGCTCGACGAATACTCCCCGGAAAAAAGCGGCTGTCTACGATGGAGCATTGACGTGGGGCTAAACACAGAAAGACTAGTGGTGTGATGAGACCCCTCTAGAATCTTTGTTTGGATCTCTGCCCCGCACCGCGGAGCCTGGTCGT", "AGCTACAATTTCTGTTTGGGGAGGTTTCAACACCGAGACATGGACTTCGGGTTCCCGATCGCTTCCTTGTAACGCCTGCGAGTCCCTAGGATTGGAGCCCGCTTTGCCGCCATTAATCTACAGGGGGAATGTAGTTTTACTTACCTCAACTCACCCTGGGTAGGAAGTCAGGGGCCTCCATGTCGGGGTAGTCCGGTTTATGGGTTGCGCACTCCTGCATTGTCGTTTCATTAAGGAGGCAACGAGTAATTCGTCCTCAGCGTGTATTATAGGTAAATAATACATTGTAGTCAACTGACATCTGGATGATTCGAACTCGT", "CACAACTATAATCCGTCCTCTTGGTCAACATAGGTGGACCGAGGATGATGCTAGGTGTTGATCATCCAGACTCACTACCTACCGTAACGCGTCCGCTGTGACAGATGTGGCTTATCCAGGCAATGCGCCACATGCCAAGGCGATGCGTCGACCCATACGATTGGTAGCAGTTTTACAGGTTACTGGAGTTACCTCTGACGCGTTGAACGTTAAGGCTCTCAGAACGCCAAAAAGCCCAGTATGGCAAGCTTGCATGTCGCTTCGACAAGGCTACAATTCTTCAGACCGACACGCACCTTTGGGTCGTGGAAGCGCAGCAA", 
"CTGTGCGGAATTTTGGAACTTCTATTCGCGCGGCTAAGATAGCGTAGCGAGTTGACTCCTGAGGATTGTTAGGAGCAAATTATGCAGGGCTCGATAGACTGATGAGTGATTAGTGTCGGTTTCGCAATATTTCTAAAAAAAGACTTTAAACAGGCTCGGTGTAGCTTTCCCTTGATGATGCGCTTAAAATTGTGGCCCAGTTTTACGACATACTCGTTACGGACACGTCCATAATCAGGCCCGATAAGGGGTGCTACTAAACCTGTATTTTTAATGGCTTAACTTGGCAAGAAAGGCCAGTACCTCCTTAAGAGGTTTGT", "CGAGATCCACGTCGGCGGTCATCGAGAGGCGGTCCCGCCCGAGGTCCACTCCGTCTACGGCCCCACTTCTAGCGGTCACAGGCATAGGCGCTGGTCCCACGGCAAAAGGCTTCAACAGAGAGAGGCTACGGGAGGCAAAGGAGGGACTATAGCGGTCACCCTAGCAACGGTCAAACCTTTGTGCTCCTAAAAAGCGCTACTTTCGCCAATTGATTTCGCCCGCTTTGAGGGCGTAGGGAGGTGTAGCCAGCAAGGGCAGTTTTACCATTGACCCGTCGAATCCTAGTTCGCTTAAGCCTAAGCACCCCTAAAAGCATAGG", "GTACCAGTATATGGCTTAAGTCTGCAAACGCTTCGCTGAGAAGTGATGGTACTGTAGAGAGGGGTCTCTATATACTAACGCGCCGCAACATGCAAGGACCTTTTAGGTCCTGTGATGTTCGACCGCACTCAAGTCTATCCGATGTGCTTCTATGAACTCCTCAGGTCATTTACTGCTATGGGATCCCAAACCCGCAGAGTTGCGTCAGATAGATCTCCGGCATGGTTTTGCGAAGTCTATGCCCAGAGACGTATAGGCATGAGTTCCAAAGGATGTATCGAGCTCGAGATGCTTGGGTGCTACCCTGTCTGCAGTTCGGA", "GGCTTATATAGGCAAACTCGCCCGTTTATTCTTATTGCCTCCATCAGCAGGGACTCGTGTGAACATTAGACTCAATGGAAGTCTCACTGGCGACGCATAAACGCGGCGCTGGGCCGTATGACATTCTGATCAGTGGTGCCAATTTCGACCGGGTGATTAGGCGACGGGACCGAAGTCAGTGGTACAAAGCGCCGCCGATTCTGAAAGACAATGGTCCCTGGTTTTAACGAATGCGAATCTGCTGATACCAACCGAACGGTAGGCGTTTCGTTCGGTGATTTGCAAGGGTGTTCATCTTGACCACGGGCATCCGCAAGCTC", "TCCCTTCCGCTGTACCAAATTTGTGCATCGGCCCCAAGCCCAGCTAAAGTTGAGTTGTGCGCGATATGCTTAAGGAGGCCGCTATCCCTGTTGGTGCGAAACGGCCGTCCCAGAAACAATTCTGGGAGAAAATATAAACCCTAAGGTTCGGCGGACGATTGCATTCAACCTCGCTATCACCCGAATTTAGTCACCCACCTTAGAGGCTTCAAACTCTAATCCCCCCCGTCCTGTCCTTGGCGACTGACTCTCTGGGAGATGGTAAAGCTGTAACCGCACTACGCGGTACTCTGGGTGAGATAGGTGGTCCGCACTATCAG", "TCCAGGGATTACAGTGTGCGGGTTCGCTTGTGATGGTTGACTAGCACACCGAGGTAGCCTACGGCGGAGTGCAATGGCGTCCGGCGAGGGAGGCTTCTCGAGGCAACCACAGCCTGATATCTGCACCGGGCTACGGGGGCGACCCCGTGGAAATTATTGGTTGAGTCTCTTTTCCCCACTATCAGAGGGGTCGCTCAATATGCTTTCATATTGAGGGTCATACAGCCCAATGGCCCGGATTACCAATTGCAAGCAGACACAGGAATGTGTCAAACCGCTCGCGCCAGTACAGTCTTGGGGCACCGAAGTCCTTTAATGGA", "GTAAAGTGGTACTTTGGTCGCATACCTATATCGACCAGCGTGGCTACGACGTTTCGTCTTTTGCGGTGTGAACACGACCGGAGGGAACCGTACTCGGTATAGTGCGTGCACCGATTGTCTCGTCGAGAATGGCCTTTGGTAGGGTGCGGGGAATACGGTGAAAGGAGGCAACGAACTCTAATGGAGCGTTGGAATGACGCGTAGATGCGAACCAAGACTGGTGAGGACTAGTTTTGCACTTCCAGTAGTGATCATGGGGGGCGCCATGAGTAAGTTACTCGGACTAATATGGGTCAACACCTTCATTCAGTGAAAATATC", "CAATTAGGGTAATGGCTCGTATATTTTCGCCGACATGATCCCGCCGGAAAAGTTGGCCGTGCGAGTGCAACCAATTTTTCCCGAGGCTTAAGGTCCCAATAGTCCGAAGAAAAGACACGAGAGCACGATACAACGCTGACAGGATACTGCACCGGGGGGTACTCTCAATGGACGTCGAGCCAGTTAACCAGTTCACCTGAACGCATTGGGGCCTCACTTTGATGCAATCCTCGATCAGGCATGCCAGTAGGTTTACCTACGGGAAGAGGCTGCACGTAGCAGGGGTGTAGAGTATTAATTATGACTGAGAGGAACGGCCG", "CGCCAAATTCGCGCGCTATCTAGTCTCAGAAACTTACAGCATTGTCTGGCATTGCCTAAGATGTATGTAGCCGACAGAGCGCGTCTTTTTTCACGGACGTAAAGTCCGAGGACAATCCAAAGAGCGAGATGTCGCAGCCTTGGTTTAATGCTCAGTCAACGGAGGACCTACTGACCCTTCCGACGGGGGGGTTAACTAAGGGGTTCGGATGGGGGGTGGTGGCGTATGGATGGGGTGGTCTGTGCCCAAAGGCGGACAGTCTCCAGTCAACATTCGGGGGCTTCGCGAGGCAAGTTGATGACCCGAGGTGCAGTGACGTC", "CATAACGACACTGTTGAGCCTAGCTTATGACGGCAGCGAGTGGGTTCACAGGATAGTTTGCAGTCTATGGTATGAGACTACAGATACCAATCTTCCCCGTCGTACGGGTTGGCTACTGGAGGCAACGAGGACTAGAGATAAGTGTAGGTTGGAGCGGGGCAGCTGGCAGGGTAAATGGAAGAGCACGTCTAACCCCCTACCGCCCTAATAGTAATCGGCCGGTCATGAAATTCTTCTAACATTGATTATTACTATTAACTAACAATTGGTTACCCCCATCAAGACGCGGTATAGCATTCATAGTGACTCCGGGAGTTCGG"]
# k = 20
# dna = ["GTAGGTCAAACCGGGTGTACATACCCGCTCAATCGCCCAGCACTTCGGGCAGATCACCGGGTTTCCCCGGTATCACCAATACTGCCACCAAACACAGCAGGCGGGAAGGGGCGAAAGTCCCTTATCCGACAATAAAACTTCGCTTGTTCGACGCCCGGTTCACCCGATATGCACGGCGCCCAGCCATTCGTGACCGACGTCCCCAGCCCCAAGGCCGAACGACCCTAGGAGCCACGAGCAATTCACAGCG", "CTCGCTGGTCATAATGCATCCTTGCGATTGGGGTGTAACCTCCACAGACAAATCCGACCACCGCGCCGGAGCCCTCCGATAGGGCCTGAAGTCCCGGCCACTGGGGTCATTCGTCCGCCAGGGGCGAAAGGTCTGCCACATTGGGGACTTCCGGCCCTAACTGAGCCGGCGCCATCGAAGCCAGGGTTAGCGCATGGCTGTCCGGGCGGGGCGCGGAGTTCGCCGGAGTCCGAGACCCCGGATCGTGTCG", "GCGCCCCGCCCGGACAGCCATGCGCTAACCCTGGCTTCGATGGCGCCGGCTCAGTTAGGGCCGGAAGTCCCCAATGTGGCAGACCTTTCGCCCCTGGCGGACGAATGACCCCAGTGGCCGGGACTTCAGGCCCTATCGGAGGGCTCCGGCGCGGTGGTCGGATTTGTCTGTGGAGGTTACACCCCAATCGCAAGGATGCATTATGACCAGCGAGCTGAGCCTGGTCGCCACTGGAAAGGGGAGCAACATC", "GTACATGTCCAGAGCGAGCCTCAGCTTCTGCGCAGCGACGGAAACTGCCACACTCAAAGCCTACTGGGCGCACGTGTGGCAACGAGTCGATCCACACGAAATGCCGCCGTTGGGCCGCGGACTAGCCGAATTTTCCGGGTGGTGACACAGCCCACATTTGGCATGGGACTTTCGGCCCTGTCCGCGTCCGTGTCGGCCAGACAAGCTTTGGGCATTGGCCACAATCGGGCCACAATCGAAAGCCGAGCAG", "ACGGGACTGGCGACCGAGGTCAGCGATCCCGAGCAGGTTGCCCGCTACCAGCGGCTGCTACACCCGTGGGTGAACATGGCGATGGACACCGTGGTCGCGATCGAACCCGAGATCGTCACCGGCATCCGCATCGTTGCTGACTCGCGTACGCCGTAGCCGATTGGCCGCGGGCGGCCCGCACGCATCCGCACTATCTGATAAATTCTTCAACTCGTCAACCGATGTAACGCTGAAGCTCTCAGGAGACGCG", "GCATCCGCGTCCGGGCAACCAGGGACGTTTCGGCCACGTCGCAAGCGACAGCCCAGGGTGCCGGATGCCCAGCCGCTCGCGGTGGCGGGCCGTCCAGAGCCGTCGGCAGCCAGTGACCAGCTGGGGTATCCGCATGGGGTCGCCCAGCGGGTCCCGAGGGGACTTTTGGCCACCGGCGCTGGTGGCCTACTGCCCTCCCGCCGTTGCGCCGGGTGCGTGCACGATTGAAGTCCCCAAGGAAGGGACGCTC", "ACGATAGATCAACCGGACCACCGAGGGTTGATTATTGAGGTGCGCTCATCCGATGGTTCGCCGCCGTATGTGGTGCGCTGGCTCGAGACCGACCATGTGGCGACGGTGATTCCGGGTCCGGATGCGGTCGTGGTCACTGCGGAGGAGCAGAATGCGGCCGACGAGCGGGCGCAGCATCGGTTCGGCGCGGTTCAGTCGGCGATCCTCCATGCCAGGGGAACGTAGGCGATTCGCTCAAGCGACGAAGTCG", "CGATCACCTCAATCCGATGACCTATGAAGCGCGCGGGCCCGGCCGCCATCGGCCCGTCGATCCGAGTGCGCACGGCCGAAGTGAGCCACCACCGTAGCGCCGCCGAGTTCGCTTCCGCGGACGCAAGCCCGGGATTTGCGGAGTAGCGTACAAGGCGGGATGCGCAATCCGGTCGCTGAGGGACTGTCGTCAGTCGGCCCAGGGACCTAATTCCATATTTGGAGCCATGAGATGGCGACACACTCGGCAG", "GCAAACCGTGAGGCTAGGGAAGCGAGGAGCACATGGCCGCCGACCCGCAATGTACACGCTGCAAGCAAACCATCGAACCCGGATGGCTATACATCACCGCCCATCGCCGCGGTCAAGCCGGGATCGTCGATGACGGCGCAGTACTGATTCACGTGCCCGGTGAATGCCCGCACCCCGGGGAGCACGTTCCGCGAAGCTAGCCGGTTGGCATGCTAGGCAACGCAGCGTCAAGTGAGAGGGCAAAACCAAA", "CCGCTGGCGACGCTGTTCGCCGGCAGCGTGCGTGACGACTTCGAGCTGCCCGACTACACCTGGTGACCACCGCCGACGGGCACCTCTCCGCCAGGTAGGCACGGTTTGTCGCCGGCAATGTGACCTTTGGGCGCGGTCTTGAGGACCTTCGGCCCCACCCACGAGGCCGCCGCCGGCCGATCGTATGACGTGCAATGTACGCCATAGGGTGCGTGTTACGGCGATTACCTGAAGGCGGCGGTGGTCCGGA", "TCAGCACCATGACCGCCTGGCCACCAATCGCCCGTAACAAGCGGGACGTCCGCGACGACGCGTGCGCTAGCGCCGTGGCGGTGACAACGACCAGATATGGTCCGAGCACGCGGGCGAACCTCGTGTTCTGGCCTCGGCCAGTTGTGTAGAGCTCATCGCTGTCATCGAGCGATATCCGACCACTGATCCAAGTCGGGGGCTCTGGGGACCGAAGTCCCCGGGCTCGGAGCTATCGGACCTCACGATCACC", "ATGGCGATCCGCTCGATACTCCGCTGGCAGCCCGAGCGGGGGGTGTCCCCTACCCTGGCTGCCCCGGCTGACGGTATTCACCGGTATCCCATGATGCACTCAATCCCGATGCCTGCACCTTCTGTAAGCTCGCGACGGCTAACGGGGTTATCGGATGGCCCGGGCTCGAAGGAGGTTGGCACGTGTGGATGTTCCTGCCCGGTCGCCACGCGGCGGCGGCGATACCCCAGAGTCGTCAGGTGTTTAGCCG", "GGGTCAGGTATATTTATCGCACACTTGGGCACATGACACACAAGCGCCAGAATCCCGGACCGAACCGAGCACCGTGGGTGGGCAGCCTCCATACAGCGATGACCTGATCGATCATCGGCCAGGGCGCCGGGCTTCCAACCGTGGCCGTCTCAGTACCCAGCCTCATTGACCCTTCGACGCATCCACTGCGCGTAAGTCGGCTCAACCCTTTCAAACCGCTGGATTACCGACCGCAGAAAGGGGGCAGGAC", 
"TGTCCGCTGTCGAGTTTCGCGACATGGCCGAGACCTTTGAAGACGAGGAGCACCGGCGCTTTGGCGAGGCCGGTTTTCAATCGGTGGTCGACAAGGTCGCCGATATCGAAAAAAGCCTTGGCATCTACGACCTGAGCCAGTTCACCCCCAGCTAAAGACACTAATGCCCTTGGGTTAGGGACCATCGCCTCCTGACGCGATCGCGACAGCTGGCTAACGTCGGTAGTACACCCATGCAGAGGGGACGCCA", "TCCGGTTATCGTCGCCCGAATCCCCCAAGATCCGGCAGTGCCGGCCTGAGGGCCTGTGCGATCTGCTCGGGTGGTGCCCACCCGCGCGGAAAGCCCCGTCCGAACCGTGATTGGGCAACGTCGGGCCGGGCCAGCAGCGCTGGACCGTAGGTCCCTGCAGTGGATGACTTACGGCCCTGATCCACACCGGCGACCGTTAGGCAGGGTTGAGCCAACCGTCGGTTGAGCGTCTGGCTGCGAGGTGAGGTGA", "TTCGGCGACGGTTGACGTGATGGCCGACAGAATCGTTGCTAGAGCCGGGGGCAACTCCGACGCCACCGCCGAGATCGCGGCGGCCTTGGCCGCCCGGCAAGCGGACTGGGACACCGGGCACCGGATCGACACGGCGGGGCCGCGTGAGCGCTCCGTGGGACAGGCCTACCACATCTGGCGCAGCGCGATCTGAGAACGCCGAAAGGAAAACCGATGCCAACCATCACTGTCAGCAGCACATCGTCGCTGT", "CGTCCATGCCGCGGCGTGCACTCGCGTGGCCTTGTCGACCACGTTGTCGAGGCCGACGATGACAGCGTCGTGGTAGCGCCGGTCGATGGTGGGCGGCGGTCCGGTCACATTGACGACGATGCCACAGCGCACGCTTGGATTCGGGCGTCGAATCCAATCATGGTCGGTTGGCCGTCCGATTGGGGACTAAAGCCTCATGACCGGTGACTGTCCCGGTTCATAGCGTTGCTCGAGGCGAAGGGAGAATGAG", "CACCACGTGGACCACGGTCAGCGGAATGTTCCTCATCGCCGCATCGGTGGCACCCCAACAGGCGGCGGCATCCGATTCGAGCGAACCATCTACCCCGACGACAACTCCGTGCTGCTTGCGGGGTTTAGACATCTCATTCTCCCTTCGCCTCGAGCAACGCTATGAACCGGGACAGTCACCGGTCATGAGGCTTTAGTCCCCAATCGGACGGCCAACCGACCATGATTGGATTCGACGCCCGAATCCAAGC", "TTCCCGCGGATCAGATCTTGACCACCGGGAGTGTCGATGAACTTCTCGCGCTCTTGAAATGACGGGCTATCGTAAGTTTATGGCCTGGGGGAGCGTGAATCCCGCTGGCGGTCGGGTGAACCGCCCCGGTTTTCTTGCACCCCGCGTCGACGTGCCAGTGACGAACTTGACGAATAAGGCCTTTGGTCCTTTCCGGTAGGGGTCTTTGGATAGGCGCGATCCTCGGCATCGGGCCGGTAGCTTGCCGTTT", "GGCCAACTGCACCGCGCTCTTGATGACATCGGTGGTCACCATGGTGTCCGGCATGATCAACCTCCGCTGTTCGATATCACCCCGATCTTTCTGAACGGCGGTTGGCAGACAACAGGGTCAATGGTCCCCAAGTGGATCACCGACGGGCGCGGACAAATGGCCCGCGCTTCGGGGACTTCTGTCCCTAGCCCTGGCCACGATGGGCTGGTCGGATCAAAGGCATCCGTTTCCATCGATTAGGAGGCATCAA", "AAACTCGGGGAAGAGGGACCGCGGGTGGCGCTGAACGGGAAGGGTGGTGGCCATTTGATGCCTCCTAATCGATGGAAACGGATGCCTTTGATCCGACCAGCCCATCGTGGCCAGGGCTAGGGACAGAAGTCCCCGAAGCGCGGGCCATTTGTCCGCGCCCGTCGGTGATCCACTTGGGGACCATTGACCCTGTTGTCTGCCAACCGCCGTTCAGAAAGATCGGGGTGATATCGAACAGCGGAGGTTGATC", "CATCCATCGTGCCGCGCGTCGGCGAGTCCTGTTGATGCGCCACACATTCCGCAGGCATCGTGAACGCTTGACAGCCGTCGCCATTGTCGCGCACAAACCGCACGTCGATTCGTGATCCATTGAGGACCTAAGCCCGTTGGGCTAGTGACAAACGCCTCCTGAGCAAAACCCTCCTCCCCCGTTACCGTCGTGCGGTAGGGACAAGCCACATCGGCCGAGCGGGCGATCAGCCAACGACAGGAGGACCGCG", "GTCGATGCGAACGCAAGCATCCAGGAGATGCTCAACGTCATGGAAGAACATCAGGTCCGCCGTGTTCCGGTCATCTCAGAGCACCGCTTGGTCGGAATCGTCACCGAAGCCGACATCGCCCGACACCTGCCCGAGCACGCCATTGTGCAGTTCGTCAAGGCAATCTGCTCGCCCATGGCCCTCGCCAGCTAGCGACCTCGGATCATCCGCCGGCACGAAAGGCTTTATCCGCCAGCATCGGAGGTACCCA", "CCGATCGGCATCACTATCGGTCCTGCGGCCGCCCATAGCGCTATATCCGGCTGGTGAAATCAATTGACAACCTTCGACTTTGAGGTGGCCTACGGCGAGGACAAGCCAGGCAAGCCAGCTGCCTCAACGCGCGCCAGTACGGGTCCATCGACCCGCGGCCCACGGGTCAAACGACCCTAGTGTTCGCTACGACGTGGTCGTACCTTCGGCAGCAGATCAGCAATAGCACCCCGACTCGAGGAGGATCCCG", "ACCGTCGATGTGCCCGGTCGCGCCGCGTCCACCTCGGTCATCGACCCCACGATGAGGACGCCATCGGCCGCGACCAAGCCCCGTGAAACTCTGACGGCGTGCTGGCCGGGCTGCGGCACCTGATCACCTTAGGGCACTTGGGCCACCACAACGGGCCGCCGGTCTCGACAGTGGCCACCACCACACAGGTGACTTCCGGCGGGACGTAAGTCCCTAACGCGTCGTTCCGCACGCGGTTAGCTTTGCTGCC", "CACCTGTGTGGTGGTGGCCACTGTCGAGACCGGCGGCCCGTTGTGGTGGCCCAAGTGCCCTAAGGTGATCAGGTGCCGCAGCCCGGCCAGCACGCCGTCAGAGTTTCACGGGGCTTGGTCGCGGCCGATGGCGTCCTCATCGTGGGGTCGATGACCGAGGTGGACGCGGCGCGACCGGGCACATCGACGGTCCCCGGGGCTTTGTGGGCCAGTGAAGTGACGAAAGACCCCAGTGGACACGGACTTCGGC", 
"GTGCCACGACGCATAGAAGACATGGCCATGCCACACCCTGATAGCATTGCAGCAAGCTACATGTACTGCTCTACCAGGATCCTTATGGGCAACAGTGGGTTTGAGTTATGAAACCCGTGGGCACATACCCTTCCGCGTCGTACTGGTCAGTCTCGACAGCGAAGAGATCACCGGTTGATCCACCAAGCATGCATTGGCGGGCATCTGCATAAACGGTGACGTATCAGCACAAAACAGCGGAGAGAACAAC", "GCGAGTACGCTAGTTCAGGTGGGTGCCGTGCCGAAGGCGGTGTCACTCAACGAACTTCGGTTCTCGCAGGGTCGCCACGGCTGGCGATGTGCGGTAACGCTCGATGTGTGAATTGAGACCTGATTCATGAAAATCGTCGAGGAGACCCCATACCGGTTCCGGATCGAACAAGAGGGCGCGATGCGGGTGCCCGGGATCGTGTTCGCGTCCAGGTCGTTGCTGCCTCGTGACGAAGGCGACATGGCCCTTG", "CTTGAGGTCCAGTCCGGCGCAGAACACCGGATCGGCGCCGGTGAGGATGACGACGTCGATGTCGTCGTCGGCCTCGGCGTCGGCCAACGCCGCGAAAAACCGATCCCGTAGCGCCGCCGAGAGCGCGTTGCGGGACTGCGGCCGGTTGAGGGTGAGGGTTCGCACCCGTTCGTCGGTGTCGATCAGCAGGATGTCGTCGGTCATTCGATCACCGTAACAGGACCGTTAGACTGTCCTAATGACCAGAAAA", "CGCGCGAGATCTCATCGACAACCCACTTCCCATGCCTCACGACGGTCACCATGTCGCGGGCATATTTACGTGAGGCACCGAGGGTGTTTCGCGGGCATTCTTGGTGAGTCAAGTCGAACGGTTGAGCCATGATCGACGATTCCGTTACCGTGCTGTCAGAAGACGAAAGTTGGCACCGGCTGGGCAGCGTTGCACTCGGTCGGCTAGTTACCACCTTTGCTGATGAGCCTGGGATCTTCCAGTCAATTTC", "GGGGGCCCGGACGGCCAGGGTGAGAACCGTTCGCACGGTTTCGGCGTCCGGGAAATGGGTGTTCATGGCTGCCGGGCCTTTCCCATCAACGGCTGCGGTCATCTATTAAGGATCGCGCGTCGACAGCCGCGGTGGCAGAGCACGAAGGCTCGCCAGCGGAGGACCTTTGGCCCTGCCTCAACGGCTCCTCGCAGCGGAGAGTGGTACTGACGGTTGCAGATAATCCGGATCACCGGGGAAGGCGCTGACC", "ACTCACGTGCCGATCCACGTCTTCTGCCTTGAGAAACCCGGCGTCAAGTGTCGTTAGGTGATTCATGGTCAGCGCCTTCCCCGGTGATCCGGATTATCTGCAACCGTCAGTACCACTCTCCGCTGCGAGGAGCCGTTGAGGCAGGGCCAAAGGTCCTCCGCTGGCGAGCCTTCGTGCTCTGCCACCGCGGCTGTCGACGCGCGATCCTTAATAGATGACCGCAGCCGTTGATGGGAAAGGCCCGGCAGCC", "GTTGGCGCATGTACACCTGAGCCGTCGGCTCGCCCACTGGACCCGGCTCTACCCCGAGGTGCGGGTGGATCGGGCCATCGCCGGCGGCAGTGCGTGCCGTCATCTGGCCGCCAACGCAAAGCCGGGTCAGCTGTTCGTCGCGGACTCACACTCCGCGCACGAATTGTGCGGTGCATACCAGCCCGGATGCGCCGTACTTACGGTACGCAGTGCCAACTTGTAGGGAGCGGATCTTGGGAGTGGTGCCCTG", "GGCAGCTGTCGGCAACTGTAAGCCATTTCTGGGACTTTGCTGTGAAAAGCTGGGCGATGGTTGTGGACCTGGACGAGCCACCCGTGCGATAGGTGAGATTCATTCTCGCCCTGACGGGTTGCGTCTGTCATCGGTCGATAAGGACTAACGGCCCTCAGGTGGGGACCAACGCCCCTGGGAGATAGCGGTCCCCGCCAGTAACGTACCGCTGAACCGACGGGATGTATCCGCCCCAGCGAAGGAGACGGCG", "GTTGTCGATGAGATCTCGCGCGGAGGTCACCAGCAGGTACGCCAAGGCGTATGTGCAGGCTTTGAAGAAGAGCCGGGGCCGGATTTTCGACCAGGTGGTTGACCTGACGGGCTAGTCACGTGATAACGCGCGGCGCCGGCTTGTCGCAGCGGCCAAGCTATCGCCGGGGCTGGGCCGCAGTGTTGCCAAGCGGCGGCGCAAACCGCGTTCGCTGAAGTACTCCTATGACGCGCTGAAGGTGTTGCAGAGG", "TAGAGGGCTCCGACGTGCCGGTGCCAGCCGCCGCGTTCGAAACACAGCCCTAACGACACGCTGCCGAATATGACCCGTGTCGGAAATTAGGGCGACAAGAGTAATGCGGCTCAACATAGCCTTGCTTTACTTAGGCAAACCTGCCTTCAACCAGGAGGTTATTATCATCCTGTGGTAACTAGGAAAGCCTTTCCTGAGTAAGTATTGCCTTCGTTGCATACCGCCCTTTACCTGCGTTAATCTGCATTTT"]
# print(many_runs_gibbs_sampler(dna, k))
# t = 36
# res = (many_runs_gibbs_sampler(dna, k))
# for item in res:
#     print(item, end=' ')
# pattern = 'AGCACC'
# dna = ["AGCGAGGTTTTGGACTACCAGCTTTTAGTCGAGGCTAGGTTTCTCATCACTTTGTCGTACAGGATGTTGGTGTTTGACCGCCGCAGTGCCAGGCTTGTATTG", "GACCTTGGTTTAGTCGGGAAGGGGGTAGTGTATCGACATGCGAGCTGTTGTTACGGAGTTTGGGTAATAATAATTACGGTAGCACCGGGGCCTGTCCGTTGA", "GGCTGGGTCGCTCACTTTAACAGCCAAATAAATGGTTATTTCAATGGTCGCTGGATCGTAATCACACGAAGATGTGATAGGGGGCCTCTGTTTTTTACTAGT", "CTTGGCACCTATGTAAACCTCATTGAAGAAACAATGGCACTACTTCGGAAACGGTAGTTCTGCAAGAGCGCATTAGGAAGGGGAACATTAATGGCAGACTTC", "TCCAGGACCAGGGAGCCAAGGATCCCAGTCTATCATCAACGATCTACTATACTTCCGATTTTGCACTTTCCGTGGGAATCTAGTGCGATGGGAGACCTGAAG", "GCGGAGGTTTGAAGTAGAGCATGTCAAAGCGGTTTGGAGATCGCACCGCGAGGTAAGAGTCTACGAGATGAGATCTCTCTATCTTTTTATTCGCTTCACTCT", "ACACATAGGGCGGCGCCTCTCCGATAAGCCAGCCGTTTTATTTATTATACGTCCCTGTTAACTACCTGCAAGCCCTCAGGGTCGCCCCTTCGGTCGTTACGG", "GTCACGTCTACCAGTCAGGACATTTTCTGGTCCATACCGAACAATCCGTCATGTTATAGCCGTCTCACGGGGGGTCCGAGCCGTTAGCGGATAAAGGAGGTG", "CGCTGCTTCGGCAAGGCGTGTAGGAAACACTCGGTAGCTCCCGGGTTAACGATCAACTCTTACCTCATCATCTCAAGTGGTTGACAGGTTGTCTTATCATAG", "GCGTAGCTAGGCGCCGTAACTGACTGACTCTTCACAAAAGGTCTCAGGAAATGTTCTAGGCCAAATGACTACGATCGCGGAGAGTTCGTGCTTGTCCGACCC", "GAGCATCACAACCAGTTATGTATTGGAGGCACGGCTAGGAGCACACGCTACGCTGTGTAGCATCTCGGGTTGCAGCTAACAGCTCAAAGGTTTGCCCGTTCC", "ACGATATGGGTGAGAAGTGCAGCGAAAGACAAATTGTCTTGCCGCGATTAAACCTCCTGGCGTAGTTCACTCGGCTTTCCAGAGATCATGGACTACAGTGCG", "GCCTAGGAGAACTGCCGGCTCCGCATTCGCGCCTCGGAGTAAACCATTTCCCCATCGGTTGTGCGTGCTAACTGCGGCAAGTACACAAGATGGTCAGTTCAG", "GTACACTACGTGTGAAATCAACCAAAACTCGATGGTACAAAGCTTTATGAGCACTCCAAGCAGCTTTTAAGCAATCTACCACAGATTAGAGGTCCGGCAGGT", "GGAAGATATATCTTTCGGAACCTCTTACAATAATATAGGGCGTTGTCTTTCCACCGGACACGCCTCGAAGATACCTGATAAAGTGAATACCCCCTTCTTAGC", "TTGGTCCTTAAAAATGGACGGGGTGAGCTAGGGGTATCACGGACGTCTAAGGACCGCTTCCCCGACATGTTCGGCAGAGTGGAAATTCGCTGGCATCGTGCA", "GGATCCAGGTGACTTTAGAGACCGTCCACCGGTACGTGGCGCTTAGAGTCAGACTGTTCGTGCCCCTTATTGGAATAAGTTAAATGTACGTGTGCGTGTGCG", "TGCGCAGGCAGTTTCTACTGTCCACTATGGGCCTCCCTCATTACAGAGGGATTAGAGCTGTAGCGCGACTACTATGTTCGATCGATTCCAGTCTTATCGCTA", "GGGGGCTTCGAACCACTTTCATTAAGGGTGACACTGGTTAATCCACATCATGTGCCCTGAGCCCACCTAACTACGCTTCCCCTAGGCCCTTGCGGCCGTGAT", "CTGGCCGAGGTACTAGCCGCTCGCTGCGGTCTTCTGACCCTTCGGGGTCAAATTGAATCCTAGGTAGGCTATGCAACGGGACTCGTGACACAATAGTATATT", "GCGCCGACCCCACAACGTAGTCGTGTTCGACGAACATGACGAAACTGAATAGTGAAGGGCGATGGAAGATCAAGTTCTACCCGTTCTCAGTAGTGTAGAGCA", "TAGGTATCTACGTAGGGGCGGGCCAGCTGCTATATGTTGTTGCTGCGTGGAAAAGCCATGTCTTCCCGGAAATGTAATCGCAGGTACTTTACGTACCCTCAG", "GCCTTCGATGCGTGGTATCGACCGAGAGGACCGCAGAGTGCTATACACATCCTGTCTAATATTAATTACAGTGTGGCTATCGGCACGGCACTTTAATAAAGG", "TAAACAGCAGTTAGTAGGCTAGGGCTCCCGAGTGAGGACGCGAGCGGATGGAGAACCGGCAACGAACAGAACAATAATCCACTTACCGATCAGGTTTCGGTG", "GCAATAGTGTGGTATTAGCACCTAGTGACGCGGCGGTATAAAAGAAATACTACGCTCTTTTATTGTCTGTCCACAGGGCTACTATGACCCATCTAACTAGCA", "AGGACGTCTAAAGAGCACGTTGGCTGTCAACGTGGCAAAAACGAGACGGGTATTATAGAAATCTCTAGACAGCCTTCGCTTCGACGTGAACAGGTCTACATT", "CGGGGAAGTAAAGGTCAGTCGGCTAAAGGCCCCAGAATAAATCCTTGCTGACATAGTTCACTTTGGTAATAGTTCAGGGTATTTATTATGATGTGCGGTCAT", "TTGTCCACTCGCCGTGATAGCCCTAACTGGGGGGAGGACTGTTGGGCTGAGCCCTAGCCGAAGTGCTGTTACGGGGTAAGGTAGGTACATGTGGCATGCCCT", "TGTCGGTCCGCTCTATCGCTTGTCTGGCAGCTCGCAATGCGACGTCTTGTTCGGCACAGCCGTCACTACGGAAGCACCCAGATTGATGTGGTACCGACATAA", "CTTGCTGGATTGCTTGCCGACAATTTTTTCGCAGAATTTAATAGTGATTGCAATTTCGAAAGAGCCAGAAAAGCTGGTAAGATGTTCAGTACTAAGCTTAGG", "ACAGATCTGATGAGGGTCCACCGCACGTTCCGACGCTACTCGTTTTTACGCTGTCCCCCCACAACCCGCCCGACGCCGTGTCCGGGGAGTGGACCACACAAA", "CTTGGCGAGATTCCGTTATACAAGAGCCTTTACGCGAGAGAGGCACTACTACTGACTACCTGGACCCCGAGGGGCTTAGTGGTATAATGCTTACCGCAGCAC", "TGGCCAGATGTGGGTGGGAGGTAAGGACCTGATCCCTAATAGGAAGTCGCCTGATGTTTGGATATGGGTCGATTTGGGTGCCCCCTTTAGTTCCAGGGATAC", 
"AAAACAGTCAGAATCATGGACCGGAATTATAGAAGTGGCCCGTGTATAGCTACTCCGGTAGTAAGCGCTAAGGGCAGCTAGTCGGCAGATAACATACAGTGA"]
# print(distance_between_pattern_and_string(pattern, dna))
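# Hedged sketch of the d(pattern, Dna) quantity the call above relies on: for
# each string in dna, take its closest k-window to pattern (by Hamming
# distance) and sum those minima. The _sketch name is hypothetical; the real
# helper is defined earlier in this file.
def distance_between_pattern_and_string_sketch(pattern, dna):
    k = len(pattern)
    total = 0
    for text in dna:
        # charge only the best-matching window of this string
        best = min(
            sum(1 for a, b in zip(pattern, text[i:i + k]) if a != b)
            for i in range(len(text) - k + 1)
        )
        total += best
    return total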
# dna = ["TAGTTAAGGAGGATCACTGTCTTGGAGTCCTCCGCCCCCAACCCGTGTTGATTCCTACCCCCGCGCAACACACCTGAAATTTGTCTTTCCCGAATGGCGACTCACTATCAGCGCGGATGAAGCGATTATTCCCTAGTAGATTGTGCCCGGATGTCT", "TCGAACGTTTGGTAGTTGACGTGTAGCTGCTTGGACGCCTTGAATGCCCGAAAGCCTGATTGAATCAAAGGTCTCATCCCCAGTGTTAAGAACAGTCCTTTCTTCGAGGCGCTACAATTTGCATAGGCTAGGTTGCTGCTACGCGAGAGTAGAACA", "CGGAATGTGTATCACAACCCAGGTAAAACCTTTTTAAGAAACGCAATGAGTACATTTCGGGAAATGCTTGAATGCGACTCGATTAGTTATCGCGCCCCCTATAATCGGAGAATCTCAGGCGATACTACTAATTTGCGAGGGGTTGGTCGACGAAAA", "GAGGGAGCGTGCCCCATCCCGTGTCCTTTATGGCAGGATGCCTAATGAGAAGCCCTATCCCTTTATTCTCTATTCACATGCGTCTAACGGCAGTGTAAACCAGTCGGCCATTAGCTGCATCACATCCGAAGCGAATTCGTTTTCACTACACCCCTT", "ATAAGACAAAAATGGAGCGATCATGTACTCTCGGATCATCTGTCAGTACGGCGTACCGGGCGATTAAAGATTGAAAATGATATGTCGTGGATCTCTCACTACTGATACCGCGGGATGTACACCCAGTCTAACAAGGGCGACTCTCACATCCCCGGT", "CCGTCACTAGTTTGGCGGACTACCCCATATAAGGTCGCTTTCGTGTAGCTCACAGGTGGCTTGGGATTCCAGGGTAGCTGAATACCCAGCCCGTGTGGCTGATGAGGTTCGCAAAGTTATGGAACTTCTAGCTGGGGCGCTCTGATACAGAATGCC", "GCTCAGACTCAAGGCCAAGGACTCTACATCTACGCTTCAATCCTCCGCCGAGTTAATTTCACTGCTTGGCTTCTCATCCCCCGTCAGGCACATATTGGCTAATAGCACAACGCAATGTGCTATCTCCTCGAAACGCCGGACACTTCTGACCGGCCA", "GAATCGGTGAGGATTTGTAGGGCGAATAACTAATGGTGAGGCGGGGTTTAATATTGTATTATTTCTGAGCACGATCTAAGGTGCGGTAGTGGGCCTAGAGCGGTACGGTAGTAAGTGGCTGTTGGACCAAGTCGCTGTGACTGGCACAGCCCAAGT", "ACAACTACCTAACACGGTACCTTGCGGCATGCGGACTTTAACCTTGGCGGTGGTATTATTTCCCTCCTTCCCTAAAGAGCAAGGAAGCGGGTAATTGGTAGACGCACGTCAACGCCGCACGTTTGCTCATACTGTCTATAATAACACACCCCGTGT", "AAAGCACATAAGCTGTGCAGACGTACATGGATACTATGCCATAATGTCTCCTAGATGCGTGTTATCTCCGGACACACCCCCGGTATCTACACTGTTTAGAGCGCAGCACGGACTCACCGAATGACGCATAAGAAACGGAAAGGATTTTGGTATACA", "GCACTACATCGCAGGTTGCAAGTTACTGCAGGTCCAGCCTCACTGGGTATTCAGAACATTTCCATTGCGAGGACATGGAGCAAACCCTCATAGTGTCCGCGTAGCAACCGAGATGCTCTTTAAAAGACAGCCCGCAACCCTAGTCGTGACCCGCGG", "CTCAGCCCTAGTAGTCCCCCTCCTATTGTATACTTTAAATCTTTCGGCGCAGAGCCACCGTTGACATGGTAACAACGGCATTCGATCCTTACGATCCCGGTCGGCAATTCCAAAGAGACTCGGTAGGCGGCCTAGGCTCTTTCGGAGTCTCTTGCA", "CTCAGCCCTAGTCGGAATCGCAAGACGATAAGCAGTTGTGGACGACTATGGATTGACCAAGCACAGCAAACAACGTAGCTAGAACCTTTTGCCAGGGTCCAGGAGTACGCGCTATCTCCTTGTTAGCCCACAATCAGCAACAGGACCCCTCGAGCA", "CTCTCGCATTCGGTCCGAATTGTCCTCTTCCCGGTGAGGCTCTACAAGAATGTTCAATGAGTTTTGTAAACACATCCTCACCATAAAGGTCCGGTTTCGCGGATTTACCCCATCCCAAGTAGTTTGGATAATTGTCGCCCCGATATCCGTCCATCG", "GTTCTAGTTCGATCTATGGAGTGCTCACAGCTCGTCCTGTTGTAAGGCACTACAGATGTAATTCGATGAACGCTCAACCCCGGTATAAACGACTTTCCACACCTACAAGTATTTTTCGGACCTCCTGCAGGAAGCTTATACTCAGCCGCTACAAAT", "GGTTATTCGGTAAGCCCCCGCCGTATGTGAGGGCGGTCAAAAGTGCAACGGCCGTACCGGCCGTGTGAACAACCCACCCCTGGTACAGGCAGCAGAAAGGTTTCGCCGATTGTAAAGTTCCGAGTCTTTGGATTTAGCCAGCGAGAGCTCGAGATT", "AAGAATTCAACAGAATAGTGTACCAAGAGGACCACGCACAACCCAGGTAAAATTCAGCAAGACACCGAGCGAAGAACTCGACCATGGAAAGTCGCTACTGGTAGGCTTGACCTTCCAGATCTATAATTCCTAGACCCGGGAATGTCACACCCTTGT", "CCCTTGACCGTAATTCTGCCCGGGGACCTCTACGCAAAGAATCAGTTACTCATCCCTAGTGGAAGCCCGTAGTGTTTACGAACGTGAAAATATTAGTTAATGCAGCGACACATTCCCCAGGACGATGCGCGATGCATAAGCTCCCGGGGAGTAGCC", "TACCGGCATAAACACATATTAGCTACAATGCATAATTCTTCTGGACGCACGTTCCTTAGAGACTAAGCTCCCCCTATGAATACACAAGAGGAATCCCGCACCCCGCGTGCCTCACTCGTGTGTAGCAAATCCTCAGCGCCTGCTAGTTCTTCTAGC", "CGCAACCCGTGTAAAAATCCATTTTATCTCCATGTATATAGTAAACCGGGAAGGAGTCGGCTGCAGTGCCGGTACTGGGAGTCCCAAGGTCAGAGTGAGGTCGGACGGGCTACTCCCCCGTTCCGGTTTACTCCGGATTGGCCGAAGATTAAAGGT", "AGAGCTATATGAACCCGTATACAATATAAGAGTATTCTTCCGACGGCTCGATACTTCAAGCCCATCCCACGTATTCCCTACAGGCCACGCTGGTAACTAGTTTACAAATCTCCAAATCGCACGATACCGGTCATGTGGCCTGGCTAGGCAAGCGCG", "CTTAATAGTAACCGCAGCCCCGGTGCTGCAATATACCCAATCCAGTAACGCATCTCATACACGGGCACGTAATCATGACTTGTAACGACGCGCCAATGACAACTCGTTCTCAAATCCGCACATAGTACGCGATTATCGCCGAATTGTCCTGATATT", 
"TGCCCGTATCTCAATTCAGAGCCGCAGGTTAAGCTTCCCACCCCGTGTGGCGCCAAAGCATACCTACCCTGCCGGCCCACCGAGTACTGTTCTCTACCAGGAGAGTGCGCCAATGTGGCGCGCATTCCTAGACGAATAACACTCTCCTGAAATTCT", "GTCGTCTTCTTGAATCGAGGTATGCACAGAAAGATGCGTCCTTACTCATGTAAATCTTAACCTCCAACGGTGTAAGCCCGCTGGCCCATCCCATGTGGCGGTCTAACCCAGAAATGCTTCCAATCCAGGATAGTTCATGTTTTTCTCAACTGCGGG", "GGCCGGTCACGATTTGGTACGATGTTTTTTCGACCGACCTCCACTCTGTCGCAGCTTTTATAGGCTAGACCACGTACGAGGTAACCCAACCCGAGTGCGACCAATTGCTTTCGTGTGGCATGGGACCGGGTAATCCCTAGGAGCTCTCTTCTGTAC"]
# k = 12
# t = 25
# res = greedy_motif_search_with_pseudocounts(dna, k, t)
# for item in res:
#     print(item, end=" ")
# dna = ["TTGAGTAGTATAAATTTCACGACAGACCGACAATCACTCTTA", "TTGATCGGCAGCGTCCTGTTGGTAGACGGATTGAATCCGTTG", "TTGATTTTCGTTACACCAATATCCACCCTGGACTGGAGAGTG", "CAATAATTGATTTCCTACCAATTGCACTATCTGCGGGAGGCT", "GACCCCAAGGTGAATTTTTTTGGCTTGAGTCAAACAGGTTTA", "TCTCTCTATTGGTCCGGTTTATCTATTATCTTGAATGAACAG", "TTGATTAGACTAGGTTGGGAGGAACTAACTTAGTGGATAACT", "CTGCGGTGAGGTAGTTTCTTGAGTCTTCTAACTCTTTGTGGC", "TTGAACGCTTGCGCACATACCAAGTATAACCAAAGCTTGAAT", "GTTATTAGGAGTAGATGGTTGAGTCAATAGTCGTCCGCGGGA"]
# k = 6
# print(median_string(dna, k))
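# Hedged sketch of the median-string search invoked above: among all 4**k
# possible k-mers, keep the one whose summed best-window Hamming distance to
# the collection is smallest (exponential in k, which is fine for the small k
# used in these tests). The _sketch name is hypothetical; the original
# implementation appears earlier in this file.
def median_string_sketch(dna, k):
    from itertools import product

    def hamming(p, q):
        return sum(1 for a, b in zip(p, q) if a != b)

    def total_distance(pattern):
        # for each string, count only its closest window to pattern
        return sum(min(hamming(pattern, text[i:i + k])
                       for i in range(len(text) - k + 1))
                   for text in dna)

    return min((''.join(p) for p in product('ACGT', repeat=k)),
               key=total_distance)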
# dna = ["GCACATCATTAAACGATTCGCCGCATTGCCTCGATTAACC", "TCATAACTGACACCTGCTCTGGCACCGCTCATCCAAGGCC", "AAGCGGGTATAGCCAGATAGTGCCAATAATTTCCTTAACC", "AGTCGGTGGTGAAGTGTGGGTTATGGGGAAAGGCAAGGCC", "AACCGGACGGCAACTACGGTTACAACGCAGCAAGTTAACC", "AGGCGTCTGTTGTTGCTAACACCGTTAAGCGACGAAGGCC", "AAGCTTCCAACATCGTCTTGGCATCTCGGTGTGTTTAACC", "AATTGAACATCTTACTCTTTTCGCTTTCAAAAAAAAGGCC"]
# k = 6
# motifs = best_score_from_random_motif(dna, k)
# for item in motifs:
#     print(item, end=" ")
# dna = ['TTTCTACATCAGACTCTTGTCGGGTCTTCGGGTTTATAGGCC', 'TGCAGGAATTTACTTATCCATCAGTATGTGTCAGGGAAAGTG', 'TAATGGTGTGCGCACCGAGAGACGCACTCTCATCAGCGACTT', 'CAATGTGATCTTCATTAGCCTCATAAGGAACAAGTGGCACTC', 'CATAAGGTAGGTGACCGGTCGCTTTCTGCTGACAGCCCAATA', 'TGCCACTCGATACATCAGTACCCTCCATCCAGTGGGAGTCGC', 'CATAAGGAGATTTTGGCTTGCAGCGGTCGTTTCGCTCGCCCG', 'TCTACGATCCGGTGAGAGCATGAGTACACATAACCGAGTCTA', 'CCTTCAAAGACTATTCCGCATAAGGTCAGATCTTCCTATGGC', 'TACAGACATAAGGGCAGCTCCTTTAAAAAGCCCTCTCTCACT']
# k = 6
# print(median_string(dna, k))
# dna = ['TGGGCAGTTCCACGCTCCAGAGCTTTGTCGCAAAGTGACCTTTCCTCCAACAGGGTCTAGAGCCCCTCGGGCCCCCGATCTGGGCACGATGATTCCATTTACATCATTCTACGTCGTTTTTTTAGCTGTGGAATAAACTGACGGGAATAATAACTC', 'GCGAAACGCCTCTGCCTGGGTATTCGACTTTTTGATTGCTTACGACAACAAAAGGCCTTTAATATTCAACGAGCCCAATATGCGGGGTTTTTTTCTCACCCTAATTTCTATCGGTTGTCCCCACAAGTGCATCGAGCGGTTTGATTCCAAGCTCCT', 'CGCCCGCATCCTACCCCATGTGAGCTGCGTTCCGCGTATCTGAAAGAGTATTTCCAGGTACAAAGAAACGACACGTGATCTCACATGCCTATCGCTAGTCCCCTACTTCTTCGCTGCCTAACGTTAGATCGAATATATTCCTGAGTCAGCCCAATG', 'GGTTGCCCGGATGCGGCGGAGCGTTTTGGTTGACTTCTCTCCGGAGCACTTTATCTCCAGTCCCGATGTGCGTCGACTTTATAGCTAAGCAGTAGACCTACATCCCGGTCATGATAATACAAACTCAAGGCCGTGGTAAGCTCCTTTGACCGTCTT', 'TTTTGTATATCAGAATTACGGTACATCGGGGACCACATCCCGCCATATAGACAAGCTGCGGAAACCCTGTACGTTCTTTTCATTCAGGGTACCGGGCGGAGATTAAACGATCGCCATAGCTCACTCTAACATGGCGCAGCCCCCGCCCAATATGCG', 'TCCAATTCTAGTTAATTACCTAACTGATTACTTGTTCATACTCCAGAGGTAAATCTCGGCCCAAACCAGTGGGGTTTTTAAGTGACCCTATATGCGTCCGGCTCCGGACCATAACGGGGTGGCTCGTGAATGTAGATGCTGCACGACCATGCGCCG', 'ATGGGCGTCGTGAGTGACTAAGTTTATGATGTCGCAACCCCATGTGCGGCCATGGTTGGAAAGGTCCTAGGTAATGGTATCATGCCCATGTTCAAAGGCCGTTTTTGATCGCGACACGAGAGCTCTGAGCAGGCAGCTCTACATTACTTAGCTCAC', 'GCCCAATATGCGATGCTCCCTCATCCGTGAGAAAAAGATATAGGTGGTATTCTAATATGGGGGTATCTTGCTTTCATGTTGATCCTGTGGTCCATCGGCAGCGAACGGGGTTACATCGGTGAGTATCACTTGTTTCCCGATACCTCTCGGCAAGTA', 'CGTCTCTGATCAAGCGTTTCCTATATAACCCAATACTTTTATGGATTCAATACCCGCAGCCAGCGTTAGGTGTTTTCCTACATGAAGCCAGGTTCTGGACTATAGAAGGACATTACGTGTGCAATCTCACTATCCCCATCTGCGCTATTGCAGAAG', 'GAAGTTGACGGCACAGCGACAACGGAACGTCAGGTATCGAGATCCCATGCTATGCACCTGTGGGACTCGTAGCTTATCGAACGGAAGGTTATTCTAGTACCAGCAAGTAGCGAATTACATAGTGCGGTCAGCTCGGGAACAGTCGCCCAATGTGCG', 'GCCCTATCTGAGCACCTATGATCCGGAACTATAGATAGAATTGTGAGTCAATCTTCTAAGGCGTCTTGTGATGTCTAAAATCTCCAGTGAACTGTACGAAGCACAAACAATAAATGTTTGCGTAAGGGTAGCCACAACTGTCGCGTCACCGCCCAT', 'CAATAGCTTCAAATATTTCCCATAAGGGCGCGACACGGTGTCGCATTATGTTAGCCTGCACGCTCGTAGCATGCCCAATATGCGGGAGCTGAAAAAAAACAATAAGCGTATCTGTAATCGAACCATAATCCGCCTTATATGGAGAGAACCCCACAG', 'GCCCAATATGTGCACCGTTCGCACTCTTGTCATTCATGGCGCCGCCGTGCAAGCACGACCGCCTGGACACAAGTGTGGACCCTTTCCCTCGATCGTTGAGTGTCAAATCTTCAAATCTAAAATCTTTAACAGGCATATGAATACTCGCACATGTAC', 'TGACCCAGGCTCGTACTATTCGCCCGTTGCTAATTGCGTTATCGTGTCCGAGATCTGTAATTGAACAAGCAGGAACCCAAGAACTCCCAATCTGTGATGTGACCCCGTGACACTCCAGACCTACGTCCTTAATCCGACCCCAAGGAATTGGCCTCT', 'CCAGCCCCGGTAGTGCATGGGGGGAGACATGCATCTCTCGCCGTACTAGCCCAATCTGAGGAGGCGTGGCCCAGGCGACATGGATCTTAACAAGCGGATAGTATTGATCAGCTTGACTATGTAGGTTAGCCGCGATTCTTCCCAGGTCGTCATCCG', 'AAGGCACTATCGTGTTTTCGCTACGCCCCGGGCGGGGAACTTGATCATGACATATGCCGATCGTATCTTTTATAGGGCTATGCATCCCAATCTGTGGTAGCGGACCTAGAACCTCCGAAGCTCCCACGGAGTGGAATGCGACGGAGTATTAACTGG', 'CACTGAGCCACTGAGACCGCGGGCACCCTATATGCGACACCATGCGACGCTGCCAAACGTACGGAGTCGCGCACTAAGCTTCTCTCGAGAAGGACGCAATACGTGGCAGGTGCGACCCGCCATTCCCTTGCATCTAACTCGAGATTCTTGGGGGTC', 'GATCGACACTAACCCTTTTCGACGAATGCTCTCGCTTGGATCCTTAGCAGAAAGGCATAATGGTATCCTAAAACCCTATATGGGAACACCGGGTGGGCCGTTAAGACTTCAAACATAAGGGGCTTAAGAGTTTGTGCTTTTAGGCAAGCTCACATT', 'AGGAGAAACCTACGGAAGCGCTCCATACCTTCCCGCTTAAATTCTATGATCTAATATGCAACCCAATATGAGGTCAGACAAAGATTGTTTTAGATCACCTCGTTAACGGGCCCTCATCTACTCGCCTCGCCGAAGATTTCGTACATACCGGCCAGC', 'GATCAGAACATCATGTCCACGGGATCACACTCAAAGTTCACCTTTATCTGGCAGGACGCGAGAGGTCGTGTTGTTCTTCTCGCTGCCAAGTACAAGCAGAACATCCCGCCCCTTACTGGCTCCCCATCTGGGGTGCTTTTTTACAGAGAGGACACT', 'GCCCAATGTGGGTTGCCAGCGTCGGGGGGTCGGGTCTGTGGACCAAAGTCACAGTACTTCGACGTCGGCCTTATACGGTCGATCACGCCGATATAACGAGCTCAATCTAAGCCGAGCTACTTTCGGCTTGTTTGTTGCGCTCTCTAACATGAGCGA', 'CCTCCACACCCATAAGGTATTCGTCTCAATGAACTGGAACGGAAGTATCGCTCTCTCATAAAAATACATGTATCTGGAAAGCTCAGGGCACGGGGCTCCCCATCTGCGTTTTACACGTTGAACGCGACGTCTCTTCAAAATTGTTGTTCGAAGCGG', 
'TCGGCGCCATACAGTATGAAAGTTGTTTTTTGAGGGTATCAGATTCGATCGCCGAGGCCTTTAGTAGAAGTACTACGCTTCGCGGGGGCGTTCGCACCCCTATTTGCGAAAGATGAACTGGGAGGTCTCATTCCGGGGCTTCTCAGTTGTATCCGA', 'GCCGCCACGAGTTCTTCCCTGTCTATGATTCACTAGTAGGCAATGCACGCTTGATTGGTCAGACCATGTTAGCCCCAATGTGGGCACGCTCAAACTGATGGTGCCAACCAGGGCTAATGGAGCTTCAGCTTCTATACCTTTCTCGCTGGCGGGCTC', 'TGGATCTTTATGGGGGATAATGATAGAATGGAGACTAACGACCCTACAGCGCGAACTCACGTAGCAGATCGAACGGTGATAGCGACTACGTTTTTTCATCGGATGAAGCATGTTTTACACAAAAGACCGAAGGGGGAAGGCTTAGCCCGATCTGCG']
# t = 25
# k = 12
# res = greedy_motif_search(dna, k, t)
# for item in res:
#     print(item, end=" ")
# text = 'abcdefghi'
# k = 3
# matrix_profile = 'abcd'
# print(profile_most_probable_k_mer(text, k, matrix_profile))
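# The real profile_most_probable_k_mer is defined earlier in this file and its
# profile format is not visible here (the test above only passes placeholder
# strings). A hedged sketch, assuming the profile is a dict mapping each base
# to a list of per-column probabilities:
def profile_most_probable_k_mer_sketch(text, k, profile):
    best_kmer, best_prob = text[:k], -1.0
    for i in range(len(text) - k + 1):
        kmer = text[i:i + k]
        prob = 1.0
        for j, base in enumerate(kmer):
            prob *= profile[base][j]
        if prob > best_prob:
            best_kmer, best_prob = kmer, prob
    return best_kmer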
# dna = ['AAATTGACGCAT', 'GACGACCACGTT', 'CGTCAGCGCCTG', 'GCTGAGCACCGG', 'AGTTCGGGACAG']
# k = 3
# print(median_string(dna, k))
# dna = ['TTACCTTAAC', 'GATATCTGTC', 'ACGGCGTTCG', 'CCCTAAAGAG', 'CGTCAGAGGT']
# pattern = 'AAA'
# print(distance_between_pattern_and_string(pattern, dna))
# input = ["TCATATTTTT",
# "CCCTATCCAC",
# "GGGGGGGGGG",
# "GGGGGGGGGG",
# "GTGGGGGGGG",
# "GGGGGGGGGT",
# "GAAAAAAAAA",
# "TCTCCCTTAT",
# "TTTTTTTTCA",
# "TTTTTTCCTA",
# "TATTCCACAC",
# "TCCTCCTTCC"]
# print(entropy_matrix(input))
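# Hedged sketch of the entropy computation exercised above: score a motif
# matrix by summing, over each column, the Shannon entropy of its base
# frequencies. Whether the original returns the total or per-column values is
# an assumption; this sketch returns the total.
def entropy_matrix_sketch(motifs):
    from math import log2
    total = 0.0
    for column in zip(*motifs):
        for base in 'ACGT':
            p = column.count(base) / len(column)
            if p > 0:
                total -= p * log2(p)
    return total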
# Dna = ["CTCGAACGTGTACTTCCTTCTCCCC", "CCCCCAATCTATTTCGGCCCCGGGC", "CGTCATCGCCCAACTCCTCCCGATT", "ATAGCTGACTGTTTGACCTGGCTCC", "CGCCAACACCGCCTAGCTAAGTGTG", "ACTGTATATCGGCAACACCCACCCC"]
# Dna = ['AAAAA', 'AAAAA', 'AAAAA']
# Dna = ['ACGT', 'ACGT', 'ACGT']
# Dna = ["TCTGAGCTTGCGTTATTTTTAGACC" ,"GTTTGACGGGAACCCGACGCCTATA" ,"TTTTAGATTTCCTCAGTCCACTATA" ,"CTTACAATTTCGTTATTTATCTAAT" ,"CAGTAGGAATAGCCACTTTGTTGTA" ,"AAATCCATTAAGGAAAGACGACCGT"]
# k = 5
# d = 2
# res = list(motif_enumeration1(Dna, k, d))
# for item in res:
#     print(item, end=" ")
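# Hedged sketch of the (k, d) motif enumeration exercised above: a k-mer is
# kept when every string in Dna contains it with at most d mismatches. This
# brute-force variant simply tests all 4**k candidate patterns; the file's own
# motif_enumeration1 may prune via neighbourhoods instead, but the reported
# set is the same.
def motif_enumeration_sketch(dna, k, d):
    from itertools import product

    def hamming(p, q):
        return sum(1 for a, b in zip(p, q) if a != b)

    def occurs(pattern, text):
        return any(hamming(pattern, text[i:i + k]) <= d
                   for i in range(len(text) - k + 1))

    found = set()
    for letters in product('ACGT', repeat=k):
        pattern = ''.join(letters)
        if all(occurs(pattern, text) for text in dna):
            found.add(pattern)
    return found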
# genome = open('Salmonella.txt').read().replace('\n', '')
# # print(minimum_skew_value(genome))
# Text = genome[3764856: 3764856 + 500]
# print(Text)
# k = 9
# d = 1
# print(frequent_words_with_mismatches_and_rc(Text, k, d))
# pattern = 'GCGCCGTAC'
# d = 3
# result = neighbors(pattern, d)
# for item in result:
#     print(item, end=" ")
# print(d1_neighbours("ATC"))
# print(neighbors("ACG", 1))
# with open('approx_pattern_count_data.txt') as f:
#     genome = f.readline()
# pattern = 'TCGGA'
# mismatch_count = 3
# print(get_approximate_pattern_count(genome, pattern, mismatch_count))
# pattern = 'ACGAGCATA'
# mismatch_count = 4
# res = get_approximate_pattern_match(genome, pattern, mismatch_count)
# for item in res:
#     print(item, end=" ")
# with open('hamming_mismatch_genome1_data.txt') as f:
#     genome1 = f.readline()
# with open('hamming_mismatch_genome2_data.txt') as f:
#     genome2 = f.readline()
# print(get_hamming_mismatch(genome1, genome2))
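# Hedged sketch of the Hamming-distance helper used above: count positions at
# which two equal-length strings differ (trailing characters, if the lengths
# differ, are ignored by zip; the file's own helper may treat that case
# differently, as one of the tests below calls it with unequal lengths).
def get_hamming_mismatch_sketch(genome1, genome2):
    return sum(1 for a, b in zip(genome1, genome2) if a != b)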
# with open('data.txt') as f:
#     genome = f.readline()
# print(minimum_skew_value(genome))
# print(get_skew_diag_data(genome))
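# Hedged sketch of a skew computation matching the calls above: walk the
# genome keeping a running #G - #C count and report the positions where that
# count is smallest. Whether the original minimum_skew_value returns positions
# or the minimum value itself is an assumption; this sketch returns positions.
def minimum_skew_positions_sketch(genome):
    skew, minimum, positions = 0, 0, [0]
    for i, base in enumerate(genome, start=1):
        if base == 'G':
            skew += 1
        elif base == 'C':
            skew -= 1
        if skew < minimum:
            minimum, positions = skew, [i]
        elif skew == minimum:
            positions.append(i)
    return positions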
# print(get_hamming_mismatch('GGGCCGTTGGT', 'GGACCGTTGAC'))
# print(get_hamming_mismatch('abscabaxab', 'ab'))
# print(get_pattern_match('abscabaxab', 'ab'))
# print(get_approximate_pattern_count('TTTAGAGCCTTCAGAGG', 'GAGG', 2))
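# Hedged sketch of the approximate-count call above: slide the pattern along
# the text and count windows within the allowed number of mismatches. For the
# example on the previous line ('TTTAGAGCCTTCAGAGG', 'GAGG', 2) this counts 4
# windows. The _sketch name is hypothetical, not the file's own function.
def get_approximate_pattern_count_sketch(text, pattern, max_mismatches):
    k = len(pattern)
    return sum(
        1 for i in range(len(text) - k + 1)
        if sum(1 for a, b in zip(pattern, text[i:i + k]) if a != b) <= max_mismatches
    )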
# import random
| 201.278226 | 9,153 | 0.931486 | 1,004 | 49,917 | 46.104582 | 0.39741 | 0.000864 | 0.00175 | 0.001555 | 0.412539 | 0.405193 | 0.401694 | 0.395213 | 0.392102 | 0.392102 | 0 | 0.001308 | 0.034898 | 49,917 | 247 | 9,154 | 202.093117 | 0.959543 | 0.959633 | 0 | 0 | 0 | 0 | 0.035516 | 0.029967 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0.1 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
9fdaccb9df4cfe288c73d6eaa6653b0fcbfc0585 | 108,794 | py | Python | GUI/sourcecode.py | alokpurohit18/KBC-TheGame | 69fdaad093997dc45a706441f2ed0cf9ff4f654f | [
"MIT"
] | 2 | 2020-04-10T15:07:20.000Z | 2020-04-10T18:32:41.000Z | GUI/sourcecode.py | alokpurohit18/KBC-TheGame | 69fdaad093997dc45a706441f2ed0cf9ff4f654f | [
"MIT"
] | null | null | null | GUI/sourcecode.py | alokpurohit18/KBC-TheGame | 69fdaad093997dc45a706441f2ed0cf9ff4f654f | [
"MIT"
] | null | null | null | import os
from tkinter import *
from tkinter import messagebox
from PIL import ImageTk,Image
from pygame import mixer
from random import *
import random
import mysql.connector
from pathlib import Path
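# The GUI keeps one persistent MySQL connection (hard-coded credentials for a
# local 'kbc' schema) plus two buffered cursors, initialises the pygame mixer
# for the show's sound effects, and resolves `path` to the project root so the
# bundled images and audio files can be located.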
mydb = mysql.connector.connect(host='localhost', user='root', passwd='Alok1823!', database='kbc')
mycursor = mydb.cursor(buffered=True)
localcursor = mydb.cursor(buffered=True)
mixer.init()
path = Path(__file__).parent.parent
screen = Tk()
screen.title("KBC - The Quiz Game")
screen.iconbitmap(os.path.join(path, 'Images and Icons\kbc_logo.ico'))
screen.geometry('1920x1080+0+0')
screen.configure(background='#2e004d')
startframe = Frame(screen, bg='#2e004d')
ruleframe = Frame(screen, bg='#2e004d')
developerframe = Frame(screen, bg='#2e004d')
loginframe = Frame(screen, bg='#2e004d')
accountframe = Frame(screen, bg='#2e004d')
gameframe = Frame(screen, bg='#2e004d')
panel = Label(screen)
filename = os.path.join(path, 'Images and icons\kbc1.png')
iconfile = os.path.join(path, 'Images and icons\login.png')
icon= Image.open(iconfile)
ruleframedestroy = 0
developerframedestroy = 0
loginframedestroy = 0
accountframedestroy = 0
startframedestroy = 0
gameframedestroy=0
username = StringVar()
password = StringVar()
denomination=5000 # RULE CHANGE AFFECTED
optiona=Button()
optionb=Button()
optionc=Button()
optiond=Button()
friend1=StringVar()
friend2=StringVar()
friend3=StringVar()
friend1.set("friend1")
friend2.set("friend2")
friend3.set("friend3")
lifelinebutton1 = Button()
lifelinebutton2 = Button()
lifelinebutton3 = Button()
lifelinebutton4 = Button()
fiftyfiftydone=0
phonedone=0
audiencedone=0
expertdone=0
aactive=1
bactive=1
cactive=1
dactive=1
sameq_flag=0
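# set_background() stretches the shared KBC backdrop (kbc1.png, resized to
# 1600x900) onto whichever frame the *framedestroy flags mark as active, and
# keeps a reference to the PhotoImage so Tkinter does not garbage-collect it.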
def set_background():
global panel
img = Image.open(filename)
img = img.resize((1600, 900), Image.ANTIALIAS)
img = ImageTk.PhotoImage(img)
if ruleframedestroy != 0:
panel = Label(ruleframe, image=img, width=1920, height=1080)
if developerframedestroy != 0:
panel = Label(developerframe, image=img, width=1920, height=1080)
if loginframedestroy != 0:
panel = Label(loginframe, image=img, width=1920, height=1080)
if accountframedestroy != 0:
panel = Label(accountframe, image=img, width=1920, height=1080)
if startframedestroy!=0:
panel = Label(startframe, image=img, width=1920, height=1080)
if gameframedestroy!=0:
panel = Label(gameframe, image=img, width=1920, height=1080)
panel.image = img
panel.pack(side='top')
def set_icon():
global icon
icon= Image.open(iconfile)
icon = ImageTk.PhotoImage(icon)
def lifeline_icon():
global icon
icon = Image.open(iconfile)
icon = icon.resize((60, 48), Image.ANTIALIAS)
icon = ImageTk.PhotoImage(icon)
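# startingwindow() is the main menu: it tears down whichever frame is showing,
# replays the title tune and rebuilds the LOGIN / RULES / CREATE ACCOUNT /
# DEVELOPER INFORMATION buttons on a fresh start frame.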
def startingwindow():
global startframe, ruleframedestroy, developerframedestroy, loginframedestroy, accountframedestroy,startframedestroy, gameframedestroy
global username, password, filename, iconfile
username.set('')
password.set('')
if gameframedestroy!=0:
gameframe.destroy()
if ruleframedestroy != 0:
ruleframe.destroy()
if developerframedestroy != 0:
developerframe.destroy()
if loginframedestroy != 0:
loginframe.destroy()
if accountframedestroy != 0:
accountframe.destroy()
ruleframedestroy = 0
developerframedestroy = 0
loginframedestroy = 0
accountframedestroy = 0
gameframedestroy = 0
mixer.music.load(os.path.join(path, 'Images and icons\kbc_tune.mp3'))
mixer.music.play()
startframe = Frame(screen, bg='#2e004d')
startframe.pack()
startframedestroy = 1
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
iconfile= os.path.join(path, 'Images and icons\login.png')
set_icon()
loginbutton = Button(panel, text='LOGIN ',image=icon,compound=RIGHT, font='Arial 38 bold', bg='black', fg='white', border=5,
command=loginwindow1)
loginbutton.image=icon
loginbutton.pack(side='top', pady=160, padx=50, ipadx=15)
rulebutton = Button(panel, text='RULES OF THE\nGAME', font='Arial 36 bold', bg='black', fg='white', border=5,
command=rulewindow1)
rulebutton.pack(side='left', pady=130, padx=80)
createuser = Button(panel, text='CREATE\nACCOUNT', font='Arial 36 bold', bg='black', fg='white', border=5,
command=createaccount)
createuser.pack(side='left', pady=130, padx=40)
infobutton = Button(panel, text='DEVELOPER\n INFORMATION ', font='Arial 36 bold', bg='black', fg='white',
border=5, command=developerinfo)
infobutton.pack(side='left', pady=130, padx=80)
def rulewindow1():
global ruleframedestroy, ruleframe, filename , startframedestroy
if startframedestroy == 1:
startframe.destroy()
startframedestroy = 0
if ruleframedestroy == 1:
ruleframe.destroy()
ruleframe = Frame(screen, bg='#2e004d')
ruleframe.pack()
ruleframedestroy = 1
counter = 1
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
heading = Label(panel, bg='#2e004d', fg='white', text='RULES', font='Arial 36 bold underline')
heading.pack(side='top', anchor=W, padx=70, pady=30)
mycursor.execute("select rdescription from rule")
for i in mycursor:
rulemessage = Label(panel, bg='#2e004d', fg='yellow', text=str(counter) + '.' + str(i)[2:-3],
font='Arial 16 bold')
rulemessage.pack(side='top', anchor=W, padx=70, pady=20)
counter = counter + 1
backbutton = Button(panel, text='BACK', font='Arial 32 bold', bg='black', fg='white', border=5,
command=startingwindow)
backbutton.pack(side='left', anchor=W, padx=80, pady=100)
nextbutton = Button(panel, text='NEXT', font='Arial 32 bold', bg='black', fg='white', border=5,
command=rulewindow2)
nextbutton.pack(side='left', anchor=E, padx=40, pady=100)
def rulewindow2():
global ruleframe, filename
ruleframe.destroy()
ruleframe = Frame(screen, bg='#2e004d')
ruleframe.pack()
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
waste = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
heading = Label(panel, bg='#2e004d', fg='white', text='CHECKPOINTS', font='Arial 36 bold underline')
heading.pack(side='top', anchor=W, pady=20, padx=120)
mycursor.execute("select distinct(denomination) from question order by denomination desc")
j = int(65 / 5) # RULE CHANGE AFFECTED
for i in mycursor:
if i == (10000,) or i == (320000,) or i == (10000000,): # RULE CHANGE AFFECTED
rulemessage = Label(panel, bg='#2e004d', fg='yellow', text=str(j) + ' # ' + (str(i)[1:-2]),
font='Arial 24 bold')
rulemessage.pack(side='top', anchor=W, padx=180)
j = j - 1
else:
rulemessage = Label(panel, bg='#2e004d', fg='white', text=str(j) + ' # ' + (str(i)[1:-2]),
font='Arial 24 bold')
rulemessage.pack(side='top', anchor=W, padx=180)
j = j - 1
backbutton = Button(panel, text='BACK', font='Arial 32 bold', bg='black', fg='white', border=5,
command=rulewindow1)
backbutton.pack(side='left', anchor=W, pady=35,padx=80)
nextbutton = Button(panel, text='NEXT', font='Arial 32 bold', bg='black', fg='white', border=5,
command=rulewindow3)
nextbutton.pack(side='left', anchor=W, pady=35)
def rulewindow3():
global ruleframe, filename, iconfile, icon
ruleframe.destroy()
ruleframe = Frame(screen, bg='#2e004d')
ruleframe.pack()
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
waste = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
heading = Label(panel, bg='#2e004d', fg='white', text='LIFELINES', font='Arial 36 bold underline')
heading.pack(side='top',padx=180, pady=42,anchor=W)
j = 1
mycursor.execute("select lname from lifeline")
for i in mycursor:
if j == 1:
iconfile = os.path.join(path, 'Images and icons\Audiencepoll.png')
icon = Image.open(iconfile)
icon = icon.resize((60, 48), Image.ANTIALIAS)
icon = ImageTk.PhotoImage(icon)
lifelinebutton1 = Button(panel, text=' '+str(i)[2:-3]+' ',image=icon,compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow',
border=5, command=lifelinedescription1)
lifelinebutton1.image = icon
lifelinebutton1.pack(side='top',padx=150, pady=20,anchor=W)
elif j == 2:
iconfile = os.path.join(path, 'Images and icons\phoneafriend.png')
icon = Image.open(iconfile)
icon = icon.resize((60, 48), Image.ANTIALIAS)
icon = ImageTk.PhotoImage(icon)
lifelinebutton2 = Button(panel, text=' '+str(i)[2:-3]+' ',image=icon,compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow',
border=5, command=lifelinedescription2)
lifelinebutton2.image=icon
lifelinebutton2.pack(side='top',padx=150, pady=20,anchor=W)
elif j == 3:
iconfile = os.path.join(path, 'Images and icons\expertadvice.png')
icon = Image.open(iconfile)
icon = icon.resize((65, 52), Image.ANTIALIAS)
icon = ImageTk.PhotoImage(icon)
lifelinebutton3 = Button(panel, text=' '+str(i)[2:-3]+' ',image=icon,compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow',
border=5, command=lifelinedescription3)
lifelinebutton3.image=icon
lifelinebutton3.pack(side='top',padx=150, pady=20,anchor=W)
else:
iconfile = os.path.join(path, 'Images and icons\\fifty-fifty.png')
icon = Image.open(iconfile)
icon = icon.resize((60, 48), Image.ANTIALIAS)
icon = ImageTk.PhotoImage(icon)
lifelinebutton4 = Button(panel, text=' '+str(i)[2:-3]+' ',image=icon,compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow',
border=5, command=lifelinedescription4)
lifelinebutton4.image=icon
lifelinebutton4.pack(side='top',padx=150, pady=20,anchor=W)
j = j + 1
backbutton = Button(panel, text='BACK', font='Arial 32 bold', bg='black', fg='white', border=5,
command=rulewindow2)
backbutton.pack(side='left', pady=70, padx=85,anchor=W)
iconfile = os.path.join(path, 'Images and icons\home.png')
set_icon()
homebutton = Button(panel, text=' HOME ',image=icon,compound=LEFT, font='Arial 32 bold', bg='black', fg='white', border=5,
command=startingwindow)
homebutton.image=icon
homebutton.pack(side='left',ipadx=10,ipady=10, pady=75,anchor=W)
def rulewindow4():
global ruleframe,filename
ruleframe.destroy()
ruleframe = Frame(screen, bg='#2e004d')
ruleframe.pack()
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
waste = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
backbutton = Button(panel, text='BACK', font='Arial 32 bold', bg='black', fg='white', border=5,command=rulewindow3)
backbutton.pack(side='bottom', anchor=W, padx=60, pady=250)
heading = Label(panel, bg='#2e004d', fg='white', text='LIFELINE DESCRIPTION', font='Arial 36 bold underline')
heading.pack(side='top', anchor=W, pady=50, padx=60)
def lifelinedescription1():
rulewindow4()
mycursor.execute("select ldescription from lifeline where lnumber=1")
for i in mycursor:
rulemessage1 = Label(panel, bg='#2e004d', fg='yellow', text='1.' + str(i)[2:100],
font='Arial 22 bold') # RULE CHANGE AFFECTED
rulemessage1.pack(side='top', anchor=W, padx=40)
rulemessage2 = Label(panel, bg='#2e004d', fg='yellow', text='2.' + str(i)[100:-3],
font='Arial 22 bold') # RULE CHANGE AFFECTED
rulemessage2.pack(side='top', anchor=W, padx=40)
def lifelinedescription2():
rulewindow4()
mycursor.execute("select ldescription from lifeline where lnumber=2")
for i in mycursor:
rulemessage1 = Label(panel, bg='#2e004d', fg='yellow', text='1.' + str(i)[2:96],
font='Arial 22 bold') # RULE CHANGE AFFECTED
rulemessage1.pack(side='top', anchor=W, padx=50)
rulemessage2 = Label(panel, bg='#2e004d', fg='yellow', text='2.' + str(i)[96:-3],
font='Arial 22 bold') # RULE CHANGE AFFECTED
rulemessage2.pack(side='top', anchor=W, padx=50)
def lifelinedescription3():
rulewindow4()
mycursor.execute("select ldescription from lifeline where lnumber=3")
for i in mycursor:
rulemessage1 = Label(panel, bg='#2e004d', fg='yellow', text='1.' + str(i)[2:105],
font='Arial 22 bold') # RULE CHANGE AFFECTED
rulemessage1.pack(side='top', anchor=W, padx=50)
rulemessage2 = Label(panel, bg='#2e004d', fg='yellow', text='2.' + str(i)[106:-3],
font='Arial 22 bold') # RULE CHANGE AFFECTED
rulemessage2.pack(side='top', anchor=W, padx=50)
def lifelinedescription4():
rulewindow4()
mycursor.execute("select ldescription from lifeline where lnumber=4")
for i in mycursor:
rulemessage1 = Label(panel, bg='#2e004d', fg='yellow', text='1.' + str(i)[2:61],
font='Arial 22 bold') # RULE CHANGE AFFECTED
rulemessage1.pack(side='top', anchor=W, padx=50)
rulemessage2 = Label(panel, bg='#2e004d', fg='yellow', text='2.' + str(i)[61:-3],
font='Arial 22 bold') # RULE CHANGE AFFECTED
rulemessage2.pack(side='top', anchor=W, padx=50)
def developerinfo():
global developerframedestroy,startframedestroy, developerframe , filename
developerframedestroy = 1
startframe.destroy()
startframedestroy=0
developerframe = Frame(screen, bg='#2e004d')
developerframe.pack()
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
waste = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 10 bold underline')
waste.pack(side='top', anchor=W,padx=840)
heading = Label(panel, bg='#2e004d', fg='white', text='DEVELOPER INFORMATION',
font='Arial 36 bold underline')
heading.pack(side='top', anchor=W, pady=50, padx=30)
developer1 = Label(panel, bg='#2e004d', fg='yellow',
text='NAME - ALOK PUROHIT\nPHONE - 7718064605\nEMAIL - alokpurohit18@gmail.com',
font='Arial 28 bold')
developer1.pack(side='top', pady=30,anchor=W,padx=45)
developer2 = Label(panel, bg='#2e004d', fg='yellow',
text='NAME - ANIKET RAMAN\nPHONE - 9769910607\nEMAIL - aniketraman@hotmail.com',
font='Arial 28 bold')
developer2.pack(side='top',anchor=W,padx=45)
backbutton = Button(panel, text='BACK', font='Arial 32 bold', bg='black', fg='white', border=5,
command=startingwindow)
backbutton.pack(side='top', pady=110,padx=250,anchor=W)
def loginwindow1():
global loginframe, loginframedestroy,startframedestroy,filename,iconfile
startframe.destroy()
startframedestroy=0
loginframe = Frame(screen, bg='#2e004d')
loginframe.pack()
loginframedestroy = 1
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
waste = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
waste2 = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste2.pack(side='bottom', anchor=W, padx=840)
heading = Label(panel, bg='#2e004d', fg='white', text='LOGIN PAGE', font='Arial 36 bold underline')
heading.pack(side='top',padx=150, pady=50,anchor=W)
text1 = Label(panel, bg='#2e004d', fg='yellow', text='USERNAME', font='Arial 24 bold')
text1.pack(side='top',padx=210, pady=40,anchor=W)
usernameentry = Entry(panel, textvariable=username, bg='white', fg='black', font='Arial 24 bold')
usernameentry.pack(side='top',anchor=W,padx=120)
text2 = Label(panel, bg='#2e004d', fg='yellow', text='PASSWORD', font='Arial 24 bold')
text2.pack(side='top', pady=30,anchor=W,padx=210)
passwordentry = Entry(panel, textvariable=password, bg='white', fg='black', font='Arial 24 bold', show='*')
passwordentry.pack(side='top',anchor=W,padx=120)
iconfile = os.path.join(path, 'Images and icons\login.png')
set_icon()
loginbutton = Button(panel, text=' LOGIN ',image=icon,compound=LEFT, font='Arial 30 bold', bg='black', fg='white', border=5,
command=loginwindow2)
loginbutton.image = icon
loginbutton.pack(side='top', padx=200, pady=60,anchor=W)
backbutton = Button(panel, text='BACK', font='Arial 32 bold', bg='black', fg='white', border=5,
command=startingwindow)
backbutton.pack(side='left', padx=90,anchor=W,pady=20)
createaccountbutton = Button(panel, text='CREATE\nACCOUNT', font='Arial 22 bold', bg='black', fg='white',
border=5, command=createaccount)
createaccountbutton.pack(side='left',anchor=W,pady=20)
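# loginwindow2(): checks the entered credentials against the player table.
# flag = 0 -> username not found, 1 -> username found but wrong password, 2 -> login successful (opens accountwindow).
# Note: queries throughout this file interpolate values with str.format(); parameterized
# execute(sql, params) would be the safer pattern, but the original style is kept here.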
def loginwindow2():
flag = 0
mycursor.reset()
if username.get() == '' or password.get() == '':
messagebox.showinfo("KBC - The Quiz Game", "ALL FIELDS ARE REQUIRED.")
else:
mycursor.execute("select username,password from player")
for i in mycursor:
if i[0] == username.get() and i[1] == password.get():
messagebox.showinfo("KBC - The Quiz Game", "LOGIN SUCCESSFUL.")
flag = 2
break
elif i[0] == username.get() and i[1] != password.get():
messagebox.showinfo("KBC - The Quiz Game", "INVALID PASSWORD.")
flag = 1
break
else:
continue
if flag == 0:
messagebox.showinfo("KBC - The Quiz Game", "INVALID USERNAME.")
elif flag == 1:
pass
else:
accountwindow()
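# createaccount(): first page of a two-step sign-up (full name, username, gender);
# nextwindow() checks that the username is unused, then collects age, e-mail and password
# before validateaccount() inserts the new player row.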
def createaccount():
global accountframedestroy, accountframe,startframedestroy,loginframedestroy,filename
user = StringVar()
passwd = StringVar()
age = IntVar()
name = StringVar()
gender = StringVar()
email = StringVar()
localpassword = StringVar()
age.set(12)
if accountframedestroy == 1:
accountframe.destroy()
elif loginframedestroy == 0:
startframe.destroy()
startframedestroy=0
else:
loginframe.destroy()
loginframedestroy=0
accountframe = Frame(screen, bg='#2e004d')
accountframe.pack()
accountframedestroy = 1
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
file = os.path.join(path, 'Images and icons\kbc.png')
img = Image.open(file)
img = img.resize((600, 300), Image.ANTIALIAS)
img = ImageTk.PhotoImage(img)
waste = Label(accountframe, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
panel = Label(accountframe, image=img, width=600, height=300)
panel.image = img
panel.pack(side='right', padx=120,anchor=W)
heading = Label(accountframe, bg='black', fg='white', text='NEW ACCOUNT', font='Arial 36 bold underline')
heading.pack(side='top', pady=40,anchor=W,padx=150)
text1 = Label(accountframe, bg='#2e004d', fg='yellow', text='FULL NAME', font='Arial 22 bold')
text1.pack(side='top', pady=30, padx=250,anchor=W)
nameentry = Entry(accountframe, textvariable=name, bg='white', fg='black', font='Arial 22 bold')
nameentry.pack(side='top', padx=170,anchor=W)
text2 = Label(accountframe, bg='#2e004d', fg='yellow', text='USERNAME', font='Arial 22 bold')
text2.pack(side='top', pady=30, padx=250,anchor=W)
usernameentry = Entry(accountframe, textvariable=user, bg='white', fg='black', font='Arial 22 bold')
usernameentry.pack(side='top', padx=170,anchor=W)
text3 = Label(accountframe, bg='#2e004d', fg='yellow', text='GENDER', font='Arial 22 bold')
text3.pack(side='top', padx=270, pady=30,anchor=W)
genderentry = OptionMenu(accountframe, gender, 'MALE', 'FEMALE', 'OTHER')
genderentry.configure(font='Arial 22 bold', bg='white', fg='black')
genderentry.pack(side='top', padx=270,anchor=W)
def nextwindow():
localcursor.reset()
flag = 0
if user.get() == '' or name.get() == '' or gender.get() == '':
messagebox.showinfo("KBC - The Quiz Game", "ALL FIELDS ARE REQUIRED.")
else:
localcursor.execute("select username from player")
for i in localcursor:
if str(i)[2:-3] == user.get():
flag = 1
break
if flag == 1:
messagebox.showinfo("KBC - The Quiz Game",
"USERNAME ALRAEDY EXISTS. PLEASE CHOOSE ANOTHER ONE.")
else:
global accountframe
accountframe.destroy()
accountframe = Frame(screen, bg='#2e004d')
accountframe.pack()
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
file = os.path.join(path, 'Images and icons\kbc.png')
img = Image.open(file)
img = img.resize((600, 300), Image.ANTIALIAS)
img = ImageTk.PhotoImage(img)
waste = Label(accountframe, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
panel = Label(accountframe, image=img, width=600, height=300)
panel.image = img
panel.pack(side='right', padx=120, anchor=W)
heading = Label(accountframe, bg='black', fg='white', text='NEW ACCOUNT',
font='Arial 36 bold underline')
heading.pack(side='top', pady=25,anchor=W,padx=140)
text1 = Label(accountframe, bg='#2e004d', fg='yellow', text='AGE', font='Arial 22 bold')
text1.pack(side='top', pady=20,anchor=W,padx=290)
ageentry = Entry(accountframe, textvariable=age, bg='white', fg='black', font='Arial 22 bold')
ageentry.pack(side='top',anchor=W,padx=160)
text2 = Label(accountframe, bg='#2e004d', fg='yellow', text='EMAIL', font='Arial 22 bold')
text2.pack(side='top', pady=20,anchor=W,padx=280)
emailentry = Entry(accountframe, textvariable=email, bg='white', fg='black', font='Arial 22 bold')
emailentry.pack(side='top',anchor=W,padx=160)
text3 = Label(accountframe, bg='#2e004d', fg='yellow', text='PASSWORD', font='Arial 22 bold')
text3.pack(side='top', pady=20, padx=240,anchor=W)
passwordentry = Entry(accountframe, textvariable=passwd, bg='white', fg='black', font='Arial 22 bold',
show='*')
passwordentry.pack(side='top', padx=160,anchor=W)
text4 = Label(accountframe, bg='#2e004d', fg='yellow', text='CONFIRM\nPASSWORD', font='Arial 22 bold')
text4.pack(side='top', padx=240, pady=20,anchor=W)
passwordentry2 = Entry(accountframe, textvariable=localpassword, bg='white', fg='black',
font='Arial 22 bold', show='*')
passwordentry2.pack(side='top', padx=160,anchor=W)
backbutton = Button(accountframe, text='BACK', font='Arial 32 bold', bg='black', fg='white', border=5,
command=createaccount)
backbutton.pack(side='left', padx=90, pady=40,anchor=W,ipady=5)
createaccountbutton = Button(accountframe, text='CREATE\nACCOUNT', font='Arial 24 bold', bg='black',
fg='white', border=5, command=validateaccount)
createaccountbutton.pack(side='left', padx=10, pady=40,anchor=W)
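# validateaccount(): rejects empty fields, an age below 12, an e-mail that is already registered
# and mismatched passwords, then inserts the player with highest_score = 0 and returns to startingwindow().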
def validateaccount():
mycursor.reset()
localcursor.reset()
k=[0]
localcursor.execute("select email from player")
for i in localcursor:
if email.get()==str(i)[2:-3]:
k[0]=1
break
if email.get() == '' or passwd.get() == '' or localpassword.get() == '':
messagebox.showinfo("KBC - The Quiz Game", "ALL FIELDS ARE REQUIRED.")
elif age.get() < 12:
messagebox.showinfo("KBC - The Quiz Game", "MINIMUM AGE REQUIRED FOR THIS GAME IS 12.")
elif k[0]==1:
messagebox.showinfo("KBC - The Quiz Game", "THERE IS ALREADY AN ACCOUNT WITH THIS EMAIL ID.")
elif passwd.get() != localpassword.get():
messagebox.showinfo("KBC - The Quiz Game", "PLEASE ENTER SAME PASSWORDS.")
else:
messagebox.showinfo("KBC - The Quiz Game", "ACCOUNT SUCCESSFULLY CREATED.")
if gender.get() == 'MALE':
g = 'M'
elif gender.get() == 'FEMALE':
g = 'F'
else:
g = 'O'
mycursor.execute(
"insert into player (username, password, age , name , gender, email, highest_score) value ('{}', '{}', '{}', '{}', '{}', '{}',0);".format(
user.get(), passwd.get(), age.get(), name.get(), g, email.get()))
mydb.commit()
startingwindow()
def previouswindow():
user.set('')
name.set('')
gender.set('')
startingwindow()
backbutton = Button(accountframe, text=' BACK', font='Arial 32 bold', bg='black', fg='white', border=5,anchor=W,
command=previouswindow)
backbutton.pack(side='left', padx=90, pady=60,ipady=5)
nextbutton = Button(accountframe, text=' NEXT', font='Arial 33 bold', bg='black', fg='white', border=5,anchor=W,
command=nextwindow)
nextbutton.pack(side='left', padx=20, pady=60,ipady=5)
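# accountwindow(): the player's home screen after login; resets the prize ladder to RS 5000
# and offers START GAME, HIGH SCORES and LOGOUT.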
def accountwindow():
global gameframedestroy,filename,loginframedestroy,gameframe,iconfile,denomination
mycursor.reset()
denomination=5000
if loginframedestroy==1:
loginframe.destroy()
loginframedestroy=0
if gameframedestroy==1:
gameframe.destroy()
gameframe = Frame(screen, bg='#2e004d')
gameframe.pack()
gameframedestroy=1
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
name = ['']
mycursor.execute("select name from player where username = '{}'".format(username.get()))
for i in mycursor:
name[0] = str(i)[2:-3]
waste = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
iconfile = os.path.join(path, 'Images and icons\logout.png')
set_icon()
userheading = Label(panel, bg='#2e004d', fg='yellow', text=username.get()+"\n"+name[0], font='Arial 24 bold',border=10)
userheading.pack(side='left', anchor=NW,padx=20, pady=20)
logoutbutton = Button(panel, text=' LOGOUT ', image=icon,compound=LEFT, font='Arial 24 bold', bg='black', fg='white', border=5,command=startingwindow)
logoutbutton.image = icon
logoutbutton.pack(side='right', padx=20, anchor=NE, pady=20)
startgamebutton = Button(panel, text='START\nGAME', font='Arial 42 bold', bg='black',
fg='white', border=5,command=startgame1)
startgamebutton.pack(side='left',anchor=NW, padx=60, pady=320,ipadx=20)
highscorebutton = Button(panel, text='HIGH\nSCORES', font='Arial 42 bold', bg='black',
fg='white', border=5,command=highscores)
highscorebutton.pack(side='right', padx=60, pady=320,ipadx=20)
def highscores():
global filename, loginframedestroy, gameframe, iconfile
mycursor.reset()
gameframe.destroy()
gameframe = Frame(screen, bg='#2e004d')
gameframe.pack()
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
waste = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
iconfile = os.path.join(path, 'Images and icons\logout.png')
set_icon()
mycursor.reset()
name = ['']
mycursor.execute("select name from player where username = '{}'".format(username.get()))
for i in mycursor:
name[0] = str(i)[2:-3]
score = ['']
mycursor.reset()
mycursor.execute("select highest_score from player where username = '{}'".format(username.get()))
for i in mycursor:
score[0] = str(i)[1:-2]
userheading = Label(panel, bg='#2e004d', fg='yellow', text=username.get() + "\n" + name[0], font='Arial 24 bold',border=10)
userheading.pack(side='left', anchor=NW, padx=20, pady=20)
logoutbutton = Button(panel, text=' LOGOUT ', image=icon, compound=LEFT, font='Arial 24 bold', bg='black',fg='white', border=5, command=startingwindow)
logoutbutton.image = icon
logoutbutton.pack(side='right', padx=20, anchor=NE, pady=20)
localhighscore = Label(panel, bg='black', fg='yellow', text="YOUR HIGH SCORE = "+score[0], font='Arial 34 bold', border=10)
localhighscore.pack(side='top', padx=80, pady=180)
mycursor.reset()
mycursor.execute("select max(highest_score) from player")
for i in mycursor:
score[0] = str(i)[1:-2]
globalhighscore = Label(panel, bg='black', fg='yellow', text="GLOBAL HIGH SCORE = " + score[0], font='Arial 34 bold',border=10)
globalhighscore.pack(side='top', padx=80,pady=20)
backbutton = Button(panel, text='BACK', font='Arial 32 bold', bg='black', fg='white', border=5,
command=accountwindow)
backbutton.pack(side='top', padx=80, pady=90)
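# startgame2(): rebuilds the game frame for the next question and passes the three friend names
# through to displayquestion() so the Phone-A-Friend lifeline can use them.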
def startgame2(friend1,friend2,friend3):
global filename, gameframe, iconfile
gameframe.destroy()
gameframe = Frame(screen, bg='#2e004d')
gameframe.pack()
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
iconfile = os.path.join(path, 'Images and icons\quit.png')
set_icon()
waste = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
mycursor.reset()
name = ['']
mycursor.execute("select name from player where username = '{}'".format(username.get()))
for i in mycursor:
name[0] = str(i)[2:-3]
userheading = Label(panel, bg='#2e004d', fg='yellow', text=username.get() + "\n" + name[0],
font='Arial 24 bold',
border=10)
userheading.pack(side='left', anchor=NW, padx=20, pady=20)
quitbutton = Button(panel, text=' QUIT ', image=icon, compound=LEFT, font='Arial 24 bold', bg='black',
fg='white',
border=5, command=quitgame)
quitbutton.image = icon
quitbutton.pack(side='right', padx=40, anchor=NE, pady=40)
displayquestion(friend1,friend2,friend3)
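# startgame1(): resets the four lifeline flags and asks the player for the names of the
# three friends available for Phone-A-Friend before the first question is shown.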
def startgame1():
global gameframe,phonedone,fiftyfiftydone,audiencedone,expertdone
phonedone=0
fiftyfiftydone=0
audiencedone=0
expertdone=0
mycursor.reset()
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
gameframe.destroy()
gameframe = Frame(screen, bg='#2e004d')
gameframe.pack()
waste = Label(gameframe, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
userheading = Label(gameframe, bg='#2e004d', fg='yellow', text='PLEASE ENTER THE NAMES OF THE 3 FRIENDS\nWHOM YOU WOULD WANT TO CALL FOR HELP.', font='Arial 32 bold',
border=10)
userheading.pack(side='top', padx=100, pady=40)
friendentry1 = Entry(gameframe, bg='white', textvariable=friend1, fg='black', font='Arial 22 bold')
friendentry1.pack(side='top', padx=170, pady=40)
friendentry2 = Entry(gameframe, bg='white', textvariable=friend2, fg='black', font='Arial 22 bold')
friendentry2.pack(side='top', padx=170, pady =40)
friendentry3 = Entry(gameframe, bg='white', textvariable=friend3, fg='black', font='Arial 22 bold')
friendentry3.pack(side='top', padx=170, pady=40)
okbutton = Button(gameframe, text=' OK ', font='Arial 32 bold', bg='black', fg='white',
border=5, command=lambda :startgame2(friend1.get(),friend2.get(),friend3.get()))
okbutton.pack(side='top', padx=250, pady=40)
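# displayquestion(): picks the first question of the current denomination that this player has not been
# asked before (tracked in the gets table), logs it in gets and uses, then draws the question text and
# the four option buttons. Checkpoint amounts (10000, 320000, 10000000) are highlighted in yellow.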
def displayquestion(friend1,friend2,friend3):
global denomination,optiona,optionb,optionc,optiond,sameq_flag
sameq_flag=0
if denomination==10000 or denomination==320000 or denomination==10000000:
userheading = Label(panel, bg='black', fg='yellow', text=str(denomination), font='Arial 28 bold underline',
border=10)
userheading.pack(side='top', padx=50, anchor=N, pady=40)
else:
userheading = Label(panel, bg='black', fg='white', text=str(denomination), font='Arial 28 bold underline',
border=10)
userheading.pack(side='top', padx=50, anchor=N,pady=40)
mixer.music.load(os.path.join(path, 'Images and icons\kbc.mp3'))
mixer.music.play()
mycursor.reset()
mycursor.execute("select qnumber from gets where username = '{}'".format(username.get()))
chr=mycursor.fetchall()
qnumber=0
mycursor.reset()
mycursor.execute("select qnumber from question where denomination = {}".format(denomination))
for i in mycursor:
if i not in chr:
qnumber=i[0]
break
mycursor.reset()
mycursor.execute("select description from question where qnumber = {}".format(qnumber))
j = mycursor.fetchone()
question = str(j)[2:-3]
mycursor.reset()
mycursor.execute(
"insert into gets(username, qnumber) values ('{}', {});".format(username.get(), qnumber))
mydb.commit()
mycursor.reset()
mycursor.execute("select max(gnumber) from game")
i = mycursor.fetchone()
gnumber = i[0] + 1
mycursor.reset()
mycursor.execute("insert into uses(gnumber, qnumber) values ({}, {});".format(gnumber, qnumber)) # RULE CHANGE AFFECTED
mydb.commit()
if len(question) < 81:
questionlabel = Label(panel, bg='black', fg='yellow', text='Q.'+question,
font='Arial 18 bold', border=5)
questionlabel.pack(side='top', anchor=N, padx=20, pady=40)
waste2 = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste2.pack(side='bottom', anchor=W, padx=840, pady=22)
else:
questionlabel = Label(panel, bg='black', fg='yellow', text='Q.' + str(j)[2:81]+'-\n-'+str(j)[81:-3],
font='Arial 18 bold', border=5)
questionlabel.pack(side='top', anchor=N, padx=10, pady=40)
waste2 = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste2.pack(side='bottom', anchor=W, padx=840, pady=7)
mycursor.reset()
mycursor.execute("select option_a,option_b,option_c,option_d from question where qnumber = '{}'".format(qnumber))
i=mycursor.fetchone()
optiona = Button(panel, text='A.'+i[0], font='Arial 18 bold', bg='black', fg='yellow',command= lambda: set_optionA(qnumber))
optiona.pack(side='top', padx=40,pady=20)
optionb = Button(panel, text='B.'+i[1], font='Arial 18 bold', bg='black', fg='yellow',command= lambda: set_optionB(qnumber))
optionb.pack(side='top', padx=40,pady=20)
optionc = Button(panel, text='C.'+i[2], font='Arial 18 bold', bg='black', fg='yellow',command= lambda: set_optionC(qnumber))
optionc.pack(side='top', padx=40,pady=20)
optiond = Button(panel, text='D.'+i[3], font='Arial 18 bold', bg='black', fg='yellow',command= lambda : set_optionD(qnumber))
optiond.pack(side='top', padx=40,pady=20)
displaylifelines(friend1,friend2,friend3,qnumber)
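# displaylifelines(): draws the four lifeline buttons; a lifeline that is still available gets its normal
# icon and a command bound to the correct option, a used one gets its "-done" icon and no command.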
def displaylifelines(friend1,friend2,friend3,qnumber):
global lifelinebutton1,lifelinebutton2,lifelinebutton3,lifelinebutton4,icon,iconfile
mycursor.reset()
mycursor.execute("select correct from question where qnumber = '{}'".format(qnumber))
i = mycursor.fetchone()
correct=i[0]
if audiencedone==0:
iconfile = os.path.join(path, 'Images and icons\Audiencepoll.png')
lifeline_icon()
lifelinebutton1 = Button(panel, image=icon, compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow',
border=5,command= lambda :audiencepoll(correct))
else:
iconfile = os.path.join(path, 'Images and icons\Audiencepoll-done.png')
lifeline_icon()
lifelinebutton1 = Button(panel, image=icon, compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow',
border=5)
lifelinebutton1.image = icon
lifelinebutton1.pack(side='left', anchor=NW, padx=90, pady=40)
if phonedone==0:
iconfile = os.path.join(path, 'Images and icons\phoneafriend.png')
lifeline_icon()
lifelinebutton2 = Button(panel, image=icon, compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow', border=5,
command=lambda : phoneafriend(friend1,friend2,friend3,correct))
else:
iconfile = os.path.join(path, 'Images and icons\phoneafriend-done.png')
lifeline_icon()
lifelinebutton2 = Button(panel, image=icon, compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow', border=5)
lifelinebutton2.image = icon
lifelinebutton2.pack(side='left', anchor=NW, padx=90, pady=40)
if expertdone==0:
iconfile = os.path.join(path, 'Images and icons\expertadvice.png')
lifeline_icon()
lifelinebutton3 = Button(panel, image=icon, compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow', border=5,command= lambda: expertadvice(correct))
else:
iconfile = os.path.join(path, 'Images and icons\expertadvice-done.png')
lifeline_icon()
lifelinebutton3 = Button(panel, image=icon, compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow',
border=5)
lifelinebutton3.image = icon
lifelinebutton3.pack(side='left', anchor=NW, padx=90, pady=40)
if fiftyfiftydone==0:
iconfile = os.path.join(path, 'Images and icons\\fifty-fifty.png')
lifeline_icon()
lifelinebutton4 = Button(panel, image=icon, compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow', border=5,
command= lambda : fiftyfifty(correct))
else:
iconfile = os.path.join(path, 'Images and icons\\fifty-fifty-done.png')
lifeline_icon()
lifelinebutton4 = Button(panel, image=icon, compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow',border=5)
lifelinebutton4.image = icon
lifelinebutton4.pack(side='left', anchor=NE, padx=90, pady=40)
def set_optionA(qnumber):
response = messagebox.askyesno("KBC - The Quiz Game", "Do you want to lock option A ?")
if response == 1:
answer = 'a'
validateanswer(answer,qnumber)
def set_optionB(qnumber):
response = messagebox.askyesno("KBC - The Quiz Game", "Do you want to lock option B ?")
if response == 1:
answer = 'b'
validateanswer(answer, qnumber)
def set_optionC(qnumber):
response = messagebox.askyesno("KBC - The Quiz Game", "Do you want to lock option C ?")
if response == 1:
answer = 'c'
validateanswer(answer, qnumber)
def set_optionD(qnumber):
response = messagebox.askyesno("KBC - The Quiz Game", "Do you want to lock option D ?")
if response == 1:
answer = 'd'
validateanswer(answer, qnumber)
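# validateanswer(): compares the locked option with the question's correct column. A right answer at
# 10000 / 320000 / 10000000 / 50000000 shows the checkpoint or jackpot message; a wrong answer drops the
# winnings to the last cleared checkpoint (0, 10000, 320000 or 10000000) and records the game, follow and
# used rows, updating highest_score if it was beaten. The denomination then advances by doubling,
# except 640000 -> 1250000 and 10000000 -> 50000000.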
def validateanswer(answer,qnumber):
global filename, gameframe, iconfile, denomination
mycursor.reset()
gameframe.destroy()
gameframe = Frame(screen, bg='#2e004d')
gameframe.pack()
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
waste = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
mycursor.execute("select correct from question where qnumber = '{}'".format(qnumber))
correct = mycursor.fetchone()
if answer == correct[0]:
if denomination == 10000:
winningmessage = Label(panel, bg='black', fg='yellow',
text='THAT IS THE CORRECT ANSWER!!!\nCONGRATULATIONS!!!\nYOU HAVE WON RS ' + str(
denomination)+'.\n\nYOU HAVE CLEARED THE FIRST CHECKPOINT.\nYOU WILL WIN AT LEAST RS '
+str(denomination)+ ' FROM HERE.', font='Arial 28 bold',
border=10)
winningmessage.pack(side='top', padx=300, pady=120)
okbutton = Button(panel, text=' OK ', font='Arial 32 bold', bg='black', fg='white', border=5,
command=lambda:startgame2(friend1.get(),friend2.get(),friend3.get()))
okbutton.pack(side='top', padx=80, pady=102)
elif denomination == 320000:
winningmessage = Label(panel, bg='black', fg='yellow',
text='THAT IS THE CORRECT ANSWER!!!\nCONGRATULATIONS!!!\nYOU HAVE WON RS ' + str(
denomination)+'.\n\nYOU HAVE CLEARED THE SECOND CHECKPOINT.\nYOU WILL WIN AT LEAST RS '
+str(denomination)+ ' FROM HERE.', font='Arial 28 bold',
border=10)
winningmessage.pack(side='top', padx=300, pady=120)
okbutton = Button(panel, text=' OK ', font='Arial 32 bold', bg='black', fg='white', border=5,
command=lambda :startgame2(friend1.get(),friend2.get(),friend3.get()))
okbutton.pack(side='top', padx=80, pady=102)
elif denomination == 10000000:
winningmessage = Label(panel, bg='black', fg='yellow',
text='THAT IS THE CORRECT ANSWER!!!\nCONGRATULATIONS!!!\nYOU HAVE WON RS ' + str(
denomination)+'.\n\nYOU HAVE CLEARED THE LAST CHECKPOINT.\nYOU ARE A CROREPATI!!!\nYOU WILL WIN AT LEAST RS '
+str(denomination)+ ' FROM HERE.', font='Arial 28 bold',
border=10)
winningmessage.pack(side='top', padx=300, pady=100)
okbutton = Button(panel, text=' OK ', font='Arial 32 bold', bg='black', fg='white', border=5,
command=lambda :startgame2(friend1.get(),friend2.get(),friend3.get()))
okbutton.pack(side='top', padx=80, pady=102)
elif denomination == 50000000:
winningmessage = Label(panel, bg='black', fg='yellow',
text='THAT IS THE CORRECT ANSWER!!!\nCONGRATULATIONS!!!\nYOU HAVE WON RS ' + str(
denomination)+'.\n\nYOU HAVE WON THE JACKPOT!!!\nYOU HAVE COMPLETED THE GAME. YOU NOW OWN RS 50000000!!!'
, font='Arial 28 bold',
border=10)
winningmessage.pack(side='top', padx=150, pady=120)
okbutton = Button(panel, text=' OK ', font='Arial 32 bold', bg='black', fg='white', border=5,
command=lambda :startgame2(friend1.get(),friend2.get(),friend3.get()))
okbutton.pack(side='top', padx=80, pady=102)
else:
winningmessage = Label(panel, bg='black', fg='yellow', text='THAT IS THE CORRECT ANSWER!!!\nCONGRATULATIONS!!!\nYOU HAVE WON RS '+str(denomination)+'.', font='Arial 28 bold',
border=10)
winningmessage.pack(side='top', padx=440, anchor=NW, pady=180)
okbutton = Button(panel, text=' OK ', font='Arial 32 bold', bg='black', fg='white', border=5,
command=lambda :startgame2(friend1.get(),friend2.get(),friend3.get()))
okbutton.pack(side='top', padx=80, pady=110)
else:
winning=[0]
if denomination<=10000: #RULE CHANGE AFFECTED
winning[0]=0
elif denomination>10000 and denomination<=320000:
winning[0]=10000
elif denomination>320000 and denomination<=10000000:
winning[0]=320000
else:
winning[0]=10000000
winningmessage = Label(panel, bg='black', fg='yellow',
text='SORRY, THAT IS THE WRONG ANSWER...\nYOU HAVE WON RS ' + str(
winning[0]) + '.\n\nTHANK YOU FOR PLAYING THE GAME!!!', font='Arial 28 bold',
border=10)
winningmessage.pack(side='top', padx=300, pady=200)
okbutton = Button(panel, text=' OK ', font='Arial 32 bold', bg='black', fg='white', border=5,
command=accountwindow)
okbutton.pack(side='top', padx=80, pady=65)
mycursor.reset()
mycursor.execute("select max(gnumber) from game")
i=mycursor.fetchone()
gnumber= i[0] + 1
mycursor.reset()
mycursor.execute("select highest_score from player where username='{}'".format(username.get()))
i=mycursor.fetchone()
highscore=i[0]
if winning[0]>highscore:
mycursor.reset()
mycursor.execute( "update player set highest_score = '{}' where username = '{}';".format(winning[0], username.get()))
mydb.commit()
mycursor.reset()
mycursor.execute("insert into game(gnumber, username, amount_won, quit, friend1, friend2, friend3) values ({}, '{}', {}, 'F', '{}', '{}', '{}');"
.format(gnumber, username.get(), winning[0], friend1.get(), friend2.get(), friend3.get()))
mydb.commit()
i=1
while i<=6:
mycursor.reset()
mycursor.execute(
"insert into follow(gnumber, rnumber, game_year) values ({}, {}, {});".format(gnumber, i, 2020)) # RULE CHANGE AFFECTED
mydb.commit()
i=i+1
if audiencedone==1:
mycursor.reset()
mycursor.execute(
"insert into used(lnumber,gnumber) values ({}, {});".format(1, gnumber)) # RULE CHANGE AFFECTED
mydb.commit()
if phonedone==1:
mycursor.reset()
mycursor.execute(
"insert into used(lnumber,gnumber) values ({}, {});".format(2, gnumber)) # RULE CHANGE AFFECTED
mydb.commit()
if expertdone==1:
mycursor.reset()
mycursor.execute(
"insert into used(lnumber,gnumber) values ({}, {});".format(3, gnumber)) # RULE CHANGE AFFECTED
mydb.commit()
if fiftyfiftydone==1:
mycursor.reset()
mycursor.execute(
"insert into used(lnumber,gnumber) values ({}, {});".format(4, gnumber)) # RULE CHANGE AFFECTED
mydb.commit()
if denomination==640000: #RULE CHANGE AFFECTED
denomination=1250000
elif denomination==10000000:
denomination=50000000
else:
denomination = denomination*2
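# quitgame(): on confirmation the player keeps the previous rung of the ladder (half of the current
# denomination, with 1250000 -> 640000 and 50000000 -> 10000000 as the special steps, and 0 at 5000),
# then the same game / follow / used bookkeeping as a finished game is written with quit = 'T'.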
def quitgame():
global filename, gameframe, iconfile, denomination
winning=[0]
response = messagebox.askyesno("KBC - The Quiz Game","Do you want to quit the game?")
if response==1:
if denomination==1250000:
winning[0]=640000
elif denomination==50000000:
winning[0]=10000000
elif denomination==5000:
winning[0]=0
else:
winning[0] = int(denomination/2)
mycursor.reset()
gameframe.destroy()
gameframe = Frame(screen, bg='#2e004d')
gameframe.pack()
mixer.music.load(os.path.join(path, 'Images and icons\play.mp3'))
mixer.music.play()
filename = os.path.join(path, 'Images and icons\kbc1.png')
set_background()
waste = Label(panel, bg='#2e004d', fg='white', text='',
font='Arial 3 bold underline')
waste.pack(side='top', anchor=W, padx=840)
winningmessage = Label(panel, bg='black', fg='yellow',
text='YOU HAVE QUIT THE GAME...\nYOU HAVE WON RS ' + str(
winning[0]) + '.\n\nTHANK YOU FOR PLAYING THE GAME!!!', font='Arial 28 bold',
border=10)
winningmessage.pack(side='top', padx=300, pady=200)
okbutton = Button(panel, text=' OK ', font='Arial 32 bold', bg='black', fg='white', border=5,
command=accountwindow)
okbutton.pack(side='top', padx=80, pady=65)
mycursor.reset()
mycursor.execute("select max(gnumber) from game")
i = mycursor.fetchone()
gnumber = i[0] + 1
mycursor.reset()
mycursor.execute(
"insert into game(gnumber, username, amount_won, quit, friend1, friend2, friend3) values ({}, '{}', {}, 'T', '{}', '{}', '{}');"
.format(gnumber, str(username.get()), winning[0], str(friend1.get()), str(friend2.get()), str(friend3.get())))
mydb.commit()
mycursor.reset()
mycursor.execute("select highest_score from player where username='{}'".format(username.get()))
i = mycursor.fetchone()
highscore = i[0]
if winning[0] > highscore:
mycursor.reset()
mycursor.execute(
"update player set highest_score = '{}' where username = '{}';".format(winning[0], username.get()))
mydb.commit()
i = 1
while i <= 6:
mycursor.reset()
mycursor.execute(
"insert into follow(gnumber, rnumber, game_year) values ({}, {}, {});".format(gnumber, i,
2020)) # RULE CHANGE AFFECTED
mydb.commit()
i = i + 1
if audiencedone == 1:
mycursor.reset()
mycursor.execute(
"insert into used(lnumber,gnumber) values ({}, {});".format(1, gnumber)) # RULE CHANGE AFFECTED
mydb.commit()
if phonedone == 1:
mycursor.reset()
mycursor.execute(
"insert into used(lnumber,gnumber) values ({}, {});".format(2, gnumber)) # RULE CHANGE AFFECTED
mydb.commit()
if expertdone == 1:
mycursor.reset()
mycursor.execute(
"insert into used(lnumber,gnumber) values ({}, {});".format(3, gnumber)) # RULE CHANGE AFFECTED
mydb.commit()
if fiftyfiftydone == 1:
mycursor.reset()
mycursor.execute(
"insert into used(lnumber,gnumber) values ({}, {});".format(4, gnumber)) # RULE CHANGE AFFECTED
mydb.commit()
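# fiftyfifty(): removes two of the three wrong options at random (randnumber selects which pair),
# records which options are still on screen in the *active flags for the audience poll,
# and redraws lifeline button 4 as used.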
def fiftyfifty(correct):
global fiftyfiftydone,aactive, bactive,cactive,dactive,iconfile,lifelinebutton4,sameq_flag
sameq_flag=1
fiftyfiftydone=1
if (correct == 'a'):
randnumber = randint(2, 4)
if (randnumber == 2):
optionc.destroy()
optiond.destroy()
cactive=0
dactive=0
if (randnumber == 3):
optionb.destroy()
optiond.destroy()
bactive = 0
dactive = 0
if (randnumber == 4):
optionb.destroy()
optionc.destroy()
cactive = 0
bactive = 0
if (correct == 'b'):
randnumber = randint(2, 4)
if (randnumber == 2):
optionc.destroy()
optiond.destroy()
cactive = 0
dactive = 0
if (randnumber == 3):
optiona.destroy()
optiond.destroy()
aactive = 0
dactive = 0
if (randnumber == 4):
optiona.destroy()
optionc.destroy()
cactive = 0
aactive = 0
if (correct == 'c'):
randnumber = randint(2, 4)
if (randnumber == 2):
optionb.destroy()
optiond.destroy()
bactive = 0
dactive = 0
if (randnumber == 3):
optiona.destroy()
optiond.destroy()
aactive = 0
dactive = 0
if (randnumber == 4):
optionb.destroy()
optiona.destroy()
bactive = 0
aactive = 0
if (correct == 'd'):
randnumber = randint(2, 4)
if (randnumber == 2):
optionb.destroy()
optionc.destroy()
cactive = 0
bactive = 0
if (randnumber == 3):
optiona.destroy()
optionc.destroy()
cactive = 0
aactive = 0
if (randnumber == 4):
optionb.destroy()
optiona.destroy()
aactive = 0
bactive = 0
lifelinebutton4.destroy()
iconfile = os.path.join(path, 'Images and icons\\fifty-fifty-done.png')  # show the 'used' icon, matching displaylifelines
lifeline_icon()
lifelinebutton4 = Button(panel, image=icon, compound=LEFT, font='Arial 24 bold', bg='black', fg='yellow', border=5)
lifelinebutton4.image = icon
lifelinebutton4.pack(side='left', anchor=NE, padx=90, pady=40)
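# phoneafriend(): opens a separate "Phone A Friend" window. The simulated friend's reliability depends on
# the denomination: at low amounts they either state the correct option or hesitate between it and one wrong
# option, in the middle range they hesitate or give up, and at the highest amounts they always give up.
# If fifty-fifty was already used on this question, the friend either names the correct option or gives up.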
def phoneafriend(friend1,friend2,friend3,correct):
global phonedone,fiftyfiftydone,iconfile
correct=str(correct)
correct=correct.upper()
if phonedone==0:
tgui = Tk()
tgui.title("Phone A Friend")
tgui.geometry('600x400+480+270')
tgui.iconbitmap(os.path.join(path, 'Images and icons\kbc_logo.ico'))
tgui.configure(bg='#2e004d')
if (fiftyfiftydone == 1 and sameq_flag==1):
friendanswer1 = ["I am confident the answer is ", "I am sure the correct option is ", "My answer would be "]
friendanswer2 = ["I am confused between ", "According to me the answer might be ",
"Maybe the answer is either "]
friendanswer3 = ["I am sorry but I have no idea about this question. ",
"I apologize, I don't have any answer for this question. ",
"I regret to say that I don't the answer to this question. "]
msg = Label(tgui, text="Who do you want to call?", font='Arial 16 bold', bg="#2e004d", fg='yellow', border=5)
msg.pack(side='top', padx=100, pady=40)
def friendbuttonpress1():
msg.destroy()
FRIEND1.destroy()
FRIEND2.destroy()
FRIEND3.destroy()
msg1 = Label(tgui, text="Connecting to " + friend1, font='Arial 16 bold', bg="#2e004d", fg='yellow',
border=5)
msg1.pack(side='top', padx=100, pady=10)
msg2 = Label(tgui, text="Connected...", font='Arial 16 bold', bg="#2e004d", fg='yellow', border=5)
msg2.pack(side='top', padx=100, pady=10)
def friendbuttonpress2():
msg.destroy()
FRIEND1.destroy()
FRIEND2.destroy()
FRIEND3.destroy()
msg1 = Label(tgui, text="Connecting to " + friend2, font='Arial 16 bold', bg="#2e004d", fg='yellow',
border=5)
msg1.pack(side='top', padx=100, pady=10)
msg2 = Label(tgui, text="Connected...", font='Arial 16 bold', bg="#2e004d", fg='yellow', border=5)
msg2.pack(side='top', padx=100, pady=10)
def friendbuttonpress3():
msg.destroy()
FRIEND1.destroy()
FRIEND2.destroy()
FRIEND3.destroy()
msg1 = Label(tgui, text="Connecting to " + friend3, font='Arial 16 bold', bg="#2e004d", fg='yellow',
border=5)
msg1.pack(side='top', padx=100, pady=10)
msg2 = Label(tgui, text="Connected...", font='Arial 16 bold', bg="#2e004d", fg='yellow', border=5)
msg2.pack(side='top', padx=100, pady=10)
def buttonpressuni():
if (phonedone == 0):
if (denomination < 5000000):
randnumber3 = randint(1, 2)
if (randnumber3 == 1):
randstring = random.choice(friendanswer1)
friendanswers = Label(tgui, text=randstring + correct,font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
friendanswers.pack(side='top', padx=20, pady=10)
if (randnumber3 == 2):
randstring = random.choice(friendanswer3)
friendanswers = Label(tgui, text=randstring,font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
friendanswers.pack(side='top', padx=20, pady=10)
else:
randstring = random.choice(friendanswer3)
friendanswers = Label(tgui, text=randstring,font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
friendanswers.pack(side='top', padx=20, pady=10)
def disconnect():
global phonedone
msg3 = Label(tgui, text="Disconnected...", font='Arial 16 bold', bg="#2e004d", fg='yellow', border=5)
msg3.pack(side='top', padx=200, pady=10)
phonedone = 1
OKbutton = Button(tgui, text=' OK ', font='Arial 22 bold', bg="black", fg='yellow', border=5,
command=tgui.destroy)
OKbutton.pack(side='top', padx=200, pady=30)
FRIEND1 = Button(tgui, text=friend1, font='Arial 16 bold', bg="black", fg='yellow', border=5,
command=lambda: [friendbuttonpress1(), buttonpressuni(), disconnect()])
FRIEND1.pack(side='top', padx=200, pady=10)
FRIEND2 = Button(tgui, text=friend2, font='Arial 16 bold', bg="black", fg='yellow', border=5,
command=lambda: [friendbuttonpress2(), buttonpressuni(), disconnect()])
FRIEND2.pack(side='top', padx=200, pady=10)
FRIEND3 = Button(tgui, text=friend3, font='Arial 16 bold', bg="black", fg='yellow', border=5,
command=lambda: [friendbuttonpress3(), buttonpressuni(), disconnect()])
FRIEND3.pack(side='top', padx=200, pady=10)
else:
friendanswer1 = ["I am confident the answer is ", "I am sure the correct option is ", "My answer would be "]
friendanswer2 = ["I am confused between ", "According to me the answer might be ",
"Maybe the answer is either "]
friendanswer3 = ["I am sorry but I have no idea about this question. ",
"I apologize, I don't have any answer for this question. ",
"I regret to say that I don't the answer to this question. "]
msg = Label(tgui, text="Who do you want to call?",font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
msg.pack(side='top', padx=100, pady=40)
def friendbuttonpress1():
msg.destroy()
FRIEND1.destroy()
FRIEND2.destroy()
FRIEND3.destroy()
msg1 = Label(tgui, text="Connecting to " + friend1,font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
msg1.pack(side='top', padx=100, pady=10)
msg2 = Label(tgui, text="Connected...",font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
msg2.pack(side='top', padx=100, pady=10)
def friendbuttonpress2():
msg.destroy()
FRIEND1.destroy()
FRIEND2.destroy()
FRIEND3.destroy()
msg1 = Label(tgui, text="Connecting to " + friend2,font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
msg1.pack(side='top', padx=100, pady=10)
msg2 = Label(tgui, text="Connected...",font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
msg2.pack(side='top', padx=100, pady=10)
def friendbuttonpress3():
msg.destroy()
FRIEND1.destroy()
FRIEND2.destroy()
FRIEND3.destroy()
msg1 = Label(tgui, text="Connecting to " + friend3,font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
msg1.pack(side='top', padx=100, pady=10)
msg2 = Label(tgui, text="Connected...",font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
msg2.pack(side='top', padx=100, pady=10)
def buttonpressuni():
randnumber2a=0
randnumber3a=0
if (phonedone == 0):
if (denomination < 160001):
randnumber2 = randint(1, 2)
if (randnumber2 == 1):
randstring = random.choice(friendanswer1)
friendanswers = Label(tgui, text=randstring + correct,font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
friendanswers.pack(side='top', padx=20, pady=10)
if (randnumber2 == 2):
if (correct == 'A'):
randnumber2a = randint(1, 4)
while (randnumber2a == 1):
randnumber2a = randint(1, 4)
if (correct == 'B'):
randnumber2a = randint(1, 4)
while (randnumber2a == 2):
randnumber2a = randint(1, 4)
if (correct == 'C'):
randnumber2a = randint(1, 4)
while (randnumber2a == 3):
randnumber2a = randint(1, 4)
if (correct == 'D'):
randnumber2a = randint(1, 4)
while (randnumber2a == 4):
randnumber2a = randint(1, 4)
randstring2a = (str)(randnumber2a)
if (randstring2a == '1'):
randstring2a = 'A'
if (randstring2a == '2'):
randstring2a = 'B'
if (randstring2a == '3'):
randstring2a = 'C'
if (randstring2a == '4'):
randstring2a = 'D'
randstring = random.choice(friendanswer2)
friendanswers = Label(tgui, text=randstring + correct + "," + randstring2a,font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
friendanswers.pack(side='top', padx=20, pady=10)
elif denomination>=2500000:
randstring = random.choice(friendanswer3)
friendanswers = Label(tgui, text=randstring, font='Arial 16 bold', bg="#2e004d", fg='yellow',
border=5)
friendanswers.pack(side='top', padx=20, pady=10)
else:
randnumber3 = randint(2, 3)
if (randnumber3 == 2):
if (correct == 'A'):
randnumber3a = randint(1, 4)
while (randnumber3a == 1):
randnumber3a = randint(1, 4)
if (correct == 'B'):
randnumber3a = randint(1, 4)
while (randnumber3a == 2):
randnumber3a = randint(1, 4)
if (correct == 'C'):
randnumber3a = randint(1, 4)
while (randnumber3a == 3):
randnumber3a = randint(1, 4)
if (correct == 'D'):
randnumber3a = randint(1, 4)
while (randnumber3a == 4):
randnumber3a = randint(1, 4)
randstring3a = (str)(randnumber3a)
if (randstring3a == '1'):
randstring3a = 'A'
if (randstring3a == '2'):
randstring3a = 'B'
if (randstring3a == '3'):
randstring3a = 'C'
if (randstring3a == '4'):
randstring3a = 'D'
randstring = random.choice(friendanswer2)
friendanswers = Label(tgui, text=randstring + correct + "," + randstring3a,font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
friendanswers.pack(side='top', padx=20, pady=10)
if (randnumber3 == 3):
randstring = random.choice(friendanswer3)
friendanswers = Label(tgui, text=randstring,font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
friendanswers.pack(side='top', padx=20, pady=10)
def disconnect():
global phonedone
msg3 = Label(tgui, text="Disconnected...",font='Arial 16 bold',bg="#2e004d",fg='yellow',border=5)
msg3.pack(side='top', padx=200, pady=10)
phonedone = 1
OKbutton = Button(tgui, text=' OK ', font='Arial 22 bold', bg="black", fg='yellow', border=5,
command=tgui.destroy)
OKbutton.pack(side='top', padx=200, pady=30)
FRIEND1 = Button(tgui, text=friend1,font='Arial 16 bold',bg="black",fg='yellow',border=5, command=lambda: [friendbuttonpress1(), buttonpressuni(), disconnect()])
FRIEND1.pack(side='top', padx=200, pady=10)
FRIEND2 = Button(tgui, text=friend2,font='Arial 16 bold',bg="black",fg='yellow',border=5,command=lambda: [friendbuttonpress2(), buttonpressuni(), disconnect()])
FRIEND2.pack(side='top', padx=200, pady=10)
FRIEND3 = Button(tgui, text=friend3,font='Arial 16 bold',bg="black",fg='yellow',border=5,command=lambda: [friendbuttonpress3(), buttonpressuni(), disconnect()])
FRIEND3.pack(side='top', padx=200, pady=10)
tgui.mainloop()
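# audiencepoll(): shows a bar chart of simulated audience votes that always favours the correct option:
# roughly 70-100% of the vote at low denominations, 50-70% in the middle band and 30-60% at the top,
# with the remainder split among the other options. If fifty-fifty was used first on this question,
# only the two options still on screen are polled.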
def audiencepoll(correct):
global audiencedone
global fiftyfiftydone
global aactive,bactive,cactive,dactive
correct=str(correct)
correct=correct.upper()
if audiencedone==0:
if(fiftyfiftydone==1 and sameq_flag==1):
audiencedone=1
tgui = Tk()
tgui.title("Audience Poll")
tgui.geometry('960x400+300+270')
tgui.iconbitmap(os.path.join(path, 'Images and icons\kbc_logo.ico'))
tgui.configure(bg='#2e004d')
APoll = randint(70, 100)
BPoll=100-APoll
if(correct=='A'):
printa = Label(tgui, text="A",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa.grid(row=15, column=0,padx=40,pady=10)
printa1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
APoll = (str)(APoll)
printa2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
if(bactive==1):
printb=Label(tgui,text="B",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb.grid(row=16,column=0,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
if (cactive == 1):
printc = Label(tgui, text="C",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc.grid(row=16, column=0,padx=40,pady=10)
printc1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printc2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=16, column=2,padx=40,pady=10)
if (dactive == 1):
printd = Label(tgui, text="D",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd.grid(row=16, column=0,padx=40,pady=10)
printd1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printd2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=16, column=2,padx=40,pady=10)
if (correct == 'B'):
printb = Label(tgui, text="B",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb.grid(row=15, column=0,padx=40,pady=10)
printb1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=15, column=1,padx=40,pady=10)
APoll = (str)(APoll)
printb2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=15, column=2,padx=40,pady=10)
if (aactive == 1):
printa = Label(tgui, text="A",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa.grid(row=16, column=0,padx=40,pady=10)
printa1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printa2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=16, column=2,padx=40,pady=10)
if (cactive == 1):
printc = Label(tgui, text="C",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc.grid(row=16, column=0,padx=40,pady=10)
printc1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printc2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=16, column=2,padx=40,pady=10)
if (dactive == 1):
printd = Label(tgui, text="D",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd.grid(row=16, column=0,padx=40,pady=10)
printd1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printd2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=16, column=2,padx=40,pady=10)
if (correct == 'C'):
printc = Label(tgui, text="C",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc.grid(row=15, column=0,padx=40,pady=10)
printc1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=15, column=1,padx=40,pady=10)
APoll = (str)(APoll)
printc2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=15, column=2,padx=40,pady=10)
if (bactive == 1):
printb = Label(tgui, text="B",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb.grid(row=16, column=0,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
if (aactive == 1):
printa = Label(tgui, text="A",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa.grid(row=16, column=0,padx=40,pady=10)
printa1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printa2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=16, column=2,padx=40,pady=10)
if (dactive == 1):
printd = Label(tgui, text="D",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd.grid(row=16, column=0,padx=40,pady=10)
printd1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printd2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=16, column=2,padx=40,pady=10)
if (correct == 'D'):
printd = Label(tgui, text="D",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd.grid(row=15, column=0,padx=40,pady=10)
printd1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=15, column=1,padx=40,pady=10)
APoll = (str)(APoll)
printd2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=15, column=2,padx=40,pady=10)
if (bactive == 1):
printb = Label(tgui, text="B",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb.grid(row=16, column=0,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
if (cactive == 1):
printc = Label(tgui, text="C",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc.grid(row=16, column=0,padx=40,pady=10)
printc1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printc2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=16, column=2,padx=40,pady=10)
if (aactive == 1):
printa = Label(tgui, text="A",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa.grid(row=16, column=0,padx=40,pady=10)
printa1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=16, column=1,padx=40,pady=10)
BPoll = (str)(BPoll)
printa2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=16, column=2,padx=40,pady=10)
OKbutton = Button(tgui, text=' OK ', font='Arial 22 bold', bg="black", fg='yellow', border=5,
command=tgui.destroy)
OKbutton.grid(row=21, column=1, padx=40, pady=40)
tgui.mainloop()
else:
audiencedone=1
tgui = Tk()
tgui.title("Audience Poll")
tgui.geometry('960x400+300+270')
tgui.iconbitmap(os.path.join(path, 'Images and icons\kbc_logo.ico'))
tgui.configure(bg='#2e004d')
if (denomination < 40001):
APoll = randint(70, 100)
ALeft = 100 - APoll
BPoll = randint(0, ALeft)
BLeft = ALeft - BPoll
CPoll = randint(0, BLeft)
DPoll = BLeft - CPoll
printa = Label(tgui, text="A",bg='#2e004d',fg='yellow',font='Arial 16 bold',border=5)
printa.grid(row=15, column=0,padx=40,pady=10)
printb = Label(tgui, text="B",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb.grid(row=16, column=0,padx=40,pady=10)
printc = Label(tgui, text="C",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc.grid(row=17, column=0,padx=40,pady=10)
printd = Label(tgui, text="D",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd.grid(row=18, column=0,padx=40,pady=10)
if (correct == 'A'):
printa1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = (str)(APoll)
BPoll = (str)(BPoll)
CPoll = (str)(CPoll)
DPoll = (str)(DPoll)
printa2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
if (correct == 'B'):
printa1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = (str)(APoll)
BPoll = (str)(BPoll)
CPoll = (str)(CPoll)
DPoll = (str)(DPoll)
printa2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
if (correct == 'C'):
printa1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = (str)(APoll)
BPoll = (str)(BPoll)
CPoll = (str)(CPoll)
DPoll = (str)(DPoll)
printa2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
if (correct == 'D'):
printa1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = (str)(APoll)
BPoll = (str)(BPoll)
CPoll = (str)(CPoll)
DPoll = (str)(DPoll)
printa2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
elif (denomination < 640001):
APoll = randint(50, 70)
ALeft = 100 - APoll
BPoll = randint(0, ALeft)
BLeft = ALeft - BPoll
CPoll = randint(0, BLeft)
DPoll = BLeft - CPoll
printa = Label(tgui, text="A",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa.grid(row=15, column=0,padx=40,pady=10)
printb = Label(tgui, text="B",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb.grid(row=16, column=0,padx=40,pady=10)
printc = Label(tgui, text="C",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc.grid(row=17, column=0,padx=40,pady=10)
printd = Label(tgui, text="D",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd.grid(row=18, column=0,padx=40,pady=10)
if (correct == 'A'):
printa1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = (str)(APoll)
BPoll = (str)(BPoll)
CPoll = (str)(CPoll)
DPoll = (str)(DPoll)
printa2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
if (correct == 'B'):
printa1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = (str)(APoll)
BPoll = (str)(BPoll)
CPoll = (str)(CPoll)
DPoll = (str)(DPoll)
printa2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
if (correct == 'C'):
printa1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = str(APoll)
BPoll = str(BPoll)
CPoll = str(CPoll)
DPoll = str(DPoll)
printa2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
if (correct == 'D'):
printa1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = str(APoll)
BPoll = str(BPoll)
CPoll = str(CPoll)
DPoll = str(DPoll)
printa2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
else:
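# Highest-value questions: the audience is least reliable here, so the correct option
# only draws 30-60% of the vote.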
APoll = randint(30, 60)
ALeft = 100 - APoll
BPoll = randint(0, ALeft)
BLeft = ALeft - BPoll
CPoll = randint(0, BLeft)
DPoll = BLeft - CPoll
printa = Label(tgui, text="A",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa.grid(row=15, column=0,padx=40,pady=10)
printb = Label(tgui, text="B",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb.grid(row=16, column=0,padx=40,pady=10)
printc = Label(tgui, text="C",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc.grid(row=17, column=0,padx=40,pady=10)
printd = Label(tgui, text="D",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd.grid(row=18, column=0,padx=40,pady=10)
if (correct == 'A'):
printa1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = str(APoll)
BPoll = str(BPoll)
CPoll = str(CPoll)
DPoll = str(DPoll)
printa2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
if (correct == 'B'):
printa1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = str(APoll)
BPoll = str(BPoll)
CPoll = str(CPoll)
DPoll = str(DPoll)
printa2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
if (correct == 'C'):
printa1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = str(APoll)
BPoll = str(BPoll)
CPoll = str(CPoll)
DPoll = str(DPoll)
printa2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
if (correct == 'D'):
printa1 = Label(tgui, text="|" * DPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa1.grid(row=15, column=1,padx=40,pady=10)
printb1 = Label(tgui, text="|" * BPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb1.grid(row=16, column=1,padx=40,pady=10)
printc1 = Label(tgui, text="|" * CPoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc1.grid(row=17, column=1,padx=40,pady=10)
printd1 = Label(tgui, text="|" * APoll,fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd1.grid(row=18, column=1,padx=40,pady=10)
APoll = str(APoll)
BPoll = str(BPoll)
CPoll = str(CPoll)
DPoll = str(DPoll)
printa2 = Label(tgui, text=DPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printa2.grid(row=15, column=2,padx=40,pady=10)
printb2 = Label(tgui, text=BPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printb2.grid(row=16, column=2,padx=40,pady=10)
printc2 = Label(tgui, text=CPoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printc2.grid(row=17, column=2,padx=40,pady=10)
printd2 = Label(tgui, text=APoll + " %",fg='yellow',font='Arial 16 bold',border=5,bg='#2e004d')
printd2.grid(row=18, column=2,padx=40,pady=10)
OKbutton = Button(tgui, text=' OK ', font='Arial 22 bold', bg="black", fg='yellow', border=5,
command=tgui.destroy)
OKbutton.grid(row=21,column=1, padx=40, pady=40)
tgui.mainloop()
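# expertadvice opens a Tkinter pop-up in which the in-game "expert" comments on the
# current question; how helpful the hint is depends on the prize money at stake.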
def expertadvice(correct):
global randnumber2a,fiftyfiftydone,expertdone,randnumber3a
expertanswer1 = ["I am confident the answer is ", "I am sure the correct option is ", "My answer would be "]
expertanswer2 = ["I am confused between ", "According to me the answer might be ",
"Maybe the answer is either "]
expertanswer3 = ["I am sorry but I have no idea about this question. ",
"I apologize, I don't have any answer for this question. ",
"I regret to say that I don't the answer to this question. "]
if expertdone==0:
tgui = Tk()
tgui.title("Expert Advice")
tgui.geometry('600x400+480+270')
tgui.iconbitmap(os.path.join(path, 'Images and icons', 'kbc_logo.ico'))
tgui.configure(bg='#2e004d')
correct=str(correct)
correct=correct.upper()
expertintro = Label(tgui, text="Hello. My name is Mrs. Kavya Dixit\nand I am the expert for today's game.", bg='#2e004d', fg='yellow', font='Arial 16 bold', border=5)
expertintro.grid(row=6, columnspan=1, padx=80, pady=20)
expertdone = 1
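# Low-value questions (or a repeated question after the 50:50 lifeline) always get the
# correct answer; mid-value questions have a 50% chance of only being narrowed down to
# two options; top-value questions may also yield no answer at all.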
if (denomination < 320001 or (fiftyfiftydone == 1 and sameq_flag==1)):
randstring = random.choice(expertanswer1)
expertanswers = Label(tgui, text=randstring + correct+'.',bg='#2e004d',fg='yellow',font='Arial 16 bold',border=5)
expertanswers.grid(row=10, columnspan=2,padx=80,pady=10)
elif (denomination < 2500001):
randnumber2 = randint(1, 2)
if (randnumber2 == 1):
randstring = random.choice(expertanswer1)
expertanswers = Label(tgui, text=randstring + correct+'.',bg='#2e004d',fg='yellow',font='Arial 16 bold',border=5)
expertanswers.grid(row=10, columnspan=2,padx=80,pady=10)
if (randnumber2 == 2):
if (correct == 'A'):
randnumber2a = randint(1, 4)
while (randnumber2a == 1):
randnumber2a = randint(1, 4)
if (correct == 'B'):
randnumber2a = randint(1, 4)
while (randnumber2a == 2):
randnumber2a = randint(1, 4)
if (correct == 'C'):
randnumber2a = randint(1, 4)
while (randnumber2a == 3):
randnumber2a = randint(1, 4)
if (correct == 'D'):
randnumber2a = randint(1, 4)
while (randnumber2a == 4):
randnumber2a = randint(1, 4)
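# Map the randomly drawn wrong option (1-4) onto its letter A-D.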
randstring2a = str(randnumber2a)
if (randstring2a == '1'):
randstring2a = 'A'
if (randstring2a == '2'):
randstring2a = 'B'
if (randstring2a == '3'):
randstring2a = 'C'
if (randstring2a == '4'):
randstring2a = 'D'
randstring = random.choice(expertanswer2)
expertanswers = Label(tgui, text=randstring + correct + "," + randstring2a,bg='#2e004d',fg='yellow',font='Arial 16 bold',border=5)
expertanswers.grid(row=10, columnspan=2,padx=80,pady=10)
else:
randnumber3 = randint(1, 3)
if (randnumber3 == 1):
randstring = random.choice(expertanswer1)
expertanswers = Label(tgui, text=randstring + correct+'.',bg='#2e004d',fg='yellow',font='Arial 16 bold',border=5)
expertanswers.grid(row=10, columnspan=1,padx=80,pady=10)
if (randnumber3 == 2):
if (correct == 'A'):
randnumber3a = randint(1, 4)
while (randnumber3a == 1):
randnumber3a = randint(1, 4)
if (correct == 'B'):
randnumber3a = randint(1, 4)
while (randnumber3a == 2):
randnumber3a = randint(1, 4)
if (correct == 'C'):
randnumber3a = randint(1, 4)
while (randnumber3a == 3):
randnumber3a = randint(1, 4)
if (correct == 'D'):
randnumber3a = randint(1, 4)
while (randnumber3a == 4):
randnumber3a = randint(1, 4)
randstring3a = str(randnumber3a)
if (randstring3a == '1'):
randstring3a = 'A'
if (randstring3a == '2'):
randstring3a = 'B'
if (randstring3a == '3'):
randstring3a = 'C'
if (randstring3a == '4'):
randstring3a = 'D'
randstring = random.choice(expertanswer2)
expertanswers = Label(tgui, text=randstring + correct + "," + randstring3a,bg='#2e004d',fg='yellow',font='Arial 16 bold',border=5)
expertanswers.grid(row=10, columnspan=2,padx=80,pady=10)
if (randnumber3 == 3):
randstring = random.choice(expertanswer3)
expertanswers = Label(tgui, text=randstring,bg='#2e004d',fg='yellow',font='Arial 16 bold',border=5)
expertanswers.grid(row=10, columnspan=2,padx=80,pady=10)
OKbutton = Button(tgui, text=' OK ', font='Arial 22 bold', bg="black", fg='yellow', border=5,
command=tgui.destroy)
OKbutton.grid(row=15, padx=80, pady=40)
tgui.mainloop()
startingwindow()
screen.mainloop()
| 47.6331 | 186 | 0.548698 | 12,844 | 108,794 | 4.642168 | 0.045391 | 0.048756 | 0.035791 | 0.048806 | 0.841423 | 0.815863 | 0.793506 | 0.767577 | 0.743761 | 0.719777 | 0 | 0.070469 | 0.299171 | 108,794 | 2,283 | 187 | 47.653964 | 0.711526 | 0.004605 | 0 | 0.724947 | 0 | 0.002665 | 0.158403 | 0.0069 | 0 | 0 | 0 | 0 | 0 | 1 | 0.023987 | false | 0.013859 | 0.004797 | 0 | 0.028785 | 0.166311 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b064a1fbb8aaeb8aa54e59aa5a425880ec2ffa39 | 7,135 | py | Python | empleados/forms.py | exildev/AutoLavadox | 6e486e06ac5aea3a5f84f48948c7bce52b3d86b2 | ["MIT"] | 1 | 2019-04-09T04:37:06.000Z | 2019-04-09T04:37:06.000Z | empleados/forms.py | exildev/AutoLavadox | 6e486e06ac5aea3a5f84f48948c7bce52b3d86b2 | ["MIT"] | 9 | 2017-06-15T21:14:00.000Z | 2017-08-02T18:41:18.000Z | empleados/forms.py | exildev/AutoLavadox | 6e486e06ac5aea3a5f84f48948c7bce52b3d86b2 | ["MIT"] | null | null | null |
# -*- coding: utf-8 -*-
from django.contrib.auth.forms import UserCreationForm
from django import forms
from exileui.widgets import DatePickerWidget
import models
from django.contrib.admin import widgets
class OperarioForm(UserCreationForm):
def __init__(self, *args, **kwargs):
super(OperarioForm, self).__init__(*args, **kwargs)
self.fields['password1'].label = "Contraseña"
self.fields['password2'].label = "Confirmar contraseña"
self.fields['email'].label = "Correo Electrónico"
self.fields['first_name'].label = "Nombre"
self.fields['nacimiento'].widget = DatePickerWidget(
attrs={'class': 'dateopera'},
format="%d/%m/%Y")
self.fields['telefono'].widget = forms.NumberInput()
# end def
class Media:
js = ('/static/empleados/js/dateoperario.js',)
# end class
class Meta:
model = models.Empleado
fields = ['username', 'password1', 'password2', 'email', 'first_name',
'last_name', 'identificacion', 'direccion', 'telefono', 'nacimiento', 'imagen']
# end class
def save(self, commit=True):
operario = super(OperarioForm, self).save(commit)
operario.is_staff = True
operario.is_superuser = True
operario.save()
return operario
# end def
# end class
class OperarioFormEdit(forms.ModelForm):
def __init__(self, *args, **kwargs):
super(OperarioFormEdit, self).__init__(*args, **kwargs)
self.fields['email'].label = "Correo Electrónico"
self.fields['first_name'].label = "Nombre"
self.fields['last_name'].label = "Apellidos"
self.fields['nacimiento'].widget = DatePickerWidget(
attrs={'class': 'dateopera'},
format="%d/%m/%Y")
self.fields['telefono'].widget = forms.NumberInput()
# end def
class Media:
js = ('/static/empleados/js/dateoperario.js',)
# end class
class Meta:
model = models.Empleado
exclude = ['password1', 'password2', ]
fields = ['username', 'email', 'first_name',
'last_name', 'identificacion', 'direccion', 'telefono', 'nacimiento', 'imagen']
# end class
def save(self, commit=True):
operario = super(OperarioFormEdit, self).save(commit)
operario.is_staff = True
operario.is_superuser = True
operario.save()
return operario
# end def
# end class
class RecepcionistaForm(UserCreationForm):
def __init__(self, *args, **kwargs):
super(RecepcionistaForm, self).__init__(*args, **kwargs)
self.fields['password1'].label = "Contraseña"
self.fields['password2'].label = "Confirmar contraseña"
self.fields['email'].label = "Correo Electrónico"
self.fields['first_name'].label = "Nombre"
self.fields['last_name'].label = "Apellidos"
self.fields['nacimiento'].widget = DatePickerWidget(
attrs={'class': 'dateopera'},
format="%d/%m/%Y")
self.fields['telefono'].widget = forms.NumberInput()
# end def
class Meta:
model = models.Recepcionista
fields = ['username', 'password1', 'password2', 'email', 'first_name',
'last_name', 'identificacion', 'direccion', 'telefono', 'nacimiento', 'imagen']
# end class
class Media:
js = ('/static/empleados/js/dateoperario.js',)
# end class
def save(self, commit=True):
recepcionista = super(RecepcionistaForm, self).save(commit)
recepcionista.is_staff = True
recepcionista.is_superuser = True
recepcionista.save()
return recepcionista
# end def
# end class
class RecepcionistaFormEdit(forms.ModelForm):
def __init__(self, *args, **kwargs):
super(RecepcionistaFormEdit, self).__init__(*args, **kwargs)
self.fields['email'].label = "Correo Electrónico"
self.fields['first_name'].label = "Nombre"
self.fields['last_name'].label = "Apellidos"
self.fields['nacimiento'].widget = DatePickerWidget(
attrs={'class': 'dateopera'},
format="%d/%m/%Y")
self.fields['telefono'].widget = forms.NumberInput()
# end def
class Meta:
model = models.Recepcionista
exclude = ['password1', 'password2', ]
fields = ['username', 'email', 'first_name',
'last_name', 'identificacion', 'direccion', 'telefono', 'nacimiento', 'imagen']
# end class
class Media:
js = ('/static/empleados/js/dateoperario.js',)
# end class
def save(self, commit=True):
recepcionista = super(RecepcionistaFormEdit, self).save(commit)
recepcionista.is_staff = True
recepcionista.is_superuser = True
recepcionista.save()
return recepcionista
# end def
# end class
class CajeroForm(UserCreationForm):
def __init__(self, *args, **kwargs):
super(CajeroForm, self).__init__(*args, **kwargs)
self.fields['password1'].label = "Contraseña"
self.fields['password2'].label = "Confirmar contraseña"
self.fields['email'].label = "Correo Electrónico"
self.fields['first_name'].label = "Nombre"
self.fields['last_name'].label = "Apellidos"
self.fields['nacimiento'].widget = DatePickerWidget(
attrs={'class': 'dateopera'},
format="%d/%m/%Y")
self.fields['telefono'].widget = forms.NumberInput()
# end def
class Media:
js = ('/static/empleados/js/dateoperario.js',)
# end class
class Meta:
model = models.Cajero
fields = ['username', 'password1', 'password2', 'email', 'first_name',
'last_name', 'identificacion', 'direccion', 'telefono', 'nacimiento', 'imagen']
# end class
def save(self, commit=True):
cajero = super(CajeroForm, self).save(commit)
cajero.is_staff = True
cajero.is_superuser = True
cajero.save()
return cajero
# end def
# end class
class CajeroFormEdit(forms.ModelForm):
def __init__(self, *args, **kwargs):
super(CajeroFormEdit, self).__init__(*args, **kwargs)
self.fields['email'].label = "Correo Electrónico"
self.fields['first_name'].label = "Nombre"
self.fields['last_name'].label = "Apellidos"
self.fields['nacimiento'].widget = DatePickerWidget(
attrs={'class': 'dateopera'},
format="%d/%m/%Y")
self.fields['telefono'].widget = forms.NumberInput()
# end def
class Meta:
model = models.Cajero
exclude = ['password1', 'password2', ]
fields = ['username', 'email', 'first_name',
'last_name', 'identificacion', 'direccion', 'telefono', 'nacimiento', 'imagen']
# end class
class Media:
js = ('/static/empleados/js/dateoperario.js',)
# end class
def save(self, commit=True):
cajero = super(CajeroFormEdit, self).save(commit)
cajero.is_staff = True
cajero.is_superuser = True
cajero.save()
return cajero
# end def
# end class
| 34.468599 | 97 | 0.611493 | 720 | 7,135 | 5.944444 | 0.106944 | 0.081776 | 0.033411 | 0.021028 | 0.896963 | 0.892991 | 0.892991 | 0.863551 | 0.835514 | 0.830374 | 0 | 0.00353 | 0.24555 | 7,135 | 206 | 98 | 34.635922 | 0.791566 | 0.041626 | 0 | 0.841379 | 0 | 0 | 0.222157 | 0.031737 | 0 | 0 | 0 | 0 | 0 | 1 | 0.082759 | false | 0.082759 | 0.034483 | 0 | 0.282759 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
c6718201ad56837c5baf89d14314f8abb81e250f | 23,155 | py | Python | remote/worker.py | Schille/weimar-graphstore | 76b47f98fba419ec6290628b56a202c60d8f2d46 | ["MIT"] | 2 | 2016-08-27T04:51:01.000Z | 2020-09-05T01:34:41.000Z | remote/worker.py | Schille/weimar-graphstore | 76b47f98fba419ec6290628b56a202c60d8f2d46 | ["MIT"] | null | null | null | remote/worker.py | Schille/weimar-graphstore | 76b47f98fba419ec6290628b56a202c60d8f2d46 | ["MIT"] | null | null | null |
'''
Created on Mar 17, 2014
@author: mschilonka
'''
import Pyro4
import config
import random
import signal, sys, time
import multiprocessing as mp
from graph.graphelement import Vertex, Edge
from graph.elementtype import EdgeType, VertexType
from graph.requestgraphelements import RequestEdgeType, RequestVertexType, RequestVertex
from graph.hyperdexgraph import HyperDexGraph
import traceback
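# start_worker spawns num_threads WorkerProcess instances, installs a SIGINT handler that
# shuts them down and unregisters them from the Weimar worker registry, then keeps polling
# so that workers which die unexpectedly are unregistered as well.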
def start_worker(num_threads):
worker_p = []
for i in xrange(0, num_threads):
worker_p.append(WorkerProcess())
time.sleep(1)
def signal_handler(signum, frame):
print('\nShutting down Worker...')
[p.shutdown() for p in worker_p]
time.sleep(5)
for process in worker_p:
if process.is_alive():
process.terminate()
registry.unregister(process.name)
print('\nShutting down Worker...Done')
sys.exit()
signal.signal(signal.SIGINT, signal_handler)
ns = Pyro4.naming.locateNS(host=config.WEIMAR_ADDRESS_INSIDE, port=config.WEIMAR_PORT_INSIDE)
registry_uri = ns.lookup('weimar.worker.registry')
registry = Pyro4.Proxy(registry_uri)
print('Started {} work_p processes'.format(len(worker_p)))
Pyro4.config.COMMTIMEOUT=3.5
while(True):
for process in worker_p:
if not process.is_alive():
print('Worker process {} stopped working unexpectedly.'.format(process.name))
try:
registry.unregister(process.name)
except Pyro4.errors.ConnectionClosedError:
print('[Warn] Weimar server is no longer available.')
except Exception, ex:
print(ex)
worker_p.remove(process)
time.sleep(2)
if(len(worker_p) == 0):
break
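# Each WorkerProcess registers itself with the Weimar worker registry, starts a Pyro4
# daemon on the local IP and serves a Worker object under 'weimar.worker.<name>' until
# shutdown is requested.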
class WorkerProcess(mp.Process):
def __init__(self):
signal.signal(signal.SIGINT, signal.SIG_IGN)
self.ns = Pyro4.naming.locateNS(host=config.WEIMAR_ADDRESS_INSIDE, port=config.WEIMAR_PORT_INSIDE)
#get relevant information from the weimar server
registry_uri = self.ns.lookup('weimar.worker.registry')
self.registry = Pyro4.Proxy(registry_uri)
self.workername = self.registry.register()
super(WorkerProcess, self).__init__(name=self.workername)
self._running = mp.Value('i', 1)
self.start()
def run(self):
#Pyro4.config.COMMTIMEOUT=5.5
#create process
my_ip = Pyro4.socketutil.getIpAddress(None, workaround127=True)
self.daemon = Pyro4.core.Daemon(my_ip)
hyperdex_uri = self.ns.lookup('hyperdex.properties')
hyperdex = Pyro4.Proxy(hyperdex_uri)
hyperdex_ip = hyperdex.get_ip()
hyperdex_port = hyperdex.get_port()
#create worker object for Pyro
self.worker = Worker(self.workername,hyperdex_ip, hyperdex_port, self)
worker_uri = self.daemon.register(self.worker)
#register object at the weimar server
print(self.workername)
self.ns.register('weimar.worker.{}'.format(self.workername), worker_uri)
self.daemon.requestLoop(loopCondition=lambda:self._running.value)
#shutdown issued
self.daemon.close()
self.registry.unregister(self.workername)
def shutdown(self):
self._running.value = 0
print('[Info] Shutting down: ' + self.name)
class Worker(object):
'''
Pyro4-exposed worker that serves graph operations for the Weimar graph store.
Each method lazily opens (or reuses) a HyperDexGraph handle for the requested
graph and delegates the call to the graph/element classes.
'''
def __init__(self, worker_name, hyperdex_ip, hyperdex_port, process):
'''
Store the worker name, the HyperDex coordinator address/port and the owning
WorkerProcess, and start with an empty cache of open graph handles.
'''
self.name = worker_name
self.graphs = {}
self.hyperdex_ip = hyperdex_ip
self.hyperdex_port = hyperdex_port
self._process = process
def say_hello(self):
return 'Hello from {}'.format(self.name)
def shutdown(self, code):
print('[Warn] Worker {} is requested to shut down. Server code: {}'.format(self.name, code))
#todo handle server code
if(code == 1000):
self._process.shutdown()
def get_vertex_type(self, graph_name, vertex_type):
if(self.graphs.has_key(graph_name)):
try:
self.graphs[graph_name].get_vertex_type(vertex_type)
return True
except:
return False
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
try:
self.graphs[graph_name].get_vertex_type(vertex_type)
return True
except:
return False
def get_edge_type(self, graph_name, edge_type):
if(self.graphs.has_key(graph_name)):
try:
return self.graphs[graph_name].get_edge_type(edge_type)
except:
return False
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
try:
return self.graphs[graph_name].get_edge_type(edge_type)
except:
return False
def create_vertex_type(self, graph_name, type_name, vertex_type_def):
print('[Info] Create vertex type: ' + type_name)
if(self.graphs.has_key(graph_name)):
try:
self.graphs[graph_name].create_vertex_type(RequestVertexType(type_name, *vertex_type_def))
return True
except:
traceback.print_exc()
return False
else:
print('Create new graph {} on HyperDex at {}:{}'.format(graph_name, self.hyperdex_ip, self.hyperdex_port))
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
try:
self.graphs[graph_name].create_vertex_type(RequestVertexType(type_name, *vertex_type_def))
return True
except Exception, e:
traceback.print_exc()
print(e)
return False
def create_edge_type(self, graph_name, type_name, edge_type_def):
print('[Info] Create edge type: ' + type_name)
if(self.graphs.has_key(graph_name)):
print(graph_name)
try:
self.graphs[graph_name].create_edge_type(RequestEdgeType(type_name, *edge_type_def))
return True
except:
return False
else:
print('Create new graph {} on HyperDex at {}:{}'.format(graph_name, self.hyperdex_ip, self.hyperdex_port))
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
try:
self.graphs[graph_name].create_edge_type(RequestEdgeType(type_name, *edge_type_def))
return True
except:
return False
def insert_vertex(self, graph_name, vertex_type, attributes):
if(self.graphs.has_key(graph_name)):
return self.graphs[graph_name].insert_vertex(\
RequestVertex(VertexType(self.graphs[graph_name]._storage , vertex_type.split(':', 1)[1]),attributes))._uid
else:
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
return self.graphs[graph_name].insert_vertex(\
RequestVertex(VertexType(self.graphs[graph_name]._storage , vertex_type.split(':', 1)[1]),attributes))._uid
def get_vertex(self, graph_name, uid, vertex_type):
if(self.graphs.has_key(graph_name)):
print(graph_name)
try:
self.graphs[graph_name].get_vertex(uid, vertex_type)
return True
except:
return False
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
try:
self.graphs[graph_name].get_vertex(uid, vertex_type)
return True
except:
return False
def get_type_definition(self, graph_name, type_name):
print(type_name)
t = type_name.split(':', 1)
if(self.graphs.has_key(graph_name)):
print(graph_name)
if(t[0] == 'vertex'):
return self.graphs[graph_name].get_vertex_type(t[1]).get_type_definition()
elif(t[0] == 'edge'):
return self.graphs[graph_name].get_edge_type(t[1]).get_type_definition()
else:
#TODO
raise Exception()
else:
#TODO maybe the graph has been deleted
raise Exception()
def count(self, graph_name, type_name):
t = type_name.split(':', 1)
if(self.graphs.has_key(graph_name)):
print(graph_name)
if(t[0] == 'vertex'):
return self.graphs[graph_name].get_vertex_type(t[1]).count()
elif(t[0] == 'edge'):
return self.graphs[graph_name].get_edge_type(t[1]).count()
else:
#TODO
raise Exception()
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
if(t[0] == 'vertex'):
return self.graphs[graph_name].get_vertex_type(t[1]).count()
elif(t[0] == 'edge'):
return self.graphs[graph_name].get_edge_type(t[1]).count()
else:
#TODO
raise Exception()
def get_elements(self, graph_name, type_name):
t = type_name.split(':', 1)
if(self.graphs.has_key(graph_name)):
print(graph_name)
if(t[0] == 'vertex'):
return self.graphs[graph_name].get_vertex_type(t[1]).get_vertices()
elif(t[0] == 'edge'):
return self.graphs[graph_name].get_edge_type(t[1]).get_edges()
else:
#TODO
raise Exception()
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
if(t[0] == 'vertex'):
return self.graphs[graph_name].get_vertex_type(t[1]).get_vertices()
elif(t[0] == 'edge'):
return self.graphs[graph_name].get_edge_type(t[1]).get_edges()
else:
#TODO
raise Exception()
def remove_type(self, graph_name, type_name):
t = type_name.split(':', 1)
print('[Info] Removing: ' + t[1])
if(self.graphs.has_key(graph_name)):
if(t[0] == 'vertex'):
self.graphs[graph_name].get_vertex_type(t[1]).remove()
elif(t[0] == 'edge'):
self.graphs[graph_name].get_edge_type(t[1]).remove()
else:
#TODO
raise Exception('Something went wrong.')
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
if(t[0] == 'vertex'):
self.graphs[graph_name].get_vertex_type(t[1]).remove()
elif(t[0] == 'edge'):
self.graphs[graph_name].get_edge_type(t[1]).remove()
else:
#TODO
raise Exception()
def get_property(self, graph_name, uid, type_name, key):
print('get property')
t = type_name.split(':', 1)
if(self.graphs.has_key(graph_name)):
if(t[0] == 'vertex'):
return Vertex(uid, t[1], self.graphs[graph_name]._storage).get_property(key)
elif(t[0] == 'edge'):
result = Edge(uid, t[1], self.graphs[graph_name]._storage).get_property(key)
print(result)
return result
else:
#TODO
raise Exception()
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
if(t[0] == 'vertex'):
return Vertex(uid, t[1], self.graphs[graph_name]._storage).get_property(key)
elif(t[0] == 'edge'):
return Edge(uid, t[1], self.graphs[graph_name]._storage).get_property(key)
else:
#TODO
raise Exception()
def set_property(self, graph_name, uid, type_name, key, value):
t = type_name.split(':', 1)
if(self.graphs.has_key(graph_name)):
#TODO this is a workaround
if(t[0] == 'vertex'):
return Vertex(uid, t[1], self.graphs[graph_name]._storage).set_property(key, value)
elif(t[0] == 'edge'):
return Edge(uid, t[1], self.graphs[graph_name]._storage).set_property(key, value)
else:
#TODO
raise Exception()
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
if(t[0] == 'vertex'):
return Vertex(uid, t[1], self.graphs[graph_name]._storage).set_property(key, value)
elif(t[0] == 'edge'):
return Edge(uid, t[1], self.graphs[graph_name]._storage).set_property(key, value)
else:
#TODO
raise Exception()
def get_property_keys(self, graph_name, uid, type_name):
t = type_name.split(':', 1)
if(self.graphs.has_key(graph_name)):
#TODO this is a workaround
if(t[0] == 'vertex'):
return Vertex(uid, t[1], self.graphs[graph_name]._storage).get_property_keys()
elif(t[0] == 'edge'):
return Edge(uid, t[1], self.graphs[graph_name]._storage).get_property_keys()
else:
#TODO
raise Exception()
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
if(t[0] == 'vertex'):
return Vertex(uid, t[1], self.graphs[graph_name]._storage).get_property_keys()
elif(t[0] == 'edge'):
return Edge(uid, t[1], self.graphs[graph_name]._storage).get_property_keys()
else:
#TODO
raise Exception()
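# The methods below operate on individual vertices and edges (adding/removing edges and
# navigating between endpoints); each resolves its graph handle lazily in the same way
# as the type-level operations above.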
def v_add_edge(self, graph_name, uid, type_name, target_type, target_uid,\
edge_type, struct_attr, unstruc_attr):
print('add edge from {}:{} to {}:{}'.format(uid, type_name, target_uid, target_type))
print(struct_attr)
if(self.graphs.has_key(graph_name)):
return Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.add_edge(Vertex(target_uid, target_type, self.graphs[graph_name]._storage),\
edge_type, struct_attr, unstruc_attr)._uid
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
return Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.add_edge(Vertex(target_uid, target_type, self.graphs[graph_name]._storage),\
edge_type, struct_attr, unstruc_attr)._uid
def v_rm_edge(self, graph_name, uid, type_name, edge_uid, edge_type):
#remove edge
if(self.graphs.has_key(graph_name)):
print(graph_name)
Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.rm_edge(Edge(edge_uid, edge_type, self.graphs[graph_name]._storage))
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.rm_edge(Edge(edge_uid, edge_type, self.graphs[graph_name]._storage))
def v_get_outgoing_edges(self, graph_name, uid, type_name, edge_type):
print('outgoing edges')
print(type_name)
if(self.graphs.has_key(graph_name)):
print(graph_name)
if(edge_type is None):
result = Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.get_outgoing_edges()
return {x._uid : x._element_type for x in result}
else:
print('Debug')
result = Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.get_outgoing_edges(EdgeType(self.graphs[graph_name]._storage, edge_type))
print(result)
return {x._uid : x._element_type for x in result}
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
if(edge_type is None):
result = Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.get_outgoing_edges()
return {x._uid : x._element_type for x in result}
else:
result = Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.get_outgoing_edges(EdgeType(self.graphs[graph_name]._storage, edge_type))
return {x._uid : x._element_type for x in result}
def v_get_incoming_edges(self, graph_name, uid, type_name, edge_type):
#return edges
print('incoming edges')
print(type_name)
if(self.graphs.has_key(graph_name)):
print(graph_name)
if(edge_type is None):
result = Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.get_incoming_edges()
return {x._uid : x._element_type for x in result}
else:
print('Debug')
result = Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.get_incoming_edges(EdgeType(self.graphs[graph_name]._storage, edge_type))
print(result)
return {x._uid : x._element_type for x in result}
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
if(edge_type is None):
result = Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.get_incoming_edges()
return {x._uid : x._element_type for x in result}
else:
result = Vertex(uid, type_name, self.graphs[graph_name]._storage)\
.get_incoming_edges(EdgeType(self.graphs[graph_name]._storage, edge_type))
return {x._uid : x._element_type for x in result}
def e_get_target(self, graph_name, uid, type_name):
#return the edge object's target vertices
if(self.graphs.has_key(graph_name)):
print(graph_name)
result = Edge(uid, type_name, self.graphs[graph_name]._storage).get_target()
return {x._uid : x._element_type for x in result}
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
result = Edge(uid, type_name, self.graphs[graph_name]._storage).get_target()
return {x._uid : x._element_type for x in result}
def e_get_source(self, graph_name, uid, type_name):
#return the edge object's source vertex
if(self.graphs.has_key(graph_name)):
print(graph_name)
v = Edge(uid, type_name, self.graphs[graph_name]._storage).get_source()
return (v._uid, v._element_type)
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
v = Edge(uid, type_name, self.graphs[graph_name]._storage).get_source()
return (v._uid, v._element_type)
def e_add_target(self, graph_name, uid, type_name, vertex_uid, vertex_type):
#add a target vertex to edge
if(self.graphs.has_key(graph_name)):
print(graph_name)
Edge(uid, type_name, self.graphs[graph_name]._storage)\
.add_target(Vertex(vertex_uid, vertex_type, self.graphs[graph_name]._storage))
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
Edge(uid, type_name, self.graphs[graph_name]._storage)\
.add_target(Vertex(vertex_uid, vertex_type, self.graphs[graph_name]._storage))
def e_rm_target(self, graph_name, uid, type_name, vertex_uid, vertex_type):
#remove target vertex from edge
if(self.graphs.has_key(graph_name)):
print(graph_name)
Edge(uid, type_name, self.graphs[graph_name]._storage)\
.remove_target(Vertex(vertex_uid, vertex_type, self.graphs[graph_name]._storage))
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
Edge(uid, type_name, self.graphs[graph_name]._storage)\
.remove_target(Vertex(vertex_uid, vertex_type, self.graphs[graph_name]._storage))
def remove(self, graph_name, uid, type_name):
t = type_name.split(':', 1)
if(self.graphs.has_key(graph_name)):
#TODO this is a workaround
if(t[0] == 'vertex'):
return Vertex(uid, t[1], self.graphs[graph_name]._storage).remove()
elif(t[0] == 'edge'):
return Edge(uid, t[1], self.graphs[graph_name]._storage).remove()
else:
#TODO
raise Exception()
else:
print('Create new graph ' + graph_name)
self.graphs[graph_name] = HyperDexGraph(self.hyperdex_ip, self.hyperdex_port, graph_name)
if(t[0] == 'vertex'):
return Vertex(uid, t[1], self.graphs[graph_name]._storage).remove()
elif(t[0] == 'edge'):
return Edge(uid, t[1], self.graphs[graph_name]._storage).remove()
else:
#TODO
raise Exception()
| 42.959184 | 123 | 0.581214 | 2,770 | 23,155 | 4.612274 | 0.072924 | 0.135958 | 0.112711 | 0.142768 | 0.785379 | 0.753366 | 0.740686 | 0.733485 | 0.725188 | 0.715169 | 0 | 0.006745 | 0.308486 | 23,155 | 539 | 124 | 42.959184 | 0.791157 | 0.041589 | 0 | 0.726437 | 0 | 0 | 0.048088 | 0.001996 | 0 | 0 | 0 | 0.001855 | 0 | 0 | null | null | 0 | 0.022989 | null | null | 0.137931 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c691a8dfdabfb7468b85c71501274941559f3731 | 107,892 | py | Python | test/unit/test_transit_gateway_apis_v1.py | mauriceDevsM/networking-python-sdk | 7eed5185f1e93a57e43d0d7a1e83ee8c708179e0 | ["Apache-2.0"] | null | null | null | test/unit/test_transit_gateway_apis_v1.py | mauriceDevsM/networking-python-sdk | 7eed5185f1e93a57e43d0d7a1e83ee8c708179e0 | ["Apache-2.0"] | null | null | null | test/unit/test_transit_gateway_apis_v1.py | mauriceDevsM/networking-python-sdk | 7eed5185f1e93a57e43d0d7a1e83ee8c708179e0 | ["Apache-2.0"] | null | null | null |
# -*- coding: utf-8 -*-
# (C) Copyright IBM Corp. 2021.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from datetime import datetime, timezone
from ibm_cloud_sdk_core.authenticators.no_auth_authenticator import NoAuthAuthenticator
import inspect
import json
import pytest
import re
import requests
import responses
import urllib
from ibm_cloud_networking_services.transit_gateway_apis_v1 import *
version = 'testString'
service = TransitGatewayApisV1(
authenticator=NoAuthAuthenticator(),
version=version
)
base_url = 'https://transit.cloud.ibm.com/v1'
service.set_service_url(base_url)
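# The tests below stub all HTTP traffic with the 'responses' library against the mock
# base URL and assert that each SDK method builds the expected request and handles the
# canned JSON reply.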
##############################################################################
# Start of Service: TransitConnections
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for list_connections
#-----------------------------------------------------------------------------
class TestListConnections():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# list_connections()
#--------------------------------------------------------
@responses.activate
def test_list_connections_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/connections')
mock_response = '{"connections": [{"base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "name": "Transit_Service_SJ_DL", "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "transit_gateway": {"crn": "crn:v1:bluemix:public:transit:us-south:a/123456::gateway:456f58c1-afe7-123a-0a0a-7f3d720f1a44", "id": "456f58c1-afe7-123a-0a0a-7f3d720f1a44", "name": "my-transit-gw100"}, "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}], "first": {"href": "https://transit.cloud.ibm.com/v1/connections?limit=50"}, "limit": 50, "next": {"href": "https://transit.cloud.ibm.com/v1/connections?start=MjAyMC0wNS0wOVQxNjoyMDoyMC4yMjQ5NzNa&limit=50", "start": "MjAyMC0wNS0wOVQxNjoyMDoyMC4yMjQ5NzNa"}}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
limit = 1
start = 'testString'
network_id = 'testString'
# Invoke method
response = service.list_connections(
limit=limit,
start=start,
network_id=network_id,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
# Validate query params
query_string = responses.calls[0].request.url.split('?',1)[1]
query_string = requests.utils.unquote(query_string)
assert 'limit={}'.format(limit) in query_string
assert 'start={}'.format(start) in query_string
assert 'network_id={}'.format(network_id) in query_string
#--------------------------------------------------------
# test_list_connections_required_params()
#--------------------------------------------------------
@responses.activate
def test_list_connections_required_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/connections')
mock_response = '{"connections": [{"base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "name": "Transit_Service_SJ_DL", "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "transit_gateway": {"crn": "crn:v1:bluemix:public:transit:us-south:a/123456::gateway:456f58c1-afe7-123a-0a0a-7f3d720f1a44", "id": "456f58c1-afe7-123a-0a0a-7f3d720f1a44", "name": "my-transit-gw100"}, "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}], "first": {"href": "https://transit.cloud.ibm.com/v1/connections?limit=50"}, "limit": 50, "next": {"href": "https://transit.cloud.ibm.com/v1/connections?start=MjAyMC0wNS0wOVQxNjoyMDoyMC4yMjQ5NzNa&limit=50", "start": "MjAyMC0wNS0wOVQxNjoyMDoyMC4yMjQ5NzNa"}}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Invoke method
response = service.list_connections()
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
#--------------------------------------------------------
# test_list_connections_value_error()
#--------------------------------------------------------
@responses.activate
def test_list_connections_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/connections')
mock_response = '{"connections": [{"base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "name": "Transit_Service_SJ_DL", "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "transit_gateway": {"crn": "crn:v1:bluemix:public:transit:us-south:a/123456::gateway:456f58c1-afe7-123a-0a0a-7f3d720f1a44", "id": "456f58c1-afe7-123a-0a0a-7f3d720f1a44", "name": "my-transit-gw100"}, "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}], "first": {"href": "https://transit.cloud.ibm.com/v1/connections?limit=50"}, "limit": 50, "next": {"href": "https://transit.cloud.ibm.com/v1/connections?start=MjAyMC0wNS0wOVQxNjoyMDoyMC4yMjQ5NzNa&limit=50", "start": "MjAyMC0wNS0wOVQxNjoyMDoyMC4yMjQ5NzNa"}}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Pass in all but one required param and check for a ValueError
req_param_dict = {
}
for param in req_param_dict.keys():
req_copy = {key:val if key is not param else None for (key,val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.list_connections(**req_copy)
# endregion
##############################################################################
# End of Service: TransitConnections
##############################################################################
##############################################################################
# Start of Service: TransitGateways
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for list_transit_gateways
#-----------------------------------------------------------------------------
class TestListTransitGateways():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# list_transit_gateways()
#--------------------------------------------------------
@responses.activate
def test_list_transit_gateways_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways')
mock_response = '{"first": {"href": "https://transit.cloud.ibm.com/v1/transit_gateways?limit=50"}, "limit": 50, "next": {"href": "https://transit.cloud.ibm.com/v1/transit_gateways?start=MjAyMC0wNS0wOFQxNDoxNzowMy45NzQ5NzNa&limit=50", "start": "MjAyMC0wNS0wOFQxNDoxNzowMy45NzQ5NzNa"}, "transit_gateways": [{"id": "ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "crn": "crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "name": "my-transit-gateway-in-TransitGateway", "location": "us-south", "created_at": "2019-01-01T12:00:00", "global": true, "resource_group": {"id": "56969d6043e9465c883cb9f7363e78e8", "href": "https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8"}, "status": "available", "updated_at": "2019-01-01T12:00:00"}]}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
limit = 1
start = 'testString'
# Invoke method
response = service.list_transit_gateways(
limit=limit,
start=start,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
# Validate query params
query_string = responses.calls[0].request.url.split('?',1)[1]
query_string = requests.utils.unquote(query_string)
assert 'limit={}'.format(limit) in query_string
assert 'start={}'.format(start) in query_string
#--------------------------------------------------------
# test_list_transit_gateways_required_params()
#--------------------------------------------------------
@responses.activate
def test_list_transit_gateways_required_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways')
mock_response = '{"first": {"href": "https://transit.cloud.ibm.com/v1/transit_gateways?limit=50"}, "limit": 50, "next": {"href": "https://transit.cloud.ibm.com/v1/transit_gateways?start=MjAyMC0wNS0wOFQxNDoxNzowMy45NzQ5NzNa&limit=50", "start": "MjAyMC0wNS0wOFQxNDoxNzowMy45NzQ5NzNa"}, "transit_gateways": [{"id": "ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "crn": "crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "name": "my-transit-gateway-in-TransitGateway", "location": "us-south", "created_at": "2019-01-01T12:00:00", "global": true, "resource_group": {"id": "56969d6043e9465c883cb9f7363e78e8", "href": "https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8"}, "status": "available", "updated_at": "2019-01-01T12:00:00"}]}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Invoke method
response = service.list_transit_gateways()
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
#--------------------------------------------------------
# test_list_transit_gateways_value_error()
#--------------------------------------------------------
@responses.activate
def test_list_transit_gateways_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways')
mock_response = '{"first": {"href": "https://transit.cloud.ibm.com/v1/transit_gateways?limit=50"}, "limit": 50, "next": {"href": "https://transit.cloud.ibm.com/v1/transit_gateways?start=MjAyMC0wNS0wOFQxNDoxNzowMy45NzQ5NzNa&limit=50", "start": "MjAyMC0wNS0wOFQxNDoxNzowMy45NzQ5NzNa"}, "transit_gateways": [{"id": "ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "crn": "crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "name": "my-transit-gateway-in-TransitGateway", "location": "us-south", "created_at": "2019-01-01T12:00:00", "global": true, "resource_group": {"id": "56969d6043e9465c883cb9f7363e78e8", "href": "https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8"}, "status": "available", "updated_at": "2019-01-01T12:00:00"}]}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Pass in all but one required param and check for a ValueError
req_param_dict = {
}
for param in req_param_dict.keys():
req_copy = {key:val if key is not param else None for (key,val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.list_transit_gateways(**req_copy)
#-----------------------------------------------------------------------------
# Test Class for create_transit_gateway
#-----------------------------------------------------------------------------
class TestCreateTransitGateway():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# create_transit_gateway()
#--------------------------------------------------------
@responses.activate
def test_create_transit_gateway_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways')
mock_response = '{"id": "ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "crn": "crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "name": "my-transit-gateway-in-TransitGateway", "location": "us-south", "created_at": "2019-01-01T12:00:00", "global": true, "resource_group": {"id": "56969d6043e9465c883cb9f7363e78e8", "href": "https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8"}, "status": "available", "updated_at": "2019-01-01T12:00:00"}'
responses.add(responses.POST,
url,
body=mock_response,
content_type='application/json',
status=201)
# Construct a dict representation of a ResourceGroupIdentity model
resource_group_identity_model = {}
resource_group_identity_model['id'] = '56969d6043e9465c883cb9f7363e78e8'
# Set up parameter values
location = 'us-south'
name = 'Transit_Service_BWTN_SJ_DL'
global_ = True
resource_group = resource_group_identity_model
# Invoke method
response = service.create_transit_gateway(
location,
name,
global_=global_,
resource_group=resource_group,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 201
# Validate body params
req_body = json.loads(str(responses.calls[0].request.body, 'utf-8'))
assert req_body['location'] == 'us-south'
assert req_body['name'] == 'Transit_Service_BWTN_SJ_DL'
assert req_body['global'] == True
assert req_body['resource_group'] == resource_group_identity_model
#--------------------------------------------------------
# test_create_transit_gateway_value_error()
#--------------------------------------------------------
@responses.activate
def test_create_transit_gateway_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways')
mock_response = '{"id": "ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "crn": "crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "name": "my-transit-gateway-in-TransitGateway", "location": "us-south", "created_at": "2019-01-01T12:00:00", "global": true, "resource_group": {"id": "56969d6043e9465c883cb9f7363e78e8", "href": "https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8"}, "status": "available", "updated_at": "2019-01-01T12:00:00"}'
responses.add(responses.POST,
url,
body=mock_response,
content_type='application/json',
status=201)
# Construct a dict representation of a ResourceGroupIdentity model
resource_group_identity_model = {}
resource_group_identity_model['id'] = '56969d6043e9465c883cb9f7363e78e8'
# Set up parameter values
location = 'us-south'
name = 'Transit_Service_BWTN_SJ_DL'
global_ = True
resource_group = resource_group_identity_model
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"location": location,
"name": name,
}
for param in req_param_dict.keys():
req_copy = {key:val if key is not param else None for (key,val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.create_transit_gateway(**req_copy)
#-----------------------------------------------------------------------------
# Test Class for delete_transit_gateway
#-----------------------------------------------------------------------------
class TestDeleteTransitGateway():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
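# Rough sketch of preprocess_url's behaviour (host name below is illustrative
# only): a URL that does not end in '/' is returned as a percent-quoted
# string, while a URL ending in one or more slashes becomes a compiled regex
# so the mock matches regardless of trailing slashes, e.g.
#   preprocess_url('https://host/transit_gateways')  -> 'https://host/transit_gateways'
#   preprocess_url('https://host/transit_gateways/') -> re.compile('https://host/transit_gateways/+')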
#--------------------------------------------------------
# delete_transit_gateway()
#--------------------------------------------------------
@responses.activate
def test_delete_transit_gateway_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString')
responses.add(responses.DELETE,
url,
status=204)
# Set up parameter values
id = 'testString'
# Invoke method
response = service.delete_transit_gateway(
id,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 204
#--------------------------------------------------------
# test_delete_transit_gateway_value_error()
#--------------------------------------------------------
@responses.activate
def test_delete_transit_gateway_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString')
responses.add(responses.DELETE,
url,
status=204)
# Set up parameter values
id = 'testString'
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"id": id,
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.delete_transit_gateway(**req_copy)
#-----------------------------------------------------------------------------
# Test Class for get_transit_gateway
#-----------------------------------------------------------------------------
class TestGetTransitGateway():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# get_transit_gateway()
#--------------------------------------------------------
@responses.activate
def test_get_transit_gateway_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString')
mock_response = '{"id": "ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "crn": "crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "name": "my-transit-gateway-in-TransitGateway", "location": "us-south", "created_at": "2019-01-01T12:00:00", "global": true, "resource_group": {"id": "56969d6043e9465c883cb9f7363e78e8", "href": "https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8"}, "status": "available", "updated_at": "2019-01-01T12:00:00"}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
id = 'testString'
# Invoke method
response = service.get_transit_gateway(
id,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
#--------------------------------------------------------
# test_get_transit_gateway_value_error()
#--------------------------------------------------------
@responses.activate
def test_get_transit_gateway_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString')
mock_response = '{"id": "ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "crn": "crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "name": "my-transit-gateway-in-TransitGateway", "location": "us-south", "created_at": "2019-01-01T12:00:00", "global": true, "resource_group": {"id": "56969d6043e9465c883cb9f7363e78e8", "href": "https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8"}, "status": "available", "updated_at": "2019-01-01T12:00:00"}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
id = 'testString'
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"id": id,
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.get_transit_gateway(**req_copy)
#-----------------------------------------------------------------------------
# Test Class for update_transit_gateway
#-----------------------------------------------------------------------------
class TestUpdateTransitGateway():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# update_transit_gateway()
#--------------------------------------------------------
@responses.activate
def test_update_transit_gateway_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString')
mock_response = '{"id": "ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "crn": "crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "name": "my-transit-gateway-in-TransitGateway", "location": "us-south", "created_at": "2019-01-01T12:00:00", "global": true, "resource_group": {"id": "56969d6043e9465c883cb9f7363e78e8", "href": "https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8"}, "status": "available", "updated_at": "2019-01-01T12:00:00"}'
responses.add(responses.PATCH,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
id = 'testString'
global_ = True
name = 'my-transit-gateway'
# Invoke method
response = service.update_transit_gateway(
id,
global_=global_,
name=name,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
# Validate body params
req_body = json.loads(str(responses.calls[0].request.body, 'utf-8'))
assert req_body['global'] is True
assert req_body['name'] == 'my-transit-gateway'
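# For the PATCH operation, only the fields being updated (global and name in
# this test) are sent in the request body; the gateway id travels in the URL
# path rather than the payload.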
#--------------------------------------------------------
# test_update_transit_gateway_value_error()
#--------------------------------------------------------
@responses.activate
def test_update_transit_gateway_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString')
mock_response = '{"id": "ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "crn": "crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4", "name": "my-transit-gateway-in-TransitGateway", "location": "us-south", "created_at": "2019-01-01T12:00:00", "global": true, "resource_group": {"id": "56969d6043e9465c883cb9f7363e78e8", "href": "https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8"}, "status": "available", "updated_at": "2019-01-01T12:00:00"}'
responses.add(responses.PATCH,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
id = 'testString'
global_ = True
name = 'my-transit-gateway'
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"id": id,
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.update_transit_gateway(**req_copy)
# endregion
##############################################################################
# End of Service: TransitGateways
##############################################################################
##############################################################################
# Start of Service: TransitGatewaysNetworkConnections
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for list_transit_gateway_connections
#-----------------------------------------------------------------------------
class TestListTransitGatewayConnections():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# list_transit_gateway_connections()
#--------------------------------------------------------
@responses.activate
def test_list_transit_gateway_connections_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections')
mock_response = '{"connections": [{"name": "Transit_Service_BWTN_SJ_DL", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}]}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
transit_gateway_id = 'testString'
# Invoke method
response = service.list_transit_gateway_connections(
transit_gateway_id,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
#--------------------------------------------------------
# test_list_transit_gateway_connections_value_error()
#--------------------------------------------------------
@responses.activate
def test_list_transit_gateway_connections_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections')
mock_response = '{"connections": [{"name": "Transit_Service_BWTN_SJ_DL", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}]}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
transit_gateway_id = 'testString'
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"transit_gateway_id": transit_gateway_id,
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.list_transit_gateway_connections(**req_copy)
#-----------------------------------------------------------------------------
# Test Class for create_transit_gateway_connection
#-----------------------------------------------------------------------------
class TestCreateTransitGatewayConnection():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# create_transit_gateway_connection()
#--------------------------------------------------------
@responses.activate
def test_create_transit_gateway_connection_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections')
mock_response = '{"name": "Transit_Service_BWTN_SJ_DL", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}'
responses.add(responses.POST,
url,
body=mock_response,
content_type='application/json',
status=201)
# Construct a dict representation of a ZoneIdentityByName model
zone_identity_model = {}
zone_identity_model['name'] = 'us-south-1'
# Set up parameter values
transit_gateway_id = 'testString'
network_type = 'vpc'
base_connection_id = '975f58c1-afe7-469a-9727-7f3d720f2d32'
local_gateway_ip = '192.168.100.1'
local_tunnel_ip = '192.168.129.2'
name = 'Transit_Service_BWTN_SJ_DL'
network_account_id = '28e4d90ac7504be694471ee66e70d0d5'
network_id = 'crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b'
remote_bgp_asn = '65010'
remote_gateway_ip = '10.242.63.12'
remote_tunnel_ip = '192.168.129.1'
zone = zone_identity_model
# Invoke method
response = service.create_transit_gateway_connection(
transit_gateway_id,
network_type,
base_connection_id=base_connection_id,
local_gateway_ip=local_gateway_ip,
local_tunnel_ip=local_tunnel_ip,
name=name,
network_account_id=network_account_id,
network_id=network_id,
remote_bgp_asn=remote_bgp_asn,
remote_gateway_ip=remote_gateway_ip,
remote_tunnel_ip=remote_tunnel_ip,
zone=zone,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 201
# Validate body params
req_body = json.loads(str(responses.calls[0].request.body, 'utf-8'))
assert req_body['network_type'] == 'vpc'
assert req_body['base_connection_id'] == '975f58c1-afe7-469a-9727-7f3d720f2d32'
assert req_body['local_gateway_ip'] == '192.168.100.1'
assert req_body['local_tunnel_ip'] == '192.168.129.2'
assert req_body['name'] == 'Transit_Service_BWTN_SJ_DL'
assert req_body['network_account_id'] == '28e4d90ac7504be694471ee66e70d0d5'
assert req_body['network_id'] == 'crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b'
assert req_body['remote_bgp_asn'] == '65010'
assert req_body['remote_gateway_ip'] == '10.242.63.12'
assert req_body['remote_tunnel_ip'] == '192.168.129.1'
assert req_body['zone'] == zone_identity_model
#--------------------------------------------------------
# test_create_transit_gateway_connection_value_error()
#--------------------------------------------------------
@responses.activate
def test_create_transit_gateway_connection_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections')
mock_response = '{"name": "Transit_Service_BWTN_SJ_DL", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}'
responses.add(responses.POST,
url,
body=mock_response,
content_type='application/json',
status=201)
# Construct a dict representation of a ZoneIdentityByName model
zone_identity_model = {}
zone_identity_model['name'] = 'us-south-1'
# Set up parameter values
transit_gateway_id = 'testString'
network_type = 'vpc'
base_connection_id = '975f58c1-afe7-469a-9727-7f3d720f2d32'
local_gateway_ip = '192.168.100.1'
local_tunnel_ip = '192.168.129.2'
name = 'Transit_Service_BWTN_SJ_DL'
network_account_id = '28e4d90ac7504be694471ee66e70d0d5'
network_id = 'crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b'
remote_bgp_asn = '65010'
remote_gateway_ip = '10.242.63.12'
remote_tunnel_ip = '192.168.129.1'
zone = zone_identity_model
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"transit_gateway_id": transit_gateway_id,
"network_type": network_type,
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.create_transit_gateway_connection(**req_copy)
#-----------------------------------------------------------------------------
# Test Class for delete_transit_gateway_connection
#-----------------------------------------------------------------------------
class TestDeleteTransitGatewayConnection():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# delete_transit_gateway_connection()
#--------------------------------------------------------
@responses.activate
def test_delete_transit_gateway_connection_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections/testString')
responses.add(responses.DELETE,
url,
status=204)
# Set up parameter values
transit_gateway_id = 'testString'
id = 'testString'
# Invoke method
response = service.delete_transit_gateway_connection(
transit_gateway_id,
id,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 204
#--------------------------------------------------------
# test_delete_transit_gateway_connection_value_error()
#--------------------------------------------------------
@responses.activate
def test_delete_transit_gateway_connection_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections/testString')
responses.add(responses.DELETE,
url,
status=204)
# Set up parameter values
transit_gateway_id = 'testString'
id = 'testString'
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"transit_gateway_id": transit_gateway_id,
"id": id,
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.delete_transit_gateway_connection(**req_copy)
#-----------------------------------------------------------------------------
# Test Class for get_transit_gateway_connection
#-----------------------------------------------------------------------------
class TestGetTransitGatewayConnection():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# get_transit_gateway_connection()
#--------------------------------------------------------
@responses.activate
def test_get_transit_gateway_connection_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections/testString')
mock_response = '{"name": "Transit_Service_BWTN_SJ_DL", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
transit_gateway_id = 'testString'
id = 'testString'
# Invoke method
response = service.get_transit_gateway_connection(
transit_gateway_id,
id,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
#--------------------------------------------------------
# test_get_transit_gateway_connection_value_error()
#--------------------------------------------------------
@responses.activate
def test_get_transit_gateway_connection_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections/testString')
mock_response = '{"name": "Transit_Service_BWTN_SJ_DL", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
transit_gateway_id = 'testString'
id = 'testString'
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"transit_gateway_id": transit_gateway_id,
"id": id,
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.get_transit_gateway_connection(**req_copy)
#-----------------------------------------------------------------------------
# Test Class for update_transit_gateway_connection
#-----------------------------------------------------------------------------
class TestUpdateTransitGatewayConnection():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# update_transit_gateway_connection()
#--------------------------------------------------------
@responses.activate
def test_update_transit_gateway_connection_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections/testString')
mock_response = '{"name": "Transit_Service_BWTN_SJ_DL", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}'
responses.add(responses.PATCH,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
transit_gateway_id = 'testString'
id = 'testString'
name = 'Transit_Service_BWTN_SJ_DL'
# Invoke method
response = service.update_transit_gateway_connection(
transit_gateway_id,
id,
name=name,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
# Validate body params
req_body = json.loads(str(responses.calls[0].request.body, 'utf-8'))
assert req_body['name'] == 'Transit_Service_BWTN_SJ_DL'
#--------------------------------------------------------
# test_update_transit_gateway_connection_value_error()
#--------------------------------------------------------
@responses.activate
def test_update_transit_gateway_connection_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections/testString')
mock_response = '{"name": "Transit_Service_BWTN_SJ_DL", "network_id": "crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b", "network_type": "vpc", "id": "1a15dca5-7e33-45e1-b7c5-bc690e569531", "base_connection_id": "975f58c1-afe7-469a-9727-7f3d720f2d32", "created_at": "2019-01-01T12:00:00", "local_bgp_asn": 64490, "local_gateway_ip": "192.168.100.1", "local_tunnel_ip": "192.168.129.2", "mtu": 9000, "network_account_id": "28e4d90ac7504be694471ee66e70d0d5", "remote_bgp_asn": 65010, "remote_gateway_ip": "10.242.63.12", "remote_tunnel_ip": "192.168.129.1", "request_status": "pending", "status": "attached", "updated_at": "2019-01-01T12:00:00", "zone": {"name": "us-south-1"}}'
responses.add(responses.PATCH,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
transit_gateway_id = 'testString'
id = 'testString'
name = 'Transit_Service_BWTN_SJ_DL'
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"transit_gateway_id": transit_gateway_id,
"id": id,
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.update_transit_gateway_connection(**req_copy)
#-----------------------------------------------------------------------------
# Test Class for create_transit_gateway_connection_actions
#-----------------------------------------------------------------------------
class TestCreateTransitGatewayConnectionActions():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# create_transit_gateway_connection_actions()
#--------------------------------------------------------
@responses.activate
def test_create_transit_gateway_connection_actions_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections/testString/actions')
responses.add(responses.POST,
url,
status=204)
# Set up parameter values
transit_gateway_id = 'testString'
id = 'testString'
action = 'approve'
# Invoke method
response = service.create_transit_gateway_connection_actions(
transit_gateway_id,
id,
action,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 204
# Validate body params
req_body = json.loads(str(responses.calls[0].request.body, 'utf-8'))
assert req_body['action'] == 'approve'
#--------------------------------------------------------
# test_create_transit_gateway_connection_actions_value_error()
#--------------------------------------------------------
@responses.activate
def test_create_transit_gateway_connection_actions_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/transit_gateways/testString/connections/testString/actions')
responses.add(responses.POST,
url,
status=204)
# Set up parameter values
transit_gateway_id = 'testString'
id = 'testString'
action = 'approve'
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"transit_gateway_id": transit_gateway_id,
"id": id,
"action": action,
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.create_transit_gateway_connection_actions(**req_copy)
# endregion
##############################################################################
# End of Service: TransitGatewaysNetworkConnections
##############################################################################
##############################################################################
# Start of Service: TransitLocation
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for list_gateway_locations
#-----------------------------------------------------------------------------
class TestListGatewayLocations():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# list_gateway_locations()
#--------------------------------------------------------
@responses.activate
def test_list_gateway_locations_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/locations')
mock_response = '{"locations": [{"billing_location": "us", "name": "us-south", "type": "region"}]}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Invoke method
response = service.list_gateway_locations()
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
#--------------------------------------------------------
# test_list_gateway_locations_value_error()
#--------------------------------------------------------
@responses.activate
def test_list_gateway_locations_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/locations')
mock_response = '{"locations": [{"billing_location": "us", "name": "us-south", "type": "region"}]}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Pass in all but one required param and check for a ValueError
req_param_dict = {
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.list_gateway_locations(**req_copy)
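# list_gateway_locations has no required parameters, so req_param_dict above
# is empty and the loop body never executes; the test simply mirrors the
# layout used for the other operations.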
#-----------------------------------------------------------------------------
# Test Class for get_gateway_location
#-----------------------------------------------------------------------------
class TestGetGatewayLocation():
# Preprocess the request URL to ensure the mock response will be found.
def preprocess_url(self, request_url: str):
request_url = urllib.parse.quote(request_url, safe=':/')
if re.fullmatch('.*/+', request_url) is None:
return request_url
else:
return re.compile(request_url.rstrip('/') + '/+')
#--------------------------------------------------------
# get_gateway_location()
#--------------------------------------------------------
@responses.activate
def test_get_gateway_location_all_params(self):
# Set up mock
url = self.preprocess_url(base_url + '/locations/testString')
mock_response = '{"billing_location": "us", "name": "us-south", "type": "region", "local_connection_locations": [{"display_name": "Dallas", "name": "us-south", "type": "region"}]}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
name = 'testString'
# Invoke method
response = service.get_gateway_location(
name,
headers={}
)
# Check for correct operation
assert len(responses.calls) == 1
assert response.status_code == 200
#--------------------------------------------------------
# test_get_gateway_location_value_error()
#--------------------------------------------------------
@responses.activate
def test_get_gateway_location_value_error(self):
# Set up mock
url = self.preprocess_url(base_url + '/locations/testString')
mock_response = '{"billing_location": "us", "name": "us-south", "type": "region", "local_connection_locations": [{"display_name": "Dallas", "name": "us-south", "type": "region"}]}'
responses.add(responses.GET,
url,
body=mock_response,
content_type='application/json',
status=200)
# Set up parameter values
name = 'testString'
# Pass in all but one required param and check for a ValueError
req_param_dict = {
"name": name,
}
for param in req_param_dict.keys():
req_copy = {key: val if key != param else None for (key, val) in req_param_dict.items()}
with pytest.raises(ValueError):
service.get_gateway_location(**req_copy)
# endregion
##############################################################################
# End of Service: TransitLocation
##############################################################################
##############################################################################
# Start of Model Tests
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for ResourceGroupIdentity
#-----------------------------------------------------------------------------
class TestResourceGroupIdentity():
#--------------------------------------------------------
# Test serialization/deserialization for ResourceGroupIdentity
#--------------------------------------------------------
def test_resource_group_identity_serialization(self):
# Construct a json representation of a ResourceGroupIdentity model
resource_group_identity_model_json = {}
resource_group_identity_model_json['id'] = '56969d6043e9465c883cb9f7363e78e8'
# Construct a model instance of ResourceGroupIdentity by calling from_dict on the json representation
resource_group_identity_model = ResourceGroupIdentity.from_dict(resource_group_identity_model_json)
assert resource_group_identity_model != False
# Construct a second ResourceGroupIdentity instance from the attribute dict of a from_dict model
resource_group_identity_model_dict = ResourceGroupIdentity.from_dict(resource_group_identity_model_json).__dict__
resource_group_identity_model2 = ResourceGroupIdentity(**resource_group_identity_model_dict)
# Verify the model instances are equivalent
assert resource_group_identity_model == resource_group_identity_model2
# Convert model instance back to dict and verify no loss of data
resource_group_identity_model_json2 = resource_group_identity_model.to_dict()
assert resource_group_identity_model_json2 == resource_group_identity_model_json
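# This test and the model tests that follow all perform the same round trip:
#   json dict -> Model.from_dict() -> model instance
#   model.__dict__ -> Model(**attrs) -> second, equivalent instance
#   model.to_dict() -> json dict identical to the original
# Only the model class under test and its sample field values change.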
#-----------------------------------------------------------------------------
# Test Class for ResourceGroupReference
#-----------------------------------------------------------------------------
class TestResourceGroupReference():
#--------------------------------------------------------
# Test serialization/deserialization for ResourceGroupReference
#--------------------------------------------------------
def test_resource_group_reference_serialization(self):
# Construct a json representation of a ResourceGroupReference model
resource_group_reference_model_json = {}
resource_group_reference_model_json['id'] = '56969d6043e9465c883cb9f7363e78e8'
resource_group_reference_model_json['href'] = 'https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8'
# Construct a model instance of ResourceGroupReference by calling from_dict on the json representation
resource_group_reference_model = ResourceGroupReference.from_dict(resource_group_reference_model_json)
assert resource_group_reference_model != False
# Construct a second ResourceGroupReference instance from the attribute dict of a from_dict model
resource_group_reference_model_dict = ResourceGroupReference.from_dict(resource_group_reference_model_json).__dict__
resource_group_reference_model2 = ResourceGroupReference(**resource_group_reference_model_dict)
# Verify the model instances are equivalent
assert resource_group_reference_model == resource_group_reference_model2
# Convert model instance back to dict and verify no loss of data
resource_group_reference_model_json2 = resource_group_reference_model.to_dict()
assert resource_group_reference_model_json2 == resource_group_reference_model_json
#-----------------------------------------------------------------------------
# Test Class for TSCollection
#-----------------------------------------------------------------------------
class TestTSCollection():
#--------------------------------------------------------
# Test serialization/deserialization for TSCollection
#--------------------------------------------------------
def test_ts_collection_serialization(self):
# Construct dict forms of any model objects needed in order to build this model.
ts_location_basic_model = {} # TSLocationBasic
ts_location_basic_model['billing_location'] = 'us'
ts_location_basic_model['name'] = 'us-south'
ts_location_basic_model['type'] = 'region'
# Construct a json representation of a TSCollection model
ts_collection_model_json = {}
ts_collection_model_json['locations'] = [ts_location_basic_model]
# Construct a model instance of TSCollection by calling from_dict on the json representation
ts_collection_model = TSCollection.from_dict(ts_collection_model_json)
assert ts_collection_model != False
# Construct a second TSCollection instance from the attribute dict of a from_dict model
ts_collection_model_dict = TSCollection.from_dict(ts_collection_model_json).__dict__
ts_collection_model2 = TSCollection(**ts_collection_model_dict)
# Verify the model instances are equivalent
assert ts_collection_model == ts_collection_model2
# Convert model instance back to dict and verify no loss of data
ts_collection_model_json2 = ts_collection_model.to_dict()
assert ts_collection_model_json2 == ts_collection_model_json
#-----------------------------------------------------------------------------
# Test Class for TSLocalLocation
#-----------------------------------------------------------------------------
class TestTSLocalLocation():
#--------------------------------------------------------
# Test serialization/deserialization for TSLocalLocation
#--------------------------------------------------------
def test_ts_local_location_serialization(self):
# Construct a json representation of a TSLocalLocation model
ts_local_location_model_json = {}
ts_local_location_model_json['display_name'] = 'Dallas'
ts_local_location_model_json['name'] = 'us-south'
ts_local_location_model_json['type'] = 'region'
# Construct a model instance of TSLocalLocation by calling from_dict on the json representation
ts_local_location_model = TSLocalLocation.from_dict(ts_local_location_model_json)
assert ts_local_location_model != False
# Construct a second TSLocalLocation instance from the attribute dict of a from_dict model
ts_local_location_model_dict = TSLocalLocation.from_dict(ts_local_location_model_json).__dict__
ts_local_location_model2 = TSLocalLocation(**ts_local_location_model_dict)
# Verify the model instances are equivalent
assert ts_local_location_model == ts_local_location_model2
# Convert model instance back to dict and verify no loss of data
ts_local_location_model_json2 = ts_local_location_model.to_dict()
assert ts_local_location_model_json2 == ts_local_location_model_json
#-----------------------------------------------------------------------------
# Test Class for TSLocation
#-----------------------------------------------------------------------------
class TestTSLocation():
#--------------------------------------------------------
# Test serialization/deserialization for TSLocation
#--------------------------------------------------------
def test_ts_location_serialization(self):
# Construct dict forms of any model objects needed in order to build this model.
ts_local_location_model = {} # TSLocalLocation
ts_local_location_model['display_name'] = 'Dallas'
ts_local_location_model['name'] = 'us-south'
ts_local_location_model['type'] = 'region'
# Construct a json representation of a TSLocation model
ts_location_model_json = {}
ts_location_model_json['billing_location'] = 'us'
ts_location_model_json['name'] = 'us-south'
ts_location_model_json['type'] = 'region'
ts_location_model_json['local_connection_locations'] = [ts_local_location_model]
# Construct a model instance of TSLocation by calling from_dict on the json representation
ts_location_model = TSLocation.from_dict(ts_location_model_json)
assert ts_location_model != False
# Construct a second TSLocation instance from the attribute dict of a from_dict model
ts_location_model_dict = TSLocation.from_dict(ts_location_model_json).__dict__
ts_location_model2 = TSLocation(**ts_location_model_dict)
# Verify the model instances are equivalent
assert ts_location_model == ts_location_model2
# Convert model instance back to dict and verify no loss of data
ts_location_model_json2 = ts_location_model.to_dict()
assert ts_location_model_json2 == ts_location_model_json
#-----------------------------------------------------------------------------
# Test Class for TSLocationBasic
#-----------------------------------------------------------------------------
class TestTSLocationBasic():
#--------------------------------------------------------
# Test serialization/deserialization for TSLocationBasic
#--------------------------------------------------------
def test_ts_location_basic_serialization(self):
# Construct a json representation of a TSLocationBasic model
ts_location_basic_model_json = {}
ts_location_basic_model_json['billing_location'] = 'us'
ts_location_basic_model_json['name'] = 'us-south'
ts_location_basic_model_json['type'] = 'region'
# Construct a model instance of TSLocationBasic by calling from_dict on the json representation
ts_location_basic_model = TSLocationBasic.from_dict(ts_location_basic_model_json)
assert ts_location_basic_model != False
# Construct a second TSLocationBasic instance from the attribute dict of a from_dict model
ts_location_basic_model_dict = TSLocationBasic.from_dict(ts_location_basic_model_json).__dict__
ts_location_basic_model2 = TSLocationBasic(**ts_location_basic_model_dict)
# Verify the model instances are equivalent
assert ts_location_basic_model == ts_location_basic_model2
# Convert model instance back to dict and verify no loss of data
ts_location_basic_model_json2 = ts_location_basic_model.to_dict()
assert ts_location_basic_model_json2 == ts_location_basic_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitConnection
#-----------------------------------------------------------------------------
class TestTransitConnection():
#--------------------------------------------------------
# Test serialization/deserialization for TransitConnection
#--------------------------------------------------------
def test_transit_connection_serialization(self):
# Construct dict forms of any model objects needed in order to build this model.
transit_gateway_reference_model = {} # TransitGatewayReference
transit_gateway_reference_model['crn'] = 'crn:v1:bluemix:public:transit:us-south:a/123456::gateway:456f58c1-afe7-123a-0a0a-7f3d720f1a44'
transit_gateway_reference_model['id'] = '456f58c1-afe7-123a-0a0a-7f3d720f1a44'
transit_gateway_reference_model['name'] = 'my-transit-gw100'
zone_reference_model = {} # ZoneReference
zone_reference_model['name'] = 'us-south-1'
# Construct a json representation of a TransitConnection model
transit_connection_model_json = {}
transit_connection_model_json['base_connection_id'] = '975f58c1-afe7-469a-9727-7f3d720f2d32'
transit_connection_model_json['created_at'] = '2020-01-28T18:40:40.123456Z'
transit_connection_model_json['id'] = '1a15dca5-7e33-45e1-b7c5-bc690e569531'
transit_connection_model_json['local_bgp_asn'] = 64490
transit_connection_model_json['local_gateway_ip'] = '192.168.100.1'
transit_connection_model_json['local_tunnel_ip'] = '192.168.129.2'
transit_connection_model_json['mtu'] = 9000
transit_connection_model_json['name'] = 'Transit_Service_SJ_DL'
transit_connection_model_json['network_account_id'] = '28e4d90ac7504be694471ee66e70d0d5'
transit_connection_model_json['network_id'] = 'crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b'
transit_connection_model_json['network_type'] = 'vpc'
transit_connection_model_json['remote_bgp_asn'] = 65010
transit_connection_model_json['remote_gateway_ip'] = '10.242.63.12'
transit_connection_model_json['remote_tunnel_ip'] = '192.168.129.1'
transit_connection_model_json['request_status'] = 'pending'
transit_connection_model_json['status'] = 'attached'
transit_connection_model_json['transit_gateway'] = transit_gateway_reference_model
transit_connection_model_json['updated_at'] = '2020-01-28T18:40:40.123456Z'
transit_connection_model_json['zone'] = zone_reference_model
# Construct a model instance of TransitConnection by calling from_dict on the json representation
transit_connection_model = TransitConnection.from_dict(transit_connection_model_json)
assert transit_connection_model != False
# Construct a second TransitConnection instance from the attribute dict of a from_dict model
transit_connection_model_dict = TransitConnection.from_dict(transit_connection_model_json).__dict__
transit_connection_model2 = TransitConnection(**transit_connection_model_dict)
# Verify the model instances are equivalent
assert transit_connection_model == transit_connection_model2
# Convert model instance back to dict and verify no loss of data
transit_connection_model_json2 = transit_connection_model.to_dict()
assert transit_connection_model_json2 == transit_connection_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitConnectionCollection
#-----------------------------------------------------------------------------
class TestTransitConnectionCollection():
#--------------------------------------------------------
# Test serialization/deserialization for TransitConnectionCollection
#--------------------------------------------------------
def test_transit_connection_collection_serialization(self):
# Construct dict forms of any model objects needed in order to build this model.
transit_gateway_reference_model = {} # TransitGatewayReference
transit_gateway_reference_model['crn'] = 'crn:v1:bluemix:public:transit:us-south:a/123456::gateway:456f58c1-afe7-123a-0a0a-7f3d720f1a44'
transit_gateway_reference_model['id'] = '456f58c1-afe7-123a-0a0a-7f3d720f1a44'
transit_gateway_reference_model['name'] = 'my-transit-gw100'
zone_reference_model = {} # ZoneReference
zone_reference_model['name'] = 'us-south-1'
transit_connection_model = {} # TransitConnection
transit_connection_model['base_connection_id'] = '975f58c1-afe7-469a-9727-7f3d720f2d32'
transit_connection_model['created_at'] = '2020-01-28T18:40:40.123456Z'
transit_connection_model['id'] = '1a15dca5-7e33-45e1-b7c5-bc690e569531'
transit_connection_model['local_bgp_asn'] = 64490
transit_connection_model['local_gateway_ip'] = '192.168.100.1'
transit_connection_model['local_tunnel_ip'] = '192.168.129.2'
transit_connection_model['mtu'] = 9000
transit_connection_model['name'] = 'Transit_Service_SJ_DL'
transit_connection_model['network_account_id'] = '28e4d90ac7504be694471ee66e70d0d5'
transit_connection_model['network_id'] = 'crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b'
transit_connection_model['network_type'] = 'vpc'
transit_connection_model['remote_bgp_asn'] = 65010
transit_connection_model['remote_gateway_ip'] = '10.242.63.12'
transit_connection_model['remote_tunnel_ip'] = '192.168.129.1'
transit_connection_model['request_status'] = 'pending'
transit_connection_model['status'] = 'attached'
transit_connection_model['transit_gateway'] = transit_gateway_reference_model
transit_connection_model['updated_at'] = '2020-01-28T18:40:40.123456Z'
transit_connection_model['zone'] = zone_reference_model
transit_connection_collection_first_model = {} # TransitConnectionCollectionFirst
transit_connection_collection_first_model['href'] = 'https://transit.cloud.ibm.com/v1/connections?limit=50'
transit_connection_collection_next_model = {} # TransitConnectionCollectionNext
transit_connection_collection_next_model['href'] = 'https://transit.cloud.ibm.com/v1/connections?start=MjAyMC0wNS0wOVQxNjoyMDoyMC4yMjQ5NzNa&limit=50'
transit_connection_collection_next_model['start'] = 'MjAyMC0wNS0wOVQxNjoyMDoyMC4yMjQ5NzNa'
# Construct a json representation of a TransitConnectionCollection model
transit_connection_collection_model_json = {}
transit_connection_collection_model_json['connections'] = [transit_connection_model]
transit_connection_collection_model_json['first'] = transit_connection_collection_first_model
transit_connection_collection_model_json['limit'] = 50
transit_connection_collection_model_json['next'] = transit_connection_collection_next_model
# Construct a model instance of TransitConnectionCollection by calling from_dict on the json representation
transit_connection_collection_model = TransitConnectionCollection.from_dict(transit_connection_collection_model_json)
assert transit_connection_collection_model != False
# Construct a second TransitConnectionCollection instance from the attribute dict of a from_dict model
transit_connection_collection_model_dict = TransitConnectionCollection.from_dict(transit_connection_collection_model_json).__dict__
transit_connection_collection_model2 = TransitConnectionCollection(**transit_connection_collection_model_dict)
# Verify the model instances are equivalent
assert transit_connection_collection_model == transit_connection_collection_model2
# Convert model instance back to dict and verify no loss of data
transit_connection_collection_model_json2 = transit_connection_collection_model.to_dict()
assert transit_connection_collection_model_json2 == transit_connection_collection_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitConnectionCollectionFirst
#-----------------------------------------------------------------------------
class TestTransitConnectionCollectionFirst():
#--------------------------------------------------------
# Test serialization/deserialization for TransitConnectionCollectionFirst
#--------------------------------------------------------
def test_transit_connection_collection_first_serialization(self):
# Construct a json representation of a TransitConnectionCollectionFirst model
transit_connection_collection_first_model_json = {}
transit_connection_collection_first_model_json['href'] = 'https://transit.cloud.ibm.com/v1/connections?limit=50'
# Construct a model instance of TransitConnectionCollectionFirst by calling from_dict on the json representation
transit_connection_collection_first_model = TransitConnectionCollectionFirst.from_dict(transit_connection_collection_first_model_json)
assert transit_connection_collection_first_model != False
# Construct a second TransitConnectionCollectionFirst instance from the attribute dict of a from_dict model
transit_connection_collection_first_model_dict = TransitConnectionCollectionFirst.from_dict(transit_connection_collection_first_model_json).__dict__
transit_connection_collection_first_model2 = TransitConnectionCollectionFirst(**transit_connection_collection_first_model_dict)
# Verify the model instances are equivalent
assert transit_connection_collection_first_model == transit_connection_collection_first_model2
# Convert model instance back to dict and verify no loss of data
transit_connection_collection_first_model_json2 = transit_connection_collection_first_model.to_dict()
assert transit_connection_collection_first_model_json2 == transit_connection_collection_first_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitConnectionCollectionNext
#-----------------------------------------------------------------------------
class TestTransitConnectionCollectionNext():
#--------------------------------------------------------
# Test serialization/deserialization for TransitConnectionCollectionNext
#--------------------------------------------------------
def test_transit_connection_collection_next_serialization(self):
# Construct a json representation of a TransitConnectionCollectionNext model
transit_connection_collection_next_model_json = {}
transit_connection_collection_next_model_json['href'] = 'https://transit.cloud.ibm.com/v1/connections?start=MjAyMC0wNS0wOVQxNjoyMDoyMC4yMjQ5NzNa&limit=50'
transit_connection_collection_next_model_json['start'] = 'MjAyMC0wNS0wOVQxNjoyMDoyMC4yMjQ5NzNa'
# Construct a model instance of TransitConnectionCollectionNext by calling from_dict on the json representation
transit_connection_collection_next_model = TransitConnectionCollectionNext.from_dict(transit_connection_collection_next_model_json)
assert transit_connection_collection_next_model != False
# Construct a second TransitConnectionCollectionNext instance from the attribute dict of a from_dict model
transit_connection_collection_next_model_dict = TransitConnectionCollectionNext.from_dict(transit_connection_collection_next_model_json).__dict__
transit_connection_collection_next_model2 = TransitConnectionCollectionNext(**transit_connection_collection_next_model_dict)
# Verify the model instances are equivalent
assert transit_connection_collection_next_model == transit_connection_collection_next_model2
# Convert model instance back to dict and verify no loss of data
transit_connection_collection_next_model_json2 = transit_connection_collection_next_model.to_dict()
assert transit_connection_collection_next_model_json2 == transit_connection_collection_next_model_json
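# The 'start' value above serves as an opaque pagination token; note that the
# same token appears in the start= query parameter of the 'next' href, which
# is how the collection models in these tests link successive pages.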
#-----------------------------------------------------------------------------
# Test Class for TransitGateway
#-----------------------------------------------------------------------------
class TestTransitGateway():
#--------------------------------------------------------
# Test serialization/deserialization for TransitGateway
#--------------------------------------------------------
def test_transit_gateway_serialization(self):
# Construct dict forms of any model objects needed in order to build this model.
resource_group_reference_model = {} # ResourceGroupReference
resource_group_reference_model['id'] = '56969d6043e9465c883cb9f7363e78e8'
resource_group_reference_model['href'] = 'https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8'
# Construct a json representation of a TransitGateway model
transit_gateway_model_json = {}
transit_gateway_model_json['id'] = 'ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4'
transit_gateway_model_json['crn'] = 'crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4'
transit_gateway_model_json['name'] = 'my-transit-gateway-in-TransitGateway'
transit_gateway_model_json['location'] = 'us-south'
transit_gateway_model_json['created_at'] = '2020-01-28T18:40:40.123456Z'
transit_gateway_model_json['global'] = True
transit_gateway_model_json['resource_group'] = resource_group_reference_model
transit_gateway_model_json['status'] = 'available'
transit_gateway_model_json['updated_at'] = '2020-01-28T18:40:40.123456Z'
# Construct a model instance of TransitGateway by calling from_dict on the json representation
transit_gateway_model = TransitGateway.from_dict(transit_gateway_model_json)
assert transit_gateway_model != False
# Construct a second TransitGateway instance from the attribute dict of a from_dict model
transit_gateway_model_dict = TransitGateway.from_dict(transit_gateway_model_json).__dict__
transit_gateway_model2 = TransitGateway(**transit_gateway_model_dict)
# Verify the model instances are equivalent
assert transit_gateway_model == transit_gateway_model2
# Convert model instance back to dict and verify no loss of data
transit_gateway_model_json2 = transit_gateway_model.to_dict()
assert transit_gateway_model_json2 == transit_gateway_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitGatewayCollection
#-----------------------------------------------------------------------------
class TestTransitGatewayCollection():
    #--------------------------------------------------------
    # Test serialization/deserialization for TransitGatewayCollection
    #--------------------------------------------------------
    def test_transit_gateway_collection_serialization(self):
        # Construct dict forms of any model objects needed in order to build this model.
        resource_group_reference_model = {} # ResourceGroupReference
        resource_group_reference_model['id'] = '56969d6043e9465c883cb9f7363e78e8'
        resource_group_reference_model['href'] = 'https://resource-manager.bluemix.net/v1/resource_groups/56969d6043e9465c883cb9f7363e78e8'
        transit_gateway_model = {} # TransitGateway
        transit_gateway_model['id'] = 'ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4'
        transit_gateway_model['crn'] = 'crn:v1:bluemix:public:transit:dal03:a/57a7d05f36894e3cb9b46a43556d903e::gateway:ef4dcb1a-fee4-41c7-9e11-9cd99e65c1f4'
        transit_gateway_model['name'] = 'my-transit-gateway-in-TransitGateway'
        transit_gateway_model['location'] = 'us-south'
        transit_gateway_model['created_at'] = '2020-01-28T18:40:40.123456Z'
        transit_gateway_model['global'] = True
        transit_gateway_model['resource_group'] = resource_group_reference_model
        transit_gateway_model['status'] = 'available'
        transit_gateway_model['updated_at'] = '2020-01-28T18:40:40.123456Z'
        transit_gateway_collection_first_model = {} # TransitGatewayCollectionFirst
        transit_gateway_collection_first_model['href'] = 'https://transit.cloud.ibm.com/v1/transit_gateways?limit=50'
        transit_gateway_collection_next_model = {} # TransitGatewayCollectionNext
        transit_gateway_collection_next_model['href'] = 'https://transit.cloud.ibm.com/v1/transit_gateways?start=MjAyMC0wNS0wOFQxNDoxNzowMy45NzQ5NzNa&limit=50'
        transit_gateway_collection_next_model['start'] = 'MjAyMC0wNS0wOFQxNDoxNzowMy45NzQ5NzNa'
        # Construct a json representation of a TransitGatewayCollection model
        transit_gateway_collection_model_json = {}
        transit_gateway_collection_model_json['first'] = transit_gateway_collection_first_model
        transit_gateway_collection_model_json['limit'] = 50
        transit_gateway_collection_model_json['next'] = transit_gateway_collection_next_model
        transit_gateway_collection_model_json['transit_gateways'] = [transit_gateway_model]
        # Construct a model instance of TransitGatewayCollection by calling from_dict on the json representation
        transit_gateway_collection_model = TransitGatewayCollection.from_dict(transit_gateway_collection_model_json)
        assert transit_gateway_collection_model != False
        # Construct a model instance of TransitGatewayCollection by calling from_dict on the json representation
        transit_gateway_collection_model_dict = TransitGatewayCollection.from_dict(transit_gateway_collection_model_json).__dict__
        transit_gateway_collection_model2 = TransitGatewayCollection(**transit_gateway_collection_model_dict)
        # Verify the model instances are equivalent
        assert transit_gateway_collection_model == transit_gateway_collection_model2
        # Convert model instance back to dict and verify no loss of data
        transit_gateway_collection_model_json2 = transit_gateway_collection_model.to_dict()
        assert transit_gateway_collection_model_json2 == transit_gateway_collection_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitGatewayCollectionFirst
#-----------------------------------------------------------------------------
class TestTransitGatewayCollectionFirst():
    #--------------------------------------------------------
    # Test serialization/deserialization for TransitGatewayCollectionFirst
    #--------------------------------------------------------
    def test_transit_gateway_collection_first_serialization(self):
        # Construct a json representation of a TransitGatewayCollectionFirst model
        transit_gateway_collection_first_model_json = {}
        transit_gateway_collection_first_model_json['href'] = 'https://transit.cloud.ibm.com/v1/transit_gateways?limit=50'
        # Construct a model instance of TransitGatewayCollectionFirst by calling from_dict on the json representation
        transit_gateway_collection_first_model = TransitGatewayCollectionFirst.from_dict(transit_gateway_collection_first_model_json)
        assert transit_gateway_collection_first_model != False
        # Construct a model instance of TransitGatewayCollectionFirst by calling from_dict on the json representation
        transit_gateway_collection_first_model_dict = TransitGatewayCollectionFirst.from_dict(transit_gateway_collection_first_model_json).__dict__
        transit_gateway_collection_first_model2 = TransitGatewayCollectionFirst(**transit_gateway_collection_first_model_dict)
        # Verify the model instances are equivalent
        assert transit_gateway_collection_first_model == transit_gateway_collection_first_model2
        # Convert model instance back to dict and verify no loss of data
        transit_gateway_collection_first_model_json2 = transit_gateway_collection_first_model.to_dict()
        assert transit_gateway_collection_first_model_json2 == transit_gateway_collection_first_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitGatewayCollectionNext
#-----------------------------------------------------------------------------
class TestTransitGatewayCollectionNext():
    #--------------------------------------------------------
    # Test serialization/deserialization for TransitGatewayCollectionNext
    #--------------------------------------------------------
    def test_transit_gateway_collection_next_serialization(self):
        # Construct a json representation of a TransitGatewayCollectionNext model
        transit_gateway_collection_next_model_json = {}
        transit_gateway_collection_next_model_json['href'] = 'https://transit.cloud.ibm.com/v1/transit_gateways?start=MjAyMC0wNS0wOFQxNDoxNzowMy45NzQ5NzNa&limit=50'
        transit_gateway_collection_next_model_json['start'] = 'MjAyMC0wNS0wOFQxNDoxNzowMy45NzQ5NzNa'
        # Construct a model instance of TransitGatewayCollectionNext by calling from_dict on the json representation
        transit_gateway_collection_next_model = TransitGatewayCollectionNext.from_dict(transit_gateway_collection_next_model_json)
        assert transit_gateway_collection_next_model != False
        # Construct a model instance of TransitGatewayCollectionNext by calling from_dict on the json representation
        transit_gateway_collection_next_model_dict = TransitGatewayCollectionNext.from_dict(transit_gateway_collection_next_model_json).__dict__
        transit_gateway_collection_next_model2 = TransitGatewayCollectionNext(**transit_gateway_collection_next_model_dict)
        # Verify the model instances are equivalent
        assert transit_gateway_collection_next_model == transit_gateway_collection_next_model2
        # Convert model instance back to dict and verify no loss of data
        transit_gateway_collection_next_model_json2 = transit_gateway_collection_next_model.to_dict()
        assert transit_gateway_collection_next_model_json2 == transit_gateway_collection_next_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitGatewayConnectionCollection
#-----------------------------------------------------------------------------
class TestTransitGatewayConnectionCollection():
    #--------------------------------------------------------
    # Test serialization/deserialization for TransitGatewayConnectionCollection
    #--------------------------------------------------------
    def test_transit_gateway_connection_collection_serialization(self):
        # Construct dict forms of any model objects needed in order to build this model.
        transit_gateway_connection_cust_zone_model = {} # TransitGatewayConnectionCustZone
        transit_gateway_connection_cust_zone_model['name'] = 'us-south-1'
        transit_gateway_connection_cust_model = {} # TransitGatewayConnectionCust
        transit_gateway_connection_cust_model['name'] = 'Transit_Service_BWTN_SJ_DL'
        transit_gateway_connection_cust_model['network_id'] = 'crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b'
        transit_gateway_connection_cust_model['network_type'] = 'vpc'
        transit_gateway_connection_cust_model['id'] = '1a15dca5-7e33-45e1-b7c5-bc690e569531'
        transit_gateway_connection_cust_model['base_connection_id'] = '975f58c1-afe7-469a-9727-7f3d720f2d32'
        transit_gateway_connection_cust_model['created_at'] = '2020-01-28T18:40:40.123456Z'
        transit_gateway_connection_cust_model['local_bgp_asn'] = 64490
        transit_gateway_connection_cust_model['local_gateway_ip'] = '192.168.100.1'
        transit_gateway_connection_cust_model['local_tunnel_ip'] = '192.168.129.2'
        transit_gateway_connection_cust_model['mtu'] = 9000
        transit_gateway_connection_cust_model['network_account_id'] = '28e4d90ac7504be694471ee66e70d0d5'
        transit_gateway_connection_cust_model['remote_bgp_asn'] = 65010
        transit_gateway_connection_cust_model['remote_gateway_ip'] = '10.242.63.12'
        transit_gateway_connection_cust_model['remote_tunnel_ip'] = '192.168.129.1'
        transit_gateway_connection_cust_model['request_status'] = 'pending'
        transit_gateway_connection_cust_model['status'] = 'attached'
        transit_gateway_connection_cust_model['updated_at'] = '2020-01-28T18:40:40.123456Z'
        transit_gateway_connection_cust_model['zone'] = transit_gateway_connection_cust_zone_model
        # Construct a json representation of a TransitGatewayConnectionCollection model
        transit_gateway_connection_collection_model_json = {}
        transit_gateway_connection_collection_model_json['connections'] = [transit_gateway_connection_cust_model]
        # Construct a model instance of TransitGatewayConnectionCollection by calling from_dict on the json representation
        transit_gateway_connection_collection_model = TransitGatewayConnectionCollection.from_dict(transit_gateway_connection_collection_model_json)
        assert transit_gateway_connection_collection_model != False
        # Construct a model instance of TransitGatewayConnectionCollection by calling from_dict on the json representation
        transit_gateway_connection_collection_model_dict = TransitGatewayConnectionCollection.from_dict(transit_gateway_connection_collection_model_json).__dict__
        transit_gateway_connection_collection_model2 = TransitGatewayConnectionCollection(**transit_gateway_connection_collection_model_dict)
        # Verify the model instances are equivalent
        assert transit_gateway_connection_collection_model == transit_gateway_connection_collection_model2
        # Convert model instance back to dict and verify no loss of data
        transit_gateway_connection_collection_model_json2 = transit_gateway_connection_collection_model.to_dict()
        assert transit_gateway_connection_collection_model_json2 == transit_gateway_connection_collection_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitGatewayConnectionCust
#-----------------------------------------------------------------------------
class TestTransitGatewayConnectionCust():
    #--------------------------------------------------------
    # Test serialization/deserialization for TransitGatewayConnectionCust
    #--------------------------------------------------------
    def test_transit_gateway_connection_cust_serialization(self):
        # Construct dict forms of any model objects needed in order to build this model.
        transit_gateway_connection_cust_zone_model = {} # TransitGatewayConnectionCustZone
        transit_gateway_connection_cust_zone_model['name'] = 'us-south-1'
        # Construct a json representation of a TransitGatewayConnectionCust model
        transit_gateway_connection_cust_model_json = {}
        transit_gateway_connection_cust_model_json['name'] = 'Transit_Service_BWTN_SJ_DL'
        transit_gateway_connection_cust_model_json['network_id'] = 'crn:v1:bluemix:public:is:us-south:a/123456::vpc:4727d842-f94f-4a2d-824a-9bc9b02c523b'
        transit_gateway_connection_cust_model_json['network_type'] = 'vpc'
        transit_gateway_connection_cust_model_json['id'] = '1a15dca5-7e33-45e1-b7c5-bc690e569531'
        transit_gateway_connection_cust_model_json['base_connection_id'] = '975f58c1-afe7-469a-9727-7f3d720f2d32'
        transit_gateway_connection_cust_model_json['created_at'] = '2020-01-28T18:40:40.123456Z'
        transit_gateway_connection_cust_model_json['local_bgp_asn'] = 64490
        transit_gateway_connection_cust_model_json['local_gateway_ip'] = '192.168.100.1'
        transit_gateway_connection_cust_model_json['local_tunnel_ip'] = '192.168.129.2'
        transit_gateway_connection_cust_model_json['mtu'] = 9000
        transit_gateway_connection_cust_model_json['network_account_id'] = '28e4d90ac7504be694471ee66e70d0d5'
        transit_gateway_connection_cust_model_json['remote_bgp_asn'] = 65010
        transit_gateway_connection_cust_model_json['remote_gateway_ip'] = '10.242.63.12'
        transit_gateway_connection_cust_model_json['remote_tunnel_ip'] = '192.168.129.1'
        transit_gateway_connection_cust_model_json['request_status'] = 'pending'
        transit_gateway_connection_cust_model_json['status'] = 'attached'
        transit_gateway_connection_cust_model_json['updated_at'] = '2020-01-28T18:40:40.123456Z'
        transit_gateway_connection_cust_model_json['zone'] = transit_gateway_connection_cust_zone_model
        # Construct a model instance of TransitGatewayConnectionCust by calling from_dict on the json representation
        transit_gateway_connection_cust_model = TransitGatewayConnectionCust.from_dict(transit_gateway_connection_cust_model_json)
        assert transit_gateway_connection_cust_model != False
        # Construct a model instance of TransitGatewayConnectionCust by calling from_dict on the json representation
        transit_gateway_connection_cust_model_dict = TransitGatewayConnectionCust.from_dict(transit_gateway_connection_cust_model_json).__dict__
        transit_gateway_connection_cust_model2 = TransitGatewayConnectionCust(**transit_gateway_connection_cust_model_dict)
        # Verify the model instances are equivalent
        assert transit_gateway_connection_cust_model == transit_gateway_connection_cust_model2
        # Convert model instance back to dict and verify no loss of data
        transit_gateway_connection_cust_model_json2 = transit_gateway_connection_cust_model.to_dict()
        assert transit_gateway_connection_cust_model_json2 == transit_gateway_connection_cust_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitGatewayConnectionCustZone
#-----------------------------------------------------------------------------
class TestTransitGatewayConnectionCustZone():
    #--------------------------------------------------------
    # Test serialization/deserialization for TransitGatewayConnectionCustZone
    #--------------------------------------------------------
    def test_transit_gateway_connection_cust_zone_serialization(self):
        # Construct a json representation of a TransitGatewayConnectionCustZone model
        transit_gateway_connection_cust_zone_model_json = {}
        transit_gateway_connection_cust_zone_model_json['name'] = 'us-south-1'
        # Construct a model instance of TransitGatewayConnectionCustZone by calling from_dict on the json representation
        transit_gateway_connection_cust_zone_model = TransitGatewayConnectionCustZone.from_dict(transit_gateway_connection_cust_zone_model_json)
        assert transit_gateway_connection_cust_zone_model != False
        # Construct a model instance of TransitGatewayConnectionCustZone by calling from_dict on the json representation
        transit_gateway_connection_cust_zone_model_dict = TransitGatewayConnectionCustZone.from_dict(transit_gateway_connection_cust_zone_model_json).__dict__
        transit_gateway_connection_cust_zone_model2 = TransitGatewayConnectionCustZone(**transit_gateway_connection_cust_zone_model_dict)
        # Verify the model instances are equivalent
        assert transit_gateway_connection_cust_zone_model == transit_gateway_connection_cust_zone_model2
        # Convert model instance back to dict and verify no loss of data
        transit_gateway_connection_cust_zone_model_json2 = transit_gateway_connection_cust_zone_model.to_dict()
        assert transit_gateway_connection_cust_zone_model_json2 == transit_gateway_connection_cust_zone_model_json
#-----------------------------------------------------------------------------
# Test Class for TransitGatewayReference
#-----------------------------------------------------------------------------
class TestTransitGatewayReference():
    #--------------------------------------------------------
    # Test serialization/deserialization for TransitGatewayReference
    #--------------------------------------------------------
    def test_transit_gateway_reference_serialization(self):
        # Construct a json representation of a TransitGatewayReference model
        transit_gateway_reference_model_json = {}
        transit_gateway_reference_model_json['crn'] = 'crn:v1:bluemix:public:transit:us-south:a/123456::gateway:456f58c1-afe7-123a-0a0a-7f3d720f1a44'
        transit_gateway_reference_model_json['id'] = '456f58c1-afe7-123a-0a0a-7f3d720f1a44'
        transit_gateway_reference_model_json['name'] = 'my-transit-gw100'
        # Construct a model instance of TransitGatewayReference by calling from_dict on the json representation
        transit_gateway_reference_model = TransitGatewayReference.from_dict(transit_gateway_reference_model_json)
        assert transit_gateway_reference_model != False
        # Construct a model instance of TransitGatewayReference by calling from_dict on the json representation
        transit_gateway_reference_model_dict = TransitGatewayReference.from_dict(transit_gateway_reference_model_json).__dict__
        transit_gateway_reference_model2 = TransitGatewayReference(**transit_gateway_reference_model_dict)
        # Verify the model instances are equivalent
        assert transit_gateway_reference_model == transit_gateway_reference_model2
        # Convert model instance back to dict and verify no loss of data
        transit_gateway_reference_model_json2 = transit_gateway_reference_model.to_dict()
        assert transit_gateway_reference_model_json2 == transit_gateway_reference_model_json
#-----------------------------------------------------------------------------
# Test Class for ZoneReference
#-----------------------------------------------------------------------------
class TestZoneReference():
    #--------------------------------------------------------
    # Test serialization/deserialization for ZoneReference
    #--------------------------------------------------------
    def test_zone_reference_serialization(self):
        # Construct a json representation of a ZoneReference model
        zone_reference_model_json = {}
        zone_reference_model_json['name'] = 'us-south-1'
        # Construct a model instance of ZoneReference by calling from_dict on the json representation
        zone_reference_model = ZoneReference.from_dict(zone_reference_model_json)
        assert zone_reference_model != False
        # Construct a model instance of ZoneReference by calling from_dict on the json representation
        zone_reference_model_dict = ZoneReference.from_dict(zone_reference_model_json).__dict__
        zone_reference_model2 = ZoneReference(**zone_reference_model_dict)
        # Verify the model instances are equivalent
        assert zone_reference_model == zone_reference_model2
        # Convert model instance back to dict and verify no loss of data
        zone_reference_model_json2 = zone_reference_model.to_dict()
        assert zone_reference_model_json2 == zone_reference_model_json
#-----------------------------------------------------------------------------
# Test Class for ZoneIdentityByName
#-----------------------------------------------------------------------------
class TestZoneIdentityByName():
    #--------------------------------------------------------
    # Test serialization/deserialization for ZoneIdentityByName
    #--------------------------------------------------------
    def test_zone_identity_by_name_serialization(self):
        # Construct a json representation of a ZoneIdentityByName model
        zone_identity_by_name_model_json = {}
        zone_identity_by_name_model_json['name'] = 'us-south-1'
        # Construct a model instance of ZoneIdentityByName by calling from_dict on the json representation
        zone_identity_by_name_model = ZoneIdentityByName.from_dict(zone_identity_by_name_model_json)
        assert zone_identity_by_name_model != False
        # Construct a model instance of ZoneIdentityByName by calling from_dict on the json representation
        zone_identity_by_name_model_dict = ZoneIdentityByName.from_dict(zone_identity_by_name_model_json).__dict__
        zone_identity_by_name_model2 = ZoneIdentityByName(**zone_identity_by_name_model_dict)
        # Verify the model instances are equivalent
        assert zone_identity_by_name_model == zone_identity_by_name_model2
        # Convert model instance back to dict and verify no loss of data
        zone_identity_by_name_model_json2 = zone_identity_by_name_model.to_dict()
        assert zone_identity_by_name_model_json2 == zone_identity_by_name_model_json
# endregion
##############################################################################
# End of Model Tests
##############################################################################
| 54.990826 | 1,190 | 0.625042 | 11,103 | 107,892 | 5.762046 | 0.033414 | 0.072871 | 0.047268 | 0.032825 | 0.899619 | 0.867028 | 0.820823 | 0.770555 | 0.730008 | 0.668423 | 0 | 0.060282 | 0.176352 | 107,892 | 1,961 | 1,191 | 55.018868 | 0.659641 | 0.259074 | 0 | 0.532455 | 0 | 0.037629 | 0.2995 | 0.134502 | 0 | 0 | 0 | 0 | 0.109125 | 1 | 0.060207 | false | 0 | 0.009407 | 0 | 0.12794 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c6bb23fc81eece41d1c4a03a23a562f541bbc8cb | 155 | py | Python | geocontrib/admin/__init__.py | hcharp/geocontrib | 87ee241c737aae23eff358d2550bddba714f9c7b | ["Apache-2.0"] | 3 | 2020-12-02T09:44:41.000Z | 2021-04-17T13:05:30.000Z | geocontrib/admin/__init__.py | hcharp/geocontrib | 87ee241c737aae23eff358d2550bddba714f9c7b | ["Apache-2.0"] | 14 | 2020-01-27T09:49:33.000Z | 2021-06-14T08:04:10.000Z | geocontrib/admin/__init__.py | hcharp/geocontrib | 87ee241c737aae23eff358d2550bddba714f9c7b | ["Apache-2.0"] | 9 | 2020-01-16T12:37:39.000Z | 2021-04-22T09:57:59.000Z | from geocontrib.admin.feature import *
from geocontrib.admin.flat_page import *
from geocontrib.admin.project import *
from geocontrib.admin.user import *
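# Descriptive note (assumption based on the package layout, not taken from the
# original file): the star imports above appear to pull the Django admin
# registrations from the feature, flat_page, project and user submodules into
# geocontrib.admin, so importing this package registers all of them at once.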
| 31 | 40 | 0.819355 | 21 | 155 | 6 | 0.428571 | 0.444444 | 0.603175 | 0.595238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103226 | 155 | 4 | 41 | 38.75 | 0.906475 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
05a13edd3e2ed6d8f3ae7aaa6c59706f2d5dbb49 | 3,730 | py | Python | abiflows/fireworks/utils/custodian_utils.py | gmatteo/abiflows | bde67408b024c34c17d78c9fcc672b42be71e1e5 | ["BSD-3-Clause"] | 4 | 2018-12-13T09:12:17.000Z | 2019-06-14T15:16:08.000Z | abiflows/fireworks/utils/custodian_utils.py | gmatteo/abiflows | bde67408b024c34c17d78c9fcc672b42be71e1e5 | ["BSD-3-Clause"] | 3 | 2018-01-03T17:08:22.000Z | 2018-08-23T22:29:41.000Z | abiflows/fireworks/utils/custodian_utils.py | abinit/abiflows | bde67408b024c34c17d78c9fcc672b42be71e1e5 | ["BSD-3-Clause"] | 13 | 2017-07-22T01:05:20.000Z | 2021-06-08T10:57:53.000Z | import abc
from custodian.custodian import ErrorHandler, Validator
#TODO: do we stick to custodian's ErrorHandler/Validator inheritance ??
class SRCErrorHandler(ErrorHandler):

    HANDLER_PRIORITIES = {'PRIORITY_FIRST': 0,
                          'PRIORITY_VERY_HIGH': 1,
                          'PRIORITY_HIGH': 2,
                          'PRIORITY_MEDIUM': 3,
                          'PRIORITY_LOW': 4,
                          'PRIORITY_VERY_LOW': 5,
                          'PRIORITY_LAST': 6}
    PRIORITY_FIRST = HANDLER_PRIORITIES['PRIORITY_FIRST']
    PRIORITY_VERY_HIGH = HANDLER_PRIORITIES['PRIORITY_VERY_HIGH']
    PRIORITY_HIGH = HANDLER_PRIORITIES['PRIORITY_HIGH']
    PRIORITY_MEDIUM = HANDLER_PRIORITIES['PRIORITY_MEDIUM']
    PRIORITY_LOW = HANDLER_PRIORITIES['PRIORITY_LOW']
    PRIORITY_VERY_LOW = HANDLER_PRIORITIES['PRIORITY_VERY_LOW']
    PRIORITY_LAST = HANDLER_PRIORITIES['PRIORITY_LAST']

    def __init__(self):
        self.fw_spec = None
        self.fw_to_check = None

    @abc.abstractmethod
    def as_dict(self):
        pass

    @abc.abstractmethod
    def from_dict(cls, d):
        pass

    @abc.abstractmethod
    def setup(self):
        pass

    def set_fw_spec(self, fw_spec):
        self.fw_spec = fw_spec

    def set_fw_to_check(self, fw_to_check):
        self.fw_to_check = fw_to_check

    def src_setup(self, fw_spec, fw_to_check):
        self.set_fw_spec(fw_spec=fw_spec)
        self.set_fw_to_check(fw_to_check=fw_to_check)
        self.setup()

    @abc.abstractproperty
    def handler_priority(self):
        pass

    @property
    def skip_remaining_handlers(self):
        return False

    @abc.abstractproperty
    def allow_fizzled(self):
        pass

    @abc.abstractproperty
    def allow_completed(self):
        pass

    @abc.abstractmethod
    def has_corrections(self):
        pass


class MonitoringSRCErrorHandler(ErrorHandler):

    HANDLER_PRIORITIES = {'PRIORITY_FIRST': 0,
                          'PRIORITY_VERY_HIGH': 1,
                          'PRIORITY_HIGH': 2,
                          'PRIORITY_MEDIUM': 3,
                          'PRIORITY_LOW': 4,
                          'PRIORITY_VERY_LOW': 5,
                          'PRIORITY_LAST': 6}
    PRIORITY_FIRST = HANDLER_PRIORITIES['PRIORITY_FIRST']
    PRIORITY_VERY_HIGH = HANDLER_PRIORITIES['PRIORITY_VERY_HIGH']
    PRIORITY_HIGH = HANDLER_PRIORITIES['PRIORITY_HIGH']
    PRIORITY_MEDIUM = HANDLER_PRIORITIES['PRIORITY_MEDIUM']
    PRIORITY_LOW = HANDLER_PRIORITIES['PRIORITY_LOW']
    PRIORITY_VERY_LOW = HANDLER_PRIORITIES['PRIORITY_VERY_LOW']
    PRIORITY_LAST = HANDLER_PRIORITIES['PRIORITY_LAST']

    @abc.abstractmethod
    def as_dict(self):
        pass

    @abc.abstractmethod
    def from_dict(cls, d):
        pass

    @abc.abstractproperty
    def handler_priority(self):
        pass

    @property
    def skip_remaining_handlers(self):
        return False


class SRCValidator(Validator):

    HANDLER_PRIORITIES = {'PRIORITY_FIRST': 0,
                          'PRIORITY_VERY_HIGH': 1,
                          'PRIORITY_HIGH': 2,
                          'PRIORITY_MEDIUM': 3,
                          'PRIORITY_LOW': 4,
                          'PRIORITY_VERY_LOW': 5,
                          'PRIORITY_LAST': 6}
    PRIORITY_FIRST = HANDLER_PRIORITIES['PRIORITY_FIRST']
    PRIORITY_VERY_HIGH = HANDLER_PRIORITIES['PRIORITY_VERY_HIGH']
    PRIORITY_HIGH = HANDLER_PRIORITIES['PRIORITY_HIGH']
    PRIORITY_MEDIUM = HANDLER_PRIORITIES['PRIORITY_MEDIUM']
    PRIORITY_LOW = HANDLER_PRIORITIES['PRIORITY_LOW']
    PRIORITY_VERY_LOW = HANDLER_PRIORITIES['PRIORITY_VERY_LOW']
    PRIORITY_LAST = HANDLER_PRIORITIES['PRIORITY_LAST']

    pass
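# Illustrative sketch, not part of the original module: a minimal concrete
# handler showing one way to satisfy the abstract SRCErrorHandler interface
# declared above. The class name ExampleNoopHandler and its trivial behaviour
# are hypothetical; check() and correct() follow custodian's ErrorHandler API.
class ExampleNoopHandler(SRCErrorHandler):

    def as_dict(self):
        # Minimal serializable representation of the handler.
        return {'@module': self.__class__.__module__,
                '@class': self.__class__.__name__}

    @classmethod
    def from_dict(cls, d):
        return cls()

    def setup(self):
        # Nothing to prepare for this no-op example.
        pass

    def check(self):
        # Never reports an error in this sketch.
        return False

    def correct(self):
        # No corrections are ever applied.
        return {'errors': [], 'actions': []}

    def has_corrections(self):
        return False

    @property
    def handler_priority(self):
        return self.PRIORITY_MEDIUM

    @property
    def allow_fizzled(self):
        return False

    @property
    def allow_completed(self):
        return False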
| 30.325203 | 71 | 0.630563 | 398 | 3,730 | 5.522613 | 0.153266 | 0.185623 | 0.272975 | 0.081893 | 0.792539 | 0.767971 | 0.751592 | 0.735669 | 0.735669 | 0.735669 | 0 | 0.007898 | 0.287131 | 3,730 | 122 | 72 | 30.57377 | 0.818729 | 0.018767 | 0 | 0.789474 | 0 | 0 | 0.167259 | 0 | 0 | 0 | 0 | 0.008197 | 0 | 1 | 0.168421 | false | 0.115789 | 0.021053 | 0.021053 | 0.494737 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
05f5b9c9ed11548c6ff71baab3942890158f09c8 | 93,295 | py | Python | nova/tests/unit/virt/test_block_device.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | ["Apache-2.0"] | null | null | null | nova/tests/unit/virt/test_block_device.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | ["Apache-2.0"] | null | null | null | nova/tests/unit/virt/test_block_device.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | ["Apache-2.0"] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'# All Rights Reserved.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License"); you may'
nl|'\n'
comment|'# not use this file except in compliance with the License. You may obtain'
nl|'\n'
comment|'# a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT'
nl|'\n'
comment|'# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the'
nl|'\n'
comment|'# License for the specific language governing permissions and limitations'
nl|'\n'
comment|'# under the License.'
nl|'\n'
nl|'\n'
name|'import'
name|'mock'
newline|'\n'
name|'from'
name|'oslo_serialization'
name|'import'
name|'jsonutils'
newline|'\n'
name|'import'
name|'six'
newline|'\n'
nl|'\n'
name|'from'
name|'nova'
name|'import'
name|'block_device'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'context'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'exception'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'objects'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'objects'
name|'import'
name|'fields'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'test'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'tests'
op|'.'
name|'unit'
name|'import'
name|'fake_block_device'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'tests'
op|'.'
name|'unit'
name|'import'
name|'fake_instance'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'tests'
op|'.'
name|'unit'
name|'import'
name|'matchers'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'virt'
name|'import'
name|'block_device'
name|'as'
name|'driver_block_device'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'virt'
name|'import'
name|'driver'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'volume'
name|'import'
name|'cinder'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'volume'
name|'import'
name|'encryptors'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|TestDriverBlockDevice
name|'class'
name|'TestDriverBlockDevice'
op|'('
name|'test'
op|'.'
name|'NoDBTestCase'
op|')'
op|':'
newline|'\n'
DECL|variable|driver_classes
indent|' '
name|'driver_classes'
op|'='
op|'{'
nl|'\n'
string|"'swap'"
op|':'
name|'driver_block_device'
op|'.'
name|'DriverSwapBlockDevice'
op|','
nl|'\n'
string|"'ephemeral'"
op|':'
name|'driver_block_device'
op|'.'
name|'DriverEphemeralBlockDevice'
op|','
nl|'\n'
string|"'volume'"
op|':'
name|'driver_block_device'
op|'.'
name|'DriverVolumeBlockDevice'
op|','
nl|'\n'
string|"'snapshot'"
op|':'
name|'driver_block_device'
op|'.'
name|'DriverSnapshotBlockDevice'
op|','
nl|'\n'
string|"'image'"
op|':'
name|'driver_block_device'
op|'.'
name|'DriverImageBlockDevice'
op|','
nl|'\n'
string|"'blank'"
op|':'
name|'driver_block_device'
op|'.'
name|'DriverBlankBlockDevice'
nl|'\n'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|swap_bdm_dict
name|'swap_bdm_dict'
op|'='
name|'block_device'
op|'.'
name|'BlockDeviceDict'
op|'('
nl|'\n'
op|'{'
string|"'id'"
op|':'
number|'1'
op|','
string|"'instance_uuid'"
op|':'
string|"'fake-instance'"
op|','
nl|'\n'
string|"'device_name'"
op|':'
string|"'/dev/sdb1'"
op|','
nl|'\n'
string|"'source_type'"
op|':'
string|"'blank'"
op|','
nl|'\n'
string|"'destination_type'"
op|':'
string|"'local'"
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|','
nl|'\n'
string|"'guest_format'"
op|':'
string|"'swap'"
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|','
nl|'\n'
string|"'volume_size'"
op|':'
number|'2'
op|','
nl|'\n'
string|"'boot_index'"
op|':'
op|'-'
number|'1'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|variable|swap_driver_bdm
name|'swap_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'device_name'"
op|':'
string|"'/dev/sdb1'"
op|','
nl|'\n'
string|"'swap_size'"
op|':'
number|'2'
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|swap_legacy_driver_bdm
name|'swap_legacy_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'device_name'"
op|':'
string|"'/dev/sdb1'"
op|','
nl|'\n'
string|"'swap_size'"
op|':'
number|'2'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|ephemeral_bdm_dict
name|'ephemeral_bdm_dict'
op|'='
name|'block_device'
op|'.'
name|'BlockDeviceDict'
op|'('
nl|'\n'
op|'{'
string|"'id'"
op|':'
number|'2'
op|','
string|"'instance_uuid'"
op|':'
string|"'fake-instance'"
op|','
nl|'\n'
string|"'device_name'"
op|':'
string|"'/dev/sdc1'"
op|','
nl|'\n'
string|"'source_type'"
op|':'
string|"'blank'"
op|','
nl|'\n'
string|"'destination_type'"
op|':'
string|"'local'"
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|','
nl|'\n'
string|"'device_type'"
op|':'
string|"'disk'"
op|','
nl|'\n'
string|"'volume_size'"
op|':'
number|'4'
op|','
nl|'\n'
string|"'guest_format'"
op|':'
string|"'ext4'"
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|','
nl|'\n'
string|"'boot_index'"
op|':'
op|'-'
number|'1'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|variable|ephemeral_driver_bdm
name|'ephemeral_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'device_name'"
op|':'
string|"'/dev/sdc1'"
op|','
nl|'\n'
string|"'size'"
op|':'
number|'4'
op|','
nl|'\n'
string|"'device_type'"
op|':'
string|"'disk'"
op|','
nl|'\n'
string|"'guest_format'"
op|':'
string|"'ext4'"
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|ephemeral_legacy_driver_bdm
name|'ephemeral_legacy_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'device_name'"
op|':'
string|"'/dev/sdc1'"
op|','
nl|'\n'
string|"'size'"
op|':'
number|'4'
op|','
nl|'\n'
string|"'virtual_name'"
op|':'
string|"'ephemeral0'"
op|','
nl|'\n'
string|"'num'"
op|':'
number|'0'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|volume_bdm_dict
name|'volume_bdm_dict'
op|'='
name|'block_device'
op|'.'
name|'BlockDeviceDict'
op|'('
nl|'\n'
op|'{'
string|"'id'"
op|':'
number|'3'
op|','
string|"'instance_uuid'"
op|':'
string|"'fake-instance'"
op|','
nl|'\n'
string|"'device_name'"
op|':'
string|"'/dev/sda1'"
op|','
nl|'\n'
string|"'source_type'"
op|':'
string|"'volume'"
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|','
nl|'\n'
string|"'device_type'"
op|':'
string|"'disk'"
op|','
nl|'\n'
string|"'volume_size'"
op|':'
number|'8'
op|','
nl|'\n'
string|"'destination_type'"
op|':'
string|"'volume'"
op|','
nl|'\n'
string|"'volume_id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'guest_format'"
op|':'
string|"'ext4'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
string|'\'{"fake": "connection_info"}\''
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'False'
op|','
nl|'\n'
string|"'boot_index'"
op|':'
number|'0'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|variable|volume_driver_bdm
name|'volume_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'mount_device'"
op|':'
string|"'/dev/sda1'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
op|'{'
string|'"fake"'
op|':'
string|'"connection_info"'
op|'}'
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'False'
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|','
nl|'\n'
string|"'device_type'"
op|':'
string|"'disk'"
op|','
nl|'\n'
string|"'guest_format'"
op|':'
string|"'ext4'"
op|','
nl|'\n'
string|"'boot_index'"
op|':'
number|'0'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|volume_legacy_driver_bdm
name|'volume_legacy_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'mount_device'"
op|':'
string|"'/dev/sda1'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
op|'{'
string|'"fake"'
op|':'
string|'"connection_info"'
op|'}'
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'False'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|snapshot_bdm_dict
name|'snapshot_bdm_dict'
op|'='
name|'block_device'
op|'.'
name|'BlockDeviceDict'
op|'('
nl|'\n'
op|'{'
string|"'id'"
op|':'
number|'4'
op|','
string|"'instance_uuid'"
op|':'
string|"'fake-instance'"
op|','
nl|'\n'
string|"'device_name'"
op|':'
string|"'/dev/sda2'"
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|','
nl|'\n'
string|"'volume_size'"
op|':'
number|'3'
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|','
nl|'\n'
string|"'device_type'"
op|':'
string|"'disk'"
op|','
nl|'\n'
string|"'source_type'"
op|':'
string|"'snapshot'"
op|','
nl|'\n'
string|"'destination_type'"
op|':'
string|"'volume'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
string|'\'{"fake": "connection_info"}\''
op|','
nl|'\n'
string|"'snapshot_id'"
op|':'
string|"'fake-snapshot-id-1'"
op|','
nl|'\n'
string|"'volume_id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'boot_index'"
op|':'
op|'-'
number|'1'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|variable|snapshot_driver_bdm
name|'snapshot_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'mount_device'"
op|':'
string|"'/dev/sda2'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
op|'{'
string|'"fake"'
op|':'
string|'"connection_info"'
op|'}'
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|','
nl|'\n'
string|"'device_type'"
op|':'
string|"'disk'"
op|','
nl|'\n'
string|"'guest_format'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'boot_index'"
op|':'
op|'-'
number|'1'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|snapshot_legacy_driver_bdm
name|'snapshot_legacy_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'mount_device'"
op|':'
string|"'/dev/sda2'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
op|'{'
string|'"fake"'
op|':'
string|'"connection_info"'
op|'}'
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|image_bdm_dict
name|'image_bdm_dict'
op|'='
name|'block_device'
op|'.'
name|'BlockDeviceDict'
op|'('
nl|'\n'
op|'{'
string|"'id'"
op|':'
number|'5'
op|','
string|"'instance_uuid'"
op|':'
string|"'fake-instance'"
op|','
nl|'\n'
string|"'device_name'"
op|':'
string|"'/dev/sda2'"
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|','
nl|'\n'
string|"'volume_size'"
op|':'
number|'1'
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|','
nl|'\n'
string|"'device_type'"
op|':'
string|"'disk'"
op|','
nl|'\n'
string|"'source_type'"
op|':'
string|"'image'"
op|','
nl|'\n'
string|"'destination_type'"
op|':'
string|"'volume'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
string|'\'{"fake": "connection_info"}\''
op|','
nl|'\n'
string|"'image_id'"
op|':'
string|"'fake-image-id-1'"
op|','
nl|'\n'
string|"'volume_id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'boot_index'"
op|':'
op|'-'
number|'1'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|variable|image_driver_bdm
name|'image_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'mount_device'"
op|':'
string|"'/dev/sda2'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
op|'{'
string|'"fake"'
op|':'
string|'"connection_info"'
op|'}'
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|','
nl|'\n'
string|"'device_type'"
op|':'
string|"'disk'"
op|','
nl|'\n'
string|"'guest_format'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'boot_index'"
op|':'
op|'-'
number|'1'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|image_legacy_driver_bdm
name|'image_legacy_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'mount_device'"
op|':'
string|"'/dev/sda2'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
op|'{'
string|'"fake"'
op|':'
string|'"connection_info"'
op|'}'
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|blank_bdm_dict
name|'blank_bdm_dict'
op|'='
name|'block_device'
op|'.'
name|'BlockDeviceDict'
op|'('
nl|'\n'
op|'{'
string|"'id'"
op|':'
number|'6'
op|','
string|"'instance_uuid'"
op|':'
string|"'fake-instance'"
op|','
nl|'\n'
string|"'device_name'"
op|':'
string|"'/dev/sda2'"
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|','
nl|'\n'
string|"'volume_size'"
op|':'
number|'3'
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|','
nl|'\n'
string|"'device_type'"
op|':'
string|"'disk'"
op|','
nl|'\n'
string|"'source_type'"
op|':'
string|"'blank'"
op|','
nl|'\n'
string|"'destination_type'"
op|':'
string|"'volume'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
string|'\'{"fake": "connection_info"}\''
op|','
nl|'\n'
string|"'snapshot_id'"
op|':'
string|"'fake-snapshot-id-1'"
op|','
nl|'\n'
string|"'volume_id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'boot_index'"
op|':'
op|'-'
number|'1'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|variable|blank_driver_bdm
name|'blank_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'mount_device'"
op|':'
string|"'/dev/sda2'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
op|'{'
string|'"fake"'
op|':'
string|'"connection_info"'
op|'}'
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|','
nl|'\n'
string|"'disk_bus'"
op|':'
string|"'scsi'"
op|','
nl|'\n'
string|"'device_type'"
op|':'
string|"'disk'"
op|','
nl|'\n'
string|"'guest_format'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'boot_index'"
op|':'
op|'-'
number|'1'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|blank_legacy_driver_bdm
name|'blank_legacy_driver_bdm'
op|'='
op|'{'
nl|'\n'
string|"'mount_device'"
op|':'
string|"'/dev/sda2'"
op|','
nl|'\n'
string|"'connection_info'"
op|':'
op|'{'
string|'"fake"'
op|':'
string|'"connection_info"'
op|'}'
op|','
nl|'\n'
string|"'delete_on_termination'"
op|':'
name|'True'
op|'}'
newline|'\n'
nl|'\n'
DECL|member|setUp
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'TestDriverBlockDevice'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'='
name|'self'
op|'.'
name|'mox'
op|'.'
name|'CreateMock'
op|'('
name|'cinder'
op|'.'
name|'API'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|'='
name|'self'
op|'.'
name|'mox'
op|'.'
name|'CreateMock'
op|'('
name|'driver'
op|'.'
name|'ComputeDriver'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'context'
op|'='
name|'context'
op|'.'
name|'RequestContext'
op|'('
string|"'fake_user'"
op|','
nl|'\n'
string|"'fake_project'"
op|')'
newline|'\n'
comment|'# create bdm objects for testing'
nl|'\n'
name|'self'
op|'.'
name|'swap_bdm'
op|'='
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'self'
op|'.'
name|'swap_bdm_dict'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'ephemeral_bdm'
op|'='
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'self'
op|'.'
name|'ephemeral_bdm_dict'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'volume_bdm'
op|'='
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'self'
op|'.'
name|'volume_bdm_dict'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'snapshot_bdm'
op|'='
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'self'
op|'.'
name|'snapshot_bdm_dict'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'image_bdm'
op|'='
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'self'
op|'.'
name|'image_bdm_dict'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'blank_bdm'
op|'='
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'self'
op|'.'
name|'blank_bdm_dict'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_no_device_raises
dedent|''
name|'def'
name|'test_no_device_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'name'
op|','
name|'cls'
name|'in'
name|'self'
op|'.'
name|'driver_classes'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'bdm'
op|'='
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
op|'{'
string|"'no_device'"
op|':'
name|'True'
op|'}'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'driver_block_device'
op|'.'
name|'_NotTransformable'
op|','
nl|'\n'
name|'cls'
op|','
name|'bdm'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_driver_device
dedent|''
dedent|''
name|'def'
name|'_test_driver_device'
op|'('
name|'self'
op|','
name|'name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_bdm'
op|'='
name|'getattr'
op|'('
name|'self'
op|','
string|'"%s_bdm"'
op|'%'
name|'name'
op|')'
newline|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
name|'name'
op|']'
op|'('
name|'db_bdm'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertThat'
op|'('
name|'test_bdm'
op|','
name|'matchers'
op|'.'
name|'DictMatches'
op|'('
nl|'\n'
name|'getattr'
op|'('
name|'self'
op|','
string|'"%s_driver_bdm"'
op|'%'
name|'name'
op|')'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'k'
op|','
name|'v'
name|'in'
name|'six'
op|'.'
name|'iteritems'
op|'('
name|'db_bdm'
op|')'
op|':'
newline|'\n'
indent|' '
name|'field_val'
op|'='
name|'getattr'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|','
name|'k'
op|')'
newline|'\n'
name|'if'
name|'isinstance'
op|'('
name|'field_val'
op|','
name|'bool'
op|')'
op|':'
newline|'\n'
indent|' '
name|'v'
op|'='
name|'bool'
op|'('
name|'v'
op|')'
newline|'\n'
dedent|''
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'field_val'
op|','
name|'v'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'self'
op|'.'
name|'assertThat'
op|'('
name|'test_bdm'
op|'.'
name|'legacy'
op|'('
op|')'
op|','
nl|'\n'
name|'matchers'
op|'.'
name|'DictMatches'
op|'('
nl|'\n'
name|'getattr'
op|'('
name|'self'
op|','
string|'"%s_legacy_driver_bdm"'
op|'%'
name|'name'
op|')'
op|')'
op|')'
newline|'\n'
nl|'\n'
comment|'# Test passthru attributes'
nl|'\n'
name|'for'
name|'passthru'
name|'in'
name|'test_bdm'
op|'.'
name|'_proxy_as_attr'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'getattr'
op|'('
name|'test_bdm'
op|','
name|'passthru'
op|')'
op|','
nl|'\n'
name|'getattr'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|','
name|'passthru'
op|')'
op|')'
newline|'\n'
nl|'\n'
comment|'# Make sure that all others raise _invalidType'
nl|'\n'
dedent|''
name|'for'
name|'other_name'
op|','
name|'cls'
name|'in'
name|'six'
op|'.'
name|'iteritems'
op|'('
name|'self'
op|'.'
name|'driver_classes'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'other_name'
op|'=='
name|'name'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'driver_block_device'
op|'.'
name|'_InvalidType'
op|','
nl|'\n'
name|'cls'
op|','
nl|'\n'
name|'getattr'
op|'('
name|'self'
op|','
string|"'%s_bdm'"
op|'%'
name|'name'
op|')'
op|')'
newline|'\n'
nl|'\n'
comment|'# Test the save method'
nl|'\n'
dedent|''
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|','
string|"'save'"
op|')'
name|'as'
name|'save_mock'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'fld'
op|','
name|'alias'
name|'in'
name|'six'
op|'.'
name|'iteritems'
op|'('
name|'test_bdm'
op|'.'
name|'_update_on_save'
op|')'
op|':'
newline|'\n'
comment|"# We can't set fake values on enums, like device_type,"
nl|'\n'
comment|'# so skip those.'
nl|'\n'
indent|' '
name|'if'
name|'not'
name|'isinstance'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'fields'
op|'['
name|'fld'
op|']'
op|','
nl|'\n'
name|'fields'
op|'.'
name|'BaseEnumField'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'['
name|'alias'
name|'or'
name|'fld'
op|']'
op|'='
string|"'fake_changed_value'"
newline|'\n'
dedent|''
dedent|''
name|'test_bdm'
op|'.'
name|'save'
op|'('
op|')'
newline|'\n'
name|'for'
name|'fld'
op|','
name|'alias'
name|'in'
name|'six'
op|'.'
name|'iteritems'
op|'('
name|'test_bdm'
op|'.'
name|'_update_on_save'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'['
name|'alias'
name|'or'
name|'fld'
op|']'
op|','
nl|'\n'
name|'getattr'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|','
name|'fld'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'save_mock'
op|'.'
name|'assert_called_once_with'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|function|check_save
dedent|''
name|'def'
name|'check_save'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'set'
op|'('
op|'['
op|']'
op|')'
op|','
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
comment|'# Test that nothing is set on the object if there are no actual changes'
nl|'\n'
dedent|''
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'obj_reset_changes'
op|'('
op|')'
newline|'\n'
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|','
string|"'save'"
op|')'
name|'as'
name|'save_mock'
op|':'
newline|'\n'
indent|' '
name|'save_mock'
op|'.'
name|'side_effect'
op|'='
name|'check_save'
newline|'\n'
name|'test_bdm'
op|'.'
name|'save'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_driver_default_size
dedent|''
dedent|''
name|'def'
name|'_test_driver_default_size'
op|'('
name|'self'
op|','
name|'name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'size'
op|'='
string|"'swap_size'"
name|'if'
name|'name'
op|'=='
string|"'swap'"
name|'else'
string|"'size'"
newline|'\n'
name|'no_size_bdm'
op|'='
name|'getattr'
op|'('
name|'self'
op|','
string|'"%s_bdm_dict"'
op|'%'
name|'name'
op|')'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'no_size_bdm'
op|'['
string|"'volume_size'"
op|']'
op|'='
name|'None'
newline|'\n'
nl|'\n'
name|'driver_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
name|'name'
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'no_size_bdm'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'driver_bdm'
op|'['
name|'size'
op|']'
op|','
number|'0'
op|')'
newline|'\n'
nl|'\n'
name|'del'
name|'no_size_bdm'
op|'['
string|"'volume_size'"
op|']'
newline|'\n'
nl|'\n'
name|'driver_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
name|'name'
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'no_size_bdm'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'driver_bdm'
op|'['
name|'size'
op|']'
op|','
number|'0'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_driver_swap_block_device
dedent|''
name|'def'
name|'test_driver_swap_block_device'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_driver_device'
op|'('
string|'"swap"'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_driver_swap_default_size
dedent|''
name|'def'
name|'test_driver_swap_default_size'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_driver_default_size'
op|'('
string|"'swap'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_driver_ephemeral_block_device
dedent|''
name|'def'
name|'test_driver_ephemeral_block_device'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_driver_device'
op|'('
string|'"ephemeral"'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_driver_ephemeral_default_size
dedent|''
name|'def'
name|'test_driver_ephemeral_default_size'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_driver_default_size'
op|'('
string|"'ephemeral'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_driver_volume_block_device
dedent|''
name|'def'
name|'test_driver_volume_block_device'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_driver_device'
op|'('
string|'"volume"'
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'['
string|"'connection_info'"
op|']'
op|','
nl|'\n'
name|'jsonutils'
op|'.'
name|'loads'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'connection_info'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'id'
op|','
number|'3'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'volume_id'
op|','
string|"'fake-volume-id-1'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'volume_size'
op|','
number|'8'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_driver_snapshot_block_device
dedent|''
name|'def'
name|'test_driver_snapshot_block_device'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_driver_device'
op|'('
string|'"snapshot"'
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'snapshot'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'snapshot_bdm'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'id'
op|','
number|'4'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'snapshot_id'
op|','
string|"'fake-snapshot-id-1'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'volume_id'
op|','
string|"'fake-volume-id-2'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'volume_size'
op|','
number|'3'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_driver_image_block_device
dedent|''
name|'def'
name|'test_driver_image_block_device'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_driver_device'
op|'('
string|"'image'"
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'image'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'image_bdm'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'id'
op|','
number|'5'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'image_id'
op|','
string|"'fake-image-id-1'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'volume_size'
op|','
number|'1'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_driver_image_block_device_destination_local
dedent|''
name|'def'
name|'test_driver_image_block_device_destination_local'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_driver_device'
op|'('
string|"'image'"
op|')'
newline|'\n'
name|'bdm'
op|'='
name|'self'
op|'.'
name|'image_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'bdm'
op|'['
string|"'destination_type'"
op|']'
op|'='
string|"'local'"
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'driver_block_device'
op|'.'
name|'_InvalidType'
op|','
nl|'\n'
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'image'"
op|']'
op|','
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'bdm'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_driver_blank_block_device
dedent|''
name|'def'
name|'test_driver_blank_block_device'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_driver_device'
op|'('
string|"'blank'"
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'blank'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'blank_bdm'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'6'
op|','
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'id'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'fake-volume-id-2'"
op|','
name|'test_bdm'
op|'.'
name|'volume_id'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'3'
op|','
name|'test_bdm'
op|'.'
name|'volume_size'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_call_wait_func
dedent|''
name|'def'
name|'_test_call_wait_func'
op|'('
name|'self'
op|','
name|'delete_on_termination'
op|','
name|'delete_fail'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'test_bdm'
op|'['
string|"'delete_on_termination'"
op|']'
op|'='
name|'delete_on_termination'
newline|'\n'
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'delete'"
op|')'
name|'as'
name|'vol_delete'
op|':'
newline|'\n'
indent|' '
name|'wait_func'
op|'='
name|'mock'
op|'.'
name|'MagicMock'
op|'('
op|')'
newline|'\n'
name|'mock_exception'
op|'='
name|'exception'
op|'.'
name|'VolumeNotCreated'
op|'('
name|'volume_id'
op|'='
string|"'fake-id'"
op|','
nl|'\n'
name|'seconds'
op|'='
number|'1'
op|','
nl|'\n'
name|'attempts'
op|'='
number|'1'
op|','
nl|'\n'
name|'volume_status'
op|'='
string|"'error'"
op|')'
newline|'\n'
name|'wait_func'
op|'.'
name|'side_effect'
op|'='
name|'mock_exception'
newline|'\n'
nl|'\n'
name|'if'
name|'delete_on_termination'
name|'and'
name|'delete_fail'
op|':'
newline|'\n'
indent|' '
name|'vol_delete'
op|'.'
name|'side_effect'
op|'='
name|'Exception'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'exception'
op|'.'
name|'VolumeNotCreated'
op|','
nl|'\n'
name|'test_bdm'
op|'.'
name|'_call_wait_func'
op|','
nl|'\n'
name|'context'
op|'='
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
name|'wait_func'
op|'='
name|'wait_func'
op|','
nl|'\n'
name|'volume_api'
op|'='
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'volume_id'
op|'='
string|"'fake-id'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'delete_on_termination'
op|','
name|'vol_delete'
op|'.'
name|'called'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_call_wait_delete_volume
dedent|''
dedent|''
name|'def'
name|'test_call_wait_delete_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_call_wait_func'
op|'('
name|'True'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_call_wait_delete_volume_fail
dedent|''
name|'def'
name|'test_call_wait_delete_volume_fail'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_call_wait_func'
op|'('
name|'True'
op|','
name|'True'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_call_wait_no_delete_volume
dedent|''
name|'def'
name|'test_call_wait_no_delete_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_call_wait_func'
op|'('
name|'False'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_volume_attach
dedent|''
name|'def'
name|'_test_volume_attach'
op|'('
name|'self'
op|','
name|'driver_bdm'
op|','
name|'bdm_dict'
op|','
nl|'\n'
name|'fake_volume'
op|','
name|'check_attach'
op|'='
name|'True'
op|','
nl|'\n'
name|'fail_check_attach'
op|'='
name|'False'
op|','
name|'driver_attach'
op|'='
name|'False'
op|','
nl|'\n'
name|'fail_driver_attach'
op|'='
name|'False'
op|','
name|'volume_attach'
op|'='
name|'True'
op|','
nl|'\n'
name|'fail_volume_attach'
op|'='
name|'False'
op|','
name|'access_mode'
op|'='
string|"'rw'"
op|','
nl|'\n'
name|'availability_zone'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'elevated_context'
op|'='
name|'self'
op|'.'
name|'context'
op|'.'
name|'elevated'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'stubs'
op|'.'
name|'Set'
op|'('
name|'self'
op|'.'
name|'context'
op|','
string|"'elevated'"
op|','
nl|'\n'
name|'lambda'
op|':'
name|'elevated_context'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'driver_bdm'
op|'.'
name|'_bdm_obj'
op|','
string|"'save'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'encryptors'
op|','
string|"'get_encryption_metadata'"
op|')'
newline|'\n'
name|'instance_detail'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'123'"
op|','
string|"'uuid'"
op|':'
string|"'fake_uuid'"
op|','
nl|'\n'
string|"'availability_zone'"
op|':'
name|'availability_zone'
op|'}'
newline|'\n'
name|'instance'
op|'='
name|'fake_instance'
op|'.'
name|'fake_instance_obj'
op|'('
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
op|'**'
name|'instance_detail'
op|')'
newline|'\n'
name|'connector'
op|'='
op|'{'
string|"'ip'"
op|':'
string|"'fake_ip'"
op|','
string|"'host'"
op|':'
string|"'fake_host'"
op|'}'
newline|'\n'
name|'connection_info'
op|'='
op|'{'
string|"'data'"
op|':'
op|'{'
string|"'access_mode'"
op|':'
name|'access_mode'
op|'}'
op|'}'
newline|'\n'
name|'expected_conn_info'
op|'='
op|'{'
string|"'data'"
op|':'
op|'{'
string|"'access_mode'"
op|':'
name|'access_mode'
op|'}'
op|','
nl|'\n'
string|"'serial'"
op|':'
name|'fake_volume'
op|'['
string|"'id'"
op|']'
op|'}'
newline|'\n'
name|'enc_data'
op|'='
op|'{'
string|"'fake'"
op|':'
string|"'enc_data'"
op|'}'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'get'
op|'('
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
name|'fake_volume'
op|'['
string|"'id'"
op|']'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'fake_volume'
op|')'
newline|'\n'
name|'if'
name|'check_attach'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'fail_check_attach'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'check_attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'fake_volume'
op|','
nl|'\n'
name|'instance'
op|'='
name|'instance'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'check_attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'fake_volume'
op|','
nl|'\n'
name|'instance'
op|'='
name|'instance'
op|')'
op|'.'
name|'AndRaise'
op|'('
nl|'\n'
name|'test'
op|'.'
name|'TestingException'
op|')'
newline|'\n'
name|'driver_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'save'
op|'('
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'return'
name|'instance'
op|','
name|'expected_conn_info'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'self'
op|'.'
name|'virt_driver'
op|'.'
name|'get_volume_connector'
op|'('
name|'instance'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'connector'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'initialize_connection'
op|'('
nl|'\n'
name|'elevated_context'
op|','
name|'fake_volume'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'connector'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'connection_info'
op|')'
newline|'\n'
name|'if'
name|'driver_attach'
op|':'
newline|'\n'
indent|' '
name|'encryptors'
op|'.'
name|'get_encryption_metadata'
op|'('
nl|'\n'
name|'elevated_context'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
name|'fake_volume'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'connection_info'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'enc_data'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'fail_driver_attach'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'virt_driver'
op|'.'
name|'attach_volume'
op|'('
nl|'\n'
name|'elevated_context'
op|','
name|'expected_conn_info'
op|','
name|'instance'
op|','
nl|'\n'
name|'bdm_dict'
op|'['
string|"'device_name'"
op|']'
op|','
nl|'\n'
name|'disk_bus'
op|'='
name|'bdm_dict'
op|'['
string|"'disk_bus'"
op|']'
op|','
nl|'\n'
name|'device_type'
op|'='
name|'bdm_dict'
op|'['
string|"'device_type'"
op|']'
op|','
nl|'\n'
name|'encryption'
op|'='
name|'enc_data'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'virt_driver'
op|'.'
name|'attach_volume'
op|'('
nl|'\n'
name|'elevated_context'
op|','
name|'expected_conn_info'
op|','
name|'instance'
op|','
nl|'\n'
name|'bdm_dict'
op|'['
string|"'device_name'"
op|']'
op|','
nl|'\n'
name|'disk_bus'
op|'='
name|'bdm_dict'
op|'['
string|"'disk_bus'"
op|']'
op|','
nl|'\n'
name|'device_type'
op|'='
name|'bdm_dict'
op|'['
string|"'device_type'"
op|']'
op|','
nl|'\n'
name|'encryption'
op|'='
name|'enc_data'
op|')'
op|'.'
name|'AndRaise'
op|'('
name|'test'
op|'.'
name|'TestingException'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'terminate_connection'
op|'('
nl|'\n'
name|'elevated_context'
op|','
name|'fake_volume'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'connector'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'driver_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'save'
op|'('
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'return'
name|'instance'
op|','
name|'expected_conn_info'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'if'
name|'volume_attach'
op|':'
newline|'\n'
indent|' '
name|'driver_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'save'
op|'('
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'fail_volume_attach'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'attach'
op|'('
name|'elevated_context'
op|','
name|'fake_volume'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
string|"'fake_uuid'"
op|','
name|'bdm_dict'
op|'['
string|"'device_name'"
op|']'
op|','
nl|'\n'
name|'mode'
op|'='
name|'access_mode'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'attach'
op|'('
name|'elevated_context'
op|','
name|'fake_volume'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
string|"'fake_uuid'"
op|','
name|'bdm_dict'
op|'['
string|"'device_name'"
op|']'
op|','
nl|'\n'
name|'mode'
op|'='
name|'access_mode'
op|')'
op|'.'
name|'AndRaise'
op|'('
nl|'\n'
name|'test'
op|'.'
name|'TestingException'
op|')'
newline|'\n'
name|'if'
name|'driver_attach'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'virt_driver'
op|'.'
name|'detach_volume'
op|'('
nl|'\n'
name|'expected_conn_info'
op|','
name|'instance'
op|','
nl|'\n'
name|'bdm_dict'
op|'['
string|"'device_name'"
op|']'
op|','
nl|'\n'
name|'encryption'
op|'='
name|'enc_data'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
dedent|''
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'terminate_connection'
op|'('
nl|'\n'
name|'elevated_context'
op|','
name|'fake_volume'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'connector'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'detach'
op|'('
name|'elevated_context'
op|','
nl|'\n'
name|'fake_volume'
op|'['
string|"'id'"
op|']'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'driver_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'save'
op|'('
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'return'
name|'instance'
op|','
name|'expected_conn_info'
newline|'\n'
nl|'\n'
DECL|member|test_volume_attach
dedent|''
name|'def'
name|'test_volume_attach'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|','
name|'expected_conn_info'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'volume'
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|','
name|'self'
op|'.'
name|'virt_driver'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertThat'
op|'('
name|'test_bdm'
op|'['
string|"'connection_info'"
op|']'
op|','
nl|'\n'
name|'matchers'
op|'.'
name|'DictMatches'
op|'('
name|'expected_conn_info'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_volume_attach_ro
dedent|''
name|'def'
name|'test_volume_attach_ro'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|','
name|'expected_conn_info'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'volume'
op|','
name|'access_mode'
op|'='
string|"'ro'"
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|','
name|'self'
op|'.'
name|'virt_driver'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertThat'
op|'('
name|'test_bdm'
op|'['
string|"'connection_info'"
op|']'
op|','
nl|'\n'
name|'matchers'
op|'.'
name|'DictMatches'
op|'('
name|'expected_conn_info'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_volume_attach_update_size
dedent|''
name|'def'
name|'test_volume_attach_update_size'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'test_bdm'
op|'.'
name|'volume_size'
op|'='
name|'None'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|','
nl|'\n'
string|"'size'"
op|':'
number|'42'
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|','
name|'expected_conn_info'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'volume'
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|','
name|'self'
op|'.'
name|'virt_driver'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'expected_conn_info'
op|','
name|'test_bdm'
op|'['
string|"'connection_info'"
op|']'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'42'
op|','
name|'test_bdm'
op|'.'
name|'volume_size'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_volume_attach_check_attach_fails
dedent|''
name|'def'
name|'test_volume_attach_check_attach_fails'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|','
name|'_'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'volume'
op|','
name|'fail_check_attach'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'test'
op|'.'
name|'TestingException'
op|','
name|'test_bdm'
op|'.'
name|'attach'
op|','
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
name|'self'
op|'.'
name|'virt_driver'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_volume_no_volume_attach
dedent|''
name|'def'
name|'test_volume_no_volume_attach'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|','
name|'expected_conn_info'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'volume'
op|','
name|'check_attach'
op|'='
name|'False'
op|','
nl|'\n'
name|'driver_attach'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|','
name|'self'
op|'.'
name|'virt_driver'
op|','
nl|'\n'
name|'do_check_attach'
op|'='
name|'False'
op|','
name|'do_driver_attach'
op|'='
name|'False'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertThat'
op|'('
name|'test_bdm'
op|'['
string|"'connection_info'"
op|']'
op|','
nl|'\n'
name|'matchers'
op|'.'
name|'DictMatches'
op|'('
name|'expected_conn_info'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_volume_attach_no_check_driver_attach
dedent|''
name|'def'
name|'test_volume_attach_no_check_driver_attach'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|','
name|'expected_conn_info'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'volume'
op|','
name|'check_attach'
op|'='
name|'False'
op|','
nl|'\n'
name|'driver_attach'
op|'='
name|'True'
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|','
name|'self'
op|'.'
name|'virt_driver'
op|','
nl|'\n'
name|'do_check_attach'
op|'='
name|'False'
op|','
name|'do_driver_attach'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertThat'
op|'('
name|'test_bdm'
op|'['
string|"'connection_info'"
op|']'
op|','
nl|'\n'
name|'matchers'
op|'.'
name|'DictMatches'
op|'('
name|'expected_conn_info'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_volume_attach_driver_attach_fails
dedent|''
name|'def'
name|'test_volume_attach_driver_attach_fails'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|','
name|'_'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'volume'
op|','
name|'driver_attach'
op|'='
name|'True'
op|','
nl|'\n'
name|'fail_driver_attach'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'test'
op|'.'
name|'TestingException'
op|','
name|'test_bdm'
op|'.'
name|'attach'
op|','
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
name|'self'
op|'.'
name|'virt_driver'
op|','
nl|'\n'
name|'do_driver_attach'
op|'='
name|'True'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_volume_attach_volume_attach_fails
dedent|''
name|'def'
name|'test_volume_attach_volume_attach_fails'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|','
name|'_'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'volume'
op|','
name|'driver_attach'
op|'='
name|'True'
op|','
nl|'\n'
name|'fail_volume_attach'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'test'
op|'.'
name|'TestingException'
op|','
name|'test_bdm'
op|'.'
name|'attach'
op|','
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
name|'self'
op|'.'
name|'virt_driver'
op|','
nl|'\n'
name|'do_driver_attach'
op|'='
name|'True'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_volume_attach_no_driver_attach_volume_attach_fails
dedent|''
name|'def'
name|'test_volume_attach_no_driver_attach_volume_attach_fails'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|','
name|'_'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'volume'
op|','
name|'fail_volume_attach'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'test'
op|'.'
name|'TestingException'
op|','
name|'test_bdm'
op|'.'
name|'attach'
op|','
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
name|'self'
op|'.'
name|'virt_driver'
op|','
nl|'\n'
name|'do_driver_attach'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_refresh_connection
dedent|''
name|'def'
name|'test_refresh_connection'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'snapshot'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'snapshot_bdm'
op|')'
newline|'\n'
nl|'\n'
name|'instance'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake_id'"
op|','
string|"'uuid'"
op|':'
string|"'fake_uuid'"
op|'}'
newline|'\n'
name|'connector'
op|'='
op|'{'
string|"'ip'"
op|':'
string|"'fake_ip'"
op|','
string|"'host'"
op|':'
string|"'fake_host'"
op|'}'
newline|'\n'
name|'connection_info'
op|'='
op|'{'
string|"'data'"
op|':'
op|'{'
string|"'multipath_id'"
op|':'
string|"'fake_multipath_id'"
op|'}'
op|'}'
newline|'\n'
name|'expected_conn_info'
op|'='
op|'{'
string|"'data'"
op|':'
op|'{'
string|"'multipath_id'"
op|':'
string|"'fake_multipath_id'"
op|'}'
op|','
nl|'\n'
string|"'serial'"
op|':'
string|"'fake-volume-id-2'"
op|'}'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|','
string|"'save'"
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|'.'
name|'get_volume_connector'
op|'('
name|'instance'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'connector'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'initialize_connection'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'test_bdm'
op|'.'
name|'volume_id'
op|','
nl|'\n'
name|'connector'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'connection_info'
op|')'
newline|'\n'
name|'test_bdm'
op|'.'
name|'_bdm_obj'
op|'.'
name|'save'
op|'('
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'refresh_connection_info'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|','
name|'self'
op|'.'
name|'virt_driver'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertThat'
op|'('
name|'test_bdm'
op|'['
string|"'connection_info'"
op|']'
op|','
nl|'\n'
name|'matchers'
op|'.'
name|'DictMatches'
op|'('
name|'expected_conn_info'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_snapshot_attach_no_volume
dedent|''
name|'def'
name|'test_snapshot_attach_no_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'no_volume_snapshot'
op|'='
name|'self'
op|'.'
name|'snapshot_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'no_volume_snapshot'
op|'['
string|"'volume_id'"
op|']'
op|'='
name|'None'
newline|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'snapshot'"
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'no_volume_snapshot'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'snapshot'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'wait_func'
op|'='
name|'self'
op|'.'
name|'mox'
op|'.'
name|'CreateMockAnything'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'get_snapshot'
op|'('
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
string|"'fake-snapshot-id-1'"
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'snapshot'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'create'
op|'('
name|'self'
op|'.'
name|'context'
op|','
number|'3'
op|','
string|"''"
op|','
string|"''"
op|','
name|'snapshot'
op|','
nl|'\n'
name|'availability_zone'
op|'='
name|'None'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'volume'
op|')'
newline|'\n'
name|'wait_func'
op|'('
name|'self'
op|'.'
name|'context'
op|','
string|"'fake-volume-id-2'"
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'instance'
op|','
name|'expected_conn_info'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'no_volume_snapshot'
op|','
name|'volume'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|','
name|'wait_func'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'volume_id'
op|','
string|"'fake-volume-id-2'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_snapshot_attach_no_volume_cinder_cross_az_attach_false
dedent|''
name|'def'
name|'test_snapshot_attach_no_volume_cinder_cross_az_attach_false'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Tests that the volume created from the snapshot has the same AZ as'
nl|'\n'
comment|'# the instance.'
nl|'\n'
indent|' '
name|'self'
op|'.'
name|'flags'
op|'('
name|'cross_az_attach'
op|'='
name|'False'
op|','
name|'group'
op|'='
string|"'cinder'"
op|')'
newline|'\n'
name|'no_volume_snapshot'
op|'='
name|'self'
op|'.'
name|'snapshot_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'no_volume_snapshot'
op|'['
string|"'volume_id'"
op|']'
op|'='
name|'None'
newline|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'snapshot'"
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'no_volume_snapshot'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'snapshot'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'wait_func'
op|'='
name|'self'
op|'.'
name|'mox'
op|'.'
name|'CreateMockAnything'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'get_snapshot'
op|'('
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
string|"'fake-snapshot-id-1'"
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'snapshot'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'create'
op|'('
name|'self'
op|'.'
name|'context'
op|','
number|'3'
op|','
string|"''"
op|','
string|"''"
op|','
name|'snapshot'
op|','
nl|'\n'
name|'availability_zone'
op|'='
string|"'test-az'"
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'volume'
op|')'
newline|'\n'
name|'wait_func'
op|'('
name|'self'
op|'.'
name|'context'
op|','
string|"'fake-volume-id-2'"
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'instance'
op|','
name|'expected_conn_info'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'no_volume_snapshot'
op|','
name|'volume'
op|','
nl|'\n'
name|'availability_zone'
op|'='
string|"'test-az'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|','
name|'wait_func'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'fake-volume-id-2'"
op|','
name|'test_bdm'
op|'.'
name|'volume_id'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_snapshot_attach_fail_volume
dedent|''
name|'def'
name|'test_snapshot_attach_fail_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'fail_volume_snapshot'
op|'='
name|'self'
op|'.'
name|'snapshot_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'fail_volume_snapshot'
op|'['
string|"'volume_id'"
op|']'
op|'='
name|'None'
newline|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'snapshot'"
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'fail_volume_snapshot'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'snapshot'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-1'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|'='
name|'fake_instance'
op|'.'
name|'fake_instance_obj'
op|'('
name|'mock'
op|'.'
name|'sentinel'
op|'.'
name|'ctx'
op|','
nl|'\n'
op|'**'
op|'{'
string|"'uuid'"
op|':'
string|"'fake-uuid'"
op|'}'
op|')'
newline|'\n'
name|'with'
name|'test'
op|'.'
name|'nested'
op|'('
nl|'\n'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'get_snapshot'"
op|','
nl|'\n'
name|'return_value'
op|'='
name|'snapshot'
op|')'
op|','
nl|'\n'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'create'"
op|','
name|'return_value'
op|'='
name|'volume'
op|')'
op|','
nl|'\n'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'delete'"
op|')'
op|','
nl|'\n'
op|')'
name|'as'
op|'('
name|'vol_get_snap'
op|','
name|'vol_create'
op|','
name|'vol_delete'
op|')'
op|':'
newline|'\n'
indent|' '
name|'wait_func'
op|'='
name|'mock'
op|'.'
name|'MagicMock'
op|'('
op|')'
newline|'\n'
name|'mock_exception'
op|'='
name|'exception'
op|'.'
name|'VolumeNotCreated'
op|'('
name|'volume_id'
op|'='
name|'volume'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'seconds'
op|'='
number|'1'
op|','
nl|'\n'
name|'attempts'
op|'='
number|'1'
op|','
nl|'\n'
name|'volume_status'
op|'='
string|"'error'"
op|')'
newline|'\n'
name|'wait_func'
op|'.'
name|'side_effect'
op|'='
name|'mock_exception'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'exception'
op|'.'
name|'VolumeNotCreated'
op|','
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|','
name|'context'
op|'='
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
name|'instance'
op|'='
name|'instance'
op|','
nl|'\n'
name|'volume_api'
op|'='
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'virt_driver'
op|'='
name|'self'
op|'.'
name|'virt_driver'
op|','
nl|'\n'
name|'wait_func'
op|'='
name|'wait_func'
op|')'
newline|'\n'
nl|'\n'
name|'vol_get_snap'
op|'.'
name|'assert_called_once_with'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
string|"'fake-snapshot-id-1'"
op|')'
newline|'\n'
name|'vol_create'
op|'.'
name|'assert_called_once_with'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
number|'3'
op|','
string|"''"
op|','
string|"''"
op|','
name|'snapshot'
op|','
name|'availability_zone'
op|'='
name|'None'
op|')'
newline|'\n'
name|'vol_delete'
op|'.'
name|'assert_called_once_with'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'volume'
op|'['
string|"'id'"
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_snapshot_attach_volume
dedent|''
dedent|''
name|'def'
name|'test_snapshot_attach_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'snapshot'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'snapshot_bdm'
op|')'
newline|'\n'
nl|'\n'
name|'instance'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake_id'"
op|','
string|"'uuid'"
op|':'
string|"'fake_uuid'"
op|'}'
newline|'\n'
nl|'\n'
name|'volume_class'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'volume_class'
op|','
string|"'attach'"
op|')'
newline|'\n'
nl|'\n'
comment|'# Make sure these are not called'
nl|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'get_snapshot'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'create'"
op|')'
newline|'\n'
nl|'\n'
name|'volume_class'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|','
name|'do_check_attach'
op|'='
name|'True'
nl|'\n'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'volume_id'
op|','
string|"'fake-volume-id-2'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_image_attach_no_volume
dedent|''
name|'def'
name|'test_image_attach_no_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'no_volume_image'
op|'='
name|'self'
op|'.'
name|'image_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'no_volume_image'
op|'['
string|"'volume_id'"
op|']'
op|'='
name|'None'
newline|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'image'"
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'no_volume_image'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'image'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-image-id-1'"
op|'}'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'wait_func'
op|'='
name|'self'
op|'.'
name|'mox'
op|'.'
name|'CreateMockAnything'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'create'
op|'('
name|'self'
op|'.'
name|'context'
op|','
number|'1'
op|','
string|"''"
op|','
string|"''"
op|','
name|'image_id'
op|'='
name|'image'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'availability_zone'
op|'='
name|'None'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'volume'
op|')'
newline|'\n'
name|'wait_func'
op|'('
name|'self'
op|'.'
name|'context'
op|','
string|"'fake-volume-id-2'"
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'instance'
op|','
name|'expected_conn_info'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'no_volume_image'
op|','
name|'volume'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|','
name|'wait_func'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'volume_id'
op|','
string|"'fake-volume-id-2'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_image_attach_no_volume_cinder_cross_az_attach_false
dedent|''
name|'def'
name|'test_image_attach_no_volume_cinder_cross_az_attach_false'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Tests that the volume created from the image has the same AZ as the'
nl|'\n'
comment|'# instance.'
nl|'\n'
indent|' '
name|'self'
op|'.'
name|'flags'
op|'('
name|'cross_az_attach'
op|'='
name|'False'
op|','
name|'group'
op|'='
string|"'cinder'"
op|')'
newline|'\n'
name|'no_volume_image'
op|'='
name|'self'
op|'.'
name|'image_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'no_volume_image'
op|'['
string|"'volume_id'"
op|']'
op|'='
name|'None'
newline|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'image'"
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'no_volume_image'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'image'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-image-id-1'"
op|'}'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'wait_func'
op|'='
name|'self'
op|'.'
name|'mox'
op|'.'
name|'CreateMockAnything'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|'.'
name|'create'
op|'('
name|'self'
op|'.'
name|'context'
op|','
number|'1'
op|','
string|"''"
op|','
string|"''"
op|','
name|'image_id'
op|'='
name|'image'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'availability_zone'
op|'='
string|"'test-az'"
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'volume'
op|')'
newline|'\n'
name|'wait_func'
op|'('
name|'self'
op|'.'
name|'context'
op|','
string|"'fake-volume-id-2'"
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'instance'
op|','
name|'expected_conn_info'
op|'='
name|'self'
op|'.'
name|'_test_volume_attach'
op|'('
nl|'\n'
name|'test_bdm'
op|','
name|'no_volume_image'
op|','
name|'volume'
op|','
nl|'\n'
name|'availability_zone'
op|'='
string|"'test-az'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|','
name|'wait_func'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'fake-volume-id-2'"
op|','
name|'test_bdm'
op|'.'
name|'volume_id'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_image_attach_fail_volume
dedent|''
name|'def'
name|'test_image_attach_fail_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'fail_volume_image'
op|'='
name|'self'
op|'.'
name|'image_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'fail_volume_image'
op|'['
string|"'volume_id'"
op|']'
op|'='
name|'None'
newline|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'image'"
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'fail_volume_image'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'image'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-image-id-1'"
op|'}'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'attach_status'"
op|':'
string|"'detached'"
op|'}'
newline|'\n'
nl|'\n'
name|'instance'
op|'='
name|'fake_instance'
op|'.'
name|'fake_instance_obj'
op|'('
name|'mock'
op|'.'
name|'sentinel'
op|'.'
name|'ctx'
op|','
nl|'\n'
op|'**'
op|'{'
string|"'uuid'"
op|':'
string|"'fake-uuid'"
op|'}'
op|')'
newline|'\n'
name|'with'
name|'test'
op|'.'
name|'nested'
op|'('
nl|'\n'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'create'"
op|','
name|'return_value'
op|'='
name|'volume'
op|')'
op|','
nl|'\n'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'delete'"
op|')'
op|','
nl|'\n'
op|')'
name|'as'
op|'('
name|'vol_create'
op|','
name|'vol_delete'
op|')'
op|':'
newline|'\n'
indent|' '
name|'wait_func'
op|'='
name|'mock'
op|'.'
name|'MagicMock'
op|'('
op|')'
newline|'\n'
name|'mock_exception'
op|'='
name|'exception'
op|'.'
name|'VolumeNotCreated'
op|'('
name|'volume_id'
op|'='
name|'volume'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'seconds'
op|'='
number|'1'
op|','
nl|'\n'
name|'attempts'
op|'='
number|'1'
op|','
nl|'\n'
name|'volume_status'
op|'='
string|"'error'"
op|')'
newline|'\n'
name|'wait_func'
op|'.'
name|'side_effect'
op|'='
name|'mock_exception'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'exception'
op|'.'
name|'VolumeNotCreated'
op|','
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|','
name|'context'
op|'='
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
name|'instance'
op|'='
name|'instance'
op|','
nl|'\n'
name|'volume_api'
op|'='
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'virt_driver'
op|'='
name|'self'
op|'.'
name|'virt_driver'
op|','
nl|'\n'
name|'wait_func'
op|'='
name|'wait_func'
op|')'
newline|'\n'
nl|'\n'
name|'vol_create'
op|'.'
name|'assert_called_once_with'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
number|'1'
op|','
string|"''"
op|','
string|"''"
op|','
name|'image_id'
op|'='
name|'image'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'availability_zone'
op|'='
name|'None'
op|')'
newline|'\n'
name|'vol_delete'
op|'.'
name|'assert_called_once_with'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'volume'
op|'['
string|"'id'"
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_image_attach_volume
dedent|''
dedent|''
name|'def'
name|'test_image_attach_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'image'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'image_bdm'
op|')'
newline|'\n'
nl|'\n'
name|'instance'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake_id'"
op|','
string|"'uuid'"
op|':'
string|"'fake_uuid'"
op|'}'
newline|'\n'
nl|'\n'
name|'volume_class'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'volume_class'
op|','
string|"'attach'"
op|')'
newline|'\n'
nl|'\n'
comment|'# Make sure these are not called'
nl|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'get_snapshot'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'StubOutWithMock'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'create'"
op|')'
newline|'\n'
nl|'\n'
name|'volume_class'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|','
name|'do_check_attach'
op|'='
name|'True'
nl|'\n'
op|')'
op|'.'
name|'AndReturn'
op|'('
name|'None'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'mox'
op|'.'
name|'ReplayAll'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'test_bdm'
op|'.'
name|'volume_id'
op|','
string|"'fake-volume-id-2'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_blank_attach_fail_volume
dedent|''
name|'def'
name|'test_blank_attach_fail_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'no_blank_volume'
op|'='
name|'self'
op|'.'
name|'blank_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'no_blank_volume'
op|'['
string|"'volume_id'"
op|']'
op|'='
name|'None'
newline|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'blank'"
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'no_blank_volume'
op|')'
op|')'
newline|'\n'
name|'instance'
op|'='
name|'fake_instance'
op|'.'
name|'fake_instance_obj'
op|'('
name|'mock'
op|'.'
name|'sentinel'
op|'.'
name|'ctx'
op|','
nl|'\n'
op|'**'
op|'{'
string|"'uuid'"
op|':'
string|"'fake-uuid'"
op|'}'
op|')'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'display_name'"
op|':'
string|"'fake-uuid-blank-vol'"
op|'}'
newline|'\n'
nl|'\n'
name|'with'
name|'test'
op|'.'
name|'nested'
op|'('
nl|'\n'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'create'"
op|','
name|'return_value'
op|'='
name|'volume'
op|')'
op|','
nl|'\n'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'delete'"
op|')'
op|','
nl|'\n'
op|')'
name|'as'
op|'('
name|'vol_create'
op|','
name|'vol_delete'
op|')'
op|':'
newline|'\n'
indent|' '
name|'wait_func'
op|'='
name|'mock'
op|'.'
name|'MagicMock'
op|'('
op|')'
newline|'\n'
name|'mock_exception'
op|'='
name|'exception'
op|'.'
name|'VolumeNotCreated'
op|'('
name|'volume_id'
op|'='
name|'volume'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'seconds'
op|'='
number|'1'
op|','
nl|'\n'
name|'attempts'
op|'='
number|'1'
op|','
nl|'\n'
name|'volume_status'
op|'='
string|"'error'"
op|')'
newline|'\n'
name|'wait_func'
op|'.'
name|'side_effect'
op|'='
name|'mock_exception'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'exception'
op|'.'
name|'VolumeNotCreated'
op|','
nl|'\n'
name|'test_bdm'
op|'.'
name|'attach'
op|','
name|'context'
op|'='
name|'self'
op|'.'
name|'context'
op|','
nl|'\n'
name|'instance'
op|'='
name|'instance'
op|','
nl|'\n'
name|'volume_api'
op|'='
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'virt_driver'
op|'='
name|'self'
op|'.'
name|'virt_driver'
op|','
nl|'\n'
name|'wait_func'
op|'='
name|'wait_func'
op|')'
newline|'\n'
nl|'\n'
name|'vol_create'
op|'.'
name|'assert_called_once_with'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'test_bdm'
op|'.'
name|'volume_size'
op|','
string|"'fake-uuid-blank-vol'"
op|','
nl|'\n'
string|"''"
op|','
name|'availability_zone'
op|'='
name|'None'
op|')'
newline|'\n'
name|'vol_delete'
op|'.'
name|'assert_called_once_with'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'volume'
op|'['
string|"'id'"
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_blank_attach_volume
dedent|''
dedent|''
name|'def'
name|'test_blank_attach_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'no_blank_volume'
op|'='
name|'self'
op|'.'
name|'blank_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'no_blank_volume'
op|'['
string|"'volume_id'"
op|']'
op|'='
name|'None'
newline|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'blank'"
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'no_blank_volume'
op|')'
op|')'
newline|'\n'
name|'instance'
op|'='
name|'fake_instance'
op|'.'
name|'fake_instance_obj'
op|'('
name|'mock'
op|'.'
name|'sentinel'
op|'.'
name|'ctx'
op|','
nl|'\n'
op|'**'
op|'{'
string|"'uuid'"
op|':'
string|"'fake-uuid'"
op|'}'
op|')'
newline|'\n'
name|'volume_class'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'display_name'"
op|':'
string|"'fake-uuid-blank-vol'"
op|'}'
newline|'\n'
nl|'\n'
name|'with'
name|'test'
op|'.'
name|'nested'
op|'('
nl|'\n'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'create'"
op|','
name|'return_value'
op|'='
name|'volume'
op|')'
op|','
nl|'\n'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'volume_class'
op|','
string|"'attach'"
op|')'
nl|'\n'
op|')'
name|'as'
op|'('
name|'vol_create'
op|','
name|'vol_attach'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|')'
newline|'\n'
nl|'\n'
name|'vol_create'
op|'.'
name|'assert_called_once_with'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'test_bdm'
op|'.'
name|'volume_size'
op|','
string|"'fake-uuid-blank-vol'"
op|','
nl|'\n'
string|"''"
op|','
name|'availability_zone'
op|'='
name|'None'
op|')'
newline|'\n'
name|'vol_attach'
op|'.'
name|'assert_called_once_with'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|','
nl|'\n'
name|'do_check_attach'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'fake-volume-id-2'"
op|','
name|'test_bdm'
op|'.'
name|'volume_id'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_blank_attach_volume_cinder_cross_az_attach_false
dedent|''
dedent|''
name|'def'
name|'test_blank_attach_volume_cinder_cross_az_attach_false'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Tests that the blank volume created is in the same availability zone'
nl|'\n'
comment|'# as the instance.'
nl|'\n'
indent|' '
name|'self'
op|'.'
name|'flags'
op|'('
name|'cross_az_attach'
op|'='
name|'False'
op|','
name|'group'
op|'='
string|"'cinder'"
op|')'
newline|'\n'
name|'no_blank_volume'
op|'='
name|'self'
op|'.'
name|'blank_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'no_blank_volume'
op|'['
string|"'volume_id'"
op|']'
op|'='
name|'None'
newline|'\n'
name|'test_bdm'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'blank'"
op|']'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'no_blank_volume'
op|')'
op|')'
newline|'\n'
name|'updates'
op|'='
op|'{'
string|"'uuid'"
op|':'
string|"'fake-uuid'"
op|','
string|"'availability_zone'"
op|':'
string|"'test-az'"
op|'}'
newline|'\n'
name|'instance'
op|'='
name|'fake_instance'
op|'.'
name|'fake_instance_obj'
op|'('
name|'mock'
op|'.'
name|'sentinel'
op|'.'
name|'ctx'
op|','
nl|'\n'
op|'**'
name|'updates'
op|')'
newline|'\n'
name|'volume_class'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
newline|'\n'
name|'volume'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake-volume-id-2'"
op|','
nl|'\n'
string|"'display_name'"
op|':'
string|"'fake-uuid-blank-vol'"
op|'}'
newline|'\n'
nl|'\n'
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'volume_api'
op|','
string|"'create'"
op|','
nl|'\n'
name|'return_value'
op|'='
name|'volume'
op|')'
name|'as'
name|'vol_create'
op|':'
newline|'\n'
indent|' '
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'volume_class'
op|','
string|"'attach'"
op|')'
name|'as'
name|'vol_attach'
op|':'
newline|'\n'
indent|' '
name|'test_bdm'
op|'.'
name|'attach'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|')'
newline|'\n'
nl|'\n'
name|'vol_create'
op|'.'
name|'assert_called_once_with'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'context'
op|','
name|'test_bdm'
op|'.'
name|'volume_size'
op|','
string|"'fake-uuid-blank-vol'"
op|','
nl|'\n'
string|"''"
op|','
name|'availability_zone'
op|'='
string|"'test-az'"
op|')'
newline|'\n'
name|'vol_attach'
op|'.'
name|'assert_called_once_with'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'instance'
op|','
nl|'\n'
name|'self'
op|'.'
name|'volume_api'
op|','
nl|'\n'
name|'self'
op|'.'
name|'virt_driver'
op|','
nl|'\n'
name|'do_check_attach'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'fake-volume-id-2'"
op|','
name|'test_bdm'
op|'.'
name|'volume_id'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_convert_block_devices
dedent|''
dedent|''
dedent|''
name|'def'
name|'test_convert_block_devices'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'bdms'
op|'='
name|'objects'
op|'.'
name|'BlockDeviceMappingList'
op|'('
nl|'\n'
name|'objects'
op|'='
op|'['
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'self'
op|'.'
name|'ephemeral_bdm'
op|']'
op|')'
newline|'\n'
name|'converted'
op|'='
name|'driver_block_device'
op|'.'
name|'_convert_block_devices'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|','
name|'bdms'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'converted'
op|','
op|'['
name|'self'
op|'.'
name|'volume_driver_bdm'
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_convert_all_volumes
dedent|''
name|'def'
name|'test_convert_all_volumes'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'converted'
op|'='
name|'driver_block_device'
op|'.'
name|'convert_all_volumes'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
op|'['
op|']'
op|','
name|'converted'
op|')'
newline|'\n'
nl|'\n'
name|'converted'
op|'='
name|'driver_block_device'
op|'.'
name|'convert_all_volumes'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'self'
op|'.'
name|'ephemeral_bdm'
op|','
name|'self'
op|'.'
name|'image_bdm'
op|','
nl|'\n'
name|'self'
op|'.'
name|'blank_bdm'
op|','
name|'self'
op|'.'
name|'snapshot_bdm'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'converted'
op|','
op|'['
name|'self'
op|'.'
name|'volume_driver_bdm'
op|','
nl|'\n'
name|'self'
op|'.'
name|'image_driver_bdm'
op|','
nl|'\n'
name|'self'
op|'.'
name|'blank_driver_bdm'
op|','
nl|'\n'
name|'self'
op|'.'
name|'snapshot_driver_bdm'
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_convert_volume
dedent|''
name|'def'
name|'test_convert_volume'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertIsNone'
op|'('
name|'driver_block_device'
op|'.'
name|'convert_volume'
op|'('
name|'self'
op|'.'
name|'swap_bdm'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'self'
op|'.'
name|'volume_driver_bdm'
op|','
nl|'\n'
name|'driver_block_device'
op|'.'
name|'convert_volume'
op|'('
name|'self'
op|'.'
name|'volume_bdm'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'self'
op|'.'
name|'snapshot_driver_bdm'
op|','
nl|'\n'
name|'driver_block_device'
op|'.'
name|'convert_volume'
op|'('
name|'self'
op|'.'
name|'snapshot_bdm'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_legacy_block_devices
dedent|''
name|'def'
name|'test_legacy_block_devices'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_snapshot'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'snapshot'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'snapshot_bdm'
op|')'
newline|'\n'
nl|'\n'
name|'block_device_mapping'
op|'='
op|'['
name|'test_snapshot'
op|','
name|'test_snapshot'
op|']'
newline|'\n'
name|'legacy_bdm'
op|'='
name|'driver_block_device'
op|'.'
name|'legacy_block_devices'
op|'('
nl|'\n'
name|'block_device_mapping'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'legacy_bdm'
op|','
op|'['
name|'self'
op|'.'
name|'snapshot_legacy_driver_bdm'
op|','
nl|'\n'
name|'self'
op|'.'
name|'snapshot_legacy_driver_bdm'
op|']'
op|')'
newline|'\n'
nl|'\n'
comment|'# Test that the ephemerals work as expected'
nl|'\n'
name|'test_ephemerals'
op|'='
op|'['
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'ephemeral'"
op|']'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'ephemeral_bdm'
op|')'
name|'for'
name|'_'
name|'in'
name|'range'
op|'('
number|'2'
op|')'
op|']'
newline|'\n'
name|'expected'
op|'='
op|'['
name|'self'
op|'.'
name|'ephemeral_legacy_driver_bdm'
op|'.'
name|'copy'
op|'('
op|')'
nl|'\n'
name|'for'
name|'_'
name|'in'
name|'range'
op|'('
number|'2'
op|')'
op|']'
newline|'\n'
name|'expected'
op|'['
number|'0'
op|']'
op|'['
string|"'virtual_name'"
op|']'
op|'='
string|"'ephemeral0'"
newline|'\n'
name|'expected'
op|'['
number|'0'
op|']'
op|'['
string|"'num'"
op|']'
op|'='
number|'0'
newline|'\n'
name|'expected'
op|'['
number|'1'
op|']'
op|'['
string|"'virtual_name'"
op|']'
op|'='
string|"'ephemeral1'"
newline|'\n'
name|'expected'
op|'['
number|'1'
op|']'
op|'['
string|"'num'"
op|']'
op|'='
number|'1'
newline|'\n'
name|'legacy_ephemerals'
op|'='
name|'driver_block_device'
op|'.'
name|'legacy_block_devices'
op|'('
nl|'\n'
name|'test_ephemerals'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'expected'
op|','
name|'legacy_ephemerals'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_get_swap
dedent|''
name|'def'
name|'test_get_swap'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'swap'
op|'='
op|'['
name|'self'
op|'.'
name|'swap_driver_bdm'
op|']'
newline|'\n'
name|'legacy_swap'
op|'='
op|'['
name|'self'
op|'.'
name|'swap_legacy_driver_bdm'
op|']'
newline|'\n'
name|'no_swap'
op|'='
op|'['
name|'self'
op|'.'
name|'volume_driver_bdm'
op|']'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'swap'
op|'['
number|'0'
op|']'
op|','
name|'driver_block_device'
op|'.'
name|'get_swap'
op|'('
name|'swap'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'legacy_swap'
op|'['
number|'0'
op|']'
op|','
nl|'\n'
name|'driver_block_device'
op|'.'
name|'get_swap'
op|'('
name|'legacy_swap'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertIsNone'
op|'('
name|'driver_block_device'
op|'.'
name|'get_swap'
op|'('
name|'no_swap'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertIsNone'
op|'('
name|'driver_block_device'
op|'.'
name|'get_swap'
op|'('
op|'['
op|']'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_is_implemented
dedent|''
name|'def'
name|'test_is_implemented'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'bdm'
name|'in'
op|'('
name|'self'
op|'.'
name|'image_bdm'
op|','
name|'self'
op|'.'
name|'volume_bdm'
op|','
name|'self'
op|'.'
name|'swap_bdm'
op|','
nl|'\n'
name|'self'
op|'.'
name|'ephemeral_bdm'
op|','
name|'self'
op|'.'
name|'snapshot_bdm'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertTrue'
op|'('
name|'driver_block_device'
op|'.'
name|'is_implemented'
op|'('
name|'bdm'
op|')'
op|')'
newline|'\n'
dedent|''
name|'local_image'
op|'='
name|'self'
op|'.'
name|'image_bdm_dict'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'local_image'
op|'['
string|"'destination_type'"
op|']'
op|'='
string|"'local'"
newline|'\n'
name|'self'
op|'.'
name|'assertFalse'
op|'('
name|'driver_block_device'
op|'.'
name|'is_implemented'
op|'('
nl|'\n'
name|'fake_block_device'
op|'.'
name|'fake_bdm_object'
op|'('
name|'self'
op|'.'
name|'context'
op|','
name|'local_image'
op|')'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_is_block_device_mapping
dedent|''
name|'def'
name|'test_is_block_device_mapping'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'test_swap'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'swap'"
op|']'
op|'('
name|'self'
op|'.'
name|'swap_bdm'
op|')'
newline|'\n'
name|'test_ephemeral'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'ephemeral'"
op|']'
op|'('
name|'self'
op|'.'
name|'ephemeral_bdm'
op|')'
newline|'\n'
name|'test_image'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'image'"
op|']'
op|'('
name|'self'
op|'.'
name|'image_bdm'
op|')'
newline|'\n'
name|'test_snapshot'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'snapshot'"
op|']'
op|'('
name|'self'
op|'.'
name|'snapshot_bdm'
op|')'
newline|'\n'
name|'test_volume'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'volume'"
op|']'
op|'('
name|'self'
op|'.'
name|'volume_bdm'
op|')'
newline|'\n'
name|'test_blank'
op|'='
name|'self'
op|'.'
name|'driver_classes'
op|'['
string|"'blank'"
op|']'
op|'('
name|'self'
op|'.'
name|'blank_bdm'
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'bdm'
name|'in'
op|'('
name|'test_image'
op|','
name|'test_snapshot'
op|','
name|'test_volume'
op|','
name|'test_blank'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertTrue'
op|'('
name|'driver_block_device'
op|'.'
name|'is_block_device_mapping'
op|'('
nl|'\n'
name|'bdm'
op|'.'
name|'_bdm_obj'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'for'
name|'bdm'
name|'in'
op|'('
name|'test_swap'
op|','
name|'test_ephemeral'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertFalse'
op|'('
name|'driver_block_device'
op|'.'
name|'is_block_device_mapping'
op|'('
nl|'\n'
name|'bdm'
op|'.'
name|'_bdm_obj'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_get_volume_create_az_cinder_cross_az_attach_true
dedent|''
dedent|''
name|'def'
name|'test_get_volume_create_az_cinder_cross_az_attach_true'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Tests that we get None back if cinder.cross_az_attach=True even if'
nl|'\n'
comment|'# the instance has an AZ assigned. Note that since cross_az_attach'
nl|'\n'
comment|"# defaults to True we don't need to set a flag explicitly for the test."
nl|'\n'
indent|' '
name|'updates'
op|'='
op|'{'
string|"'availability_zone'"
op|':'
string|"'test-az'"
op|'}'
newline|'\n'
name|'instance'
op|'='
name|'fake_instance'
op|'.'
name|'fake_instance_obj'
op|'('
name|'self'
op|'.'
name|'context'
op|','
op|'**'
name|'updates'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertIsNone'
op|'('
nl|'\n'
name|'driver_block_device'
op|'.'
name|'_get_volume_create_az_value'
op|'('
name|'instance'
op|')'
op|')'
newline|'\n'
dedent|''
dedent|''
endmarker|''
end_unit
| 12.291831 | 88 | 0.601715 | 14,101 | 93,295 | 3.843203 | 0.023544 | 0.171055 | 0.087096 | 0.109276 | 0.943757 | 0.923865 | 0.903881 | 0.879689 | 0.849556 | 0.815345 | 0 | 0.001761 | 0.093145 | 93,295 | 7,589 | 89 | 12.293451 | 0.63878 | 0 | 0 | 0.972724 | 0 | 0 | 0.374682 | 0.03087 | 0 | 0 | 0 | 0 | 0.010278 | 0 | null | null | 0.000527 | 0.002108 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
bb64b40b5acf54ae0b5d4836e2e804f6d82bb300 | 31,311 | py | Python | lib/services/clouddb/ncloud_clouddb/api/v2_api.py | KidongSohn/ncloud-sdk-py | 1c62471a9bd320d77164ed3193a0ebb9f64229ff | [
"MIT"
] | null | null | null | lib/services/clouddb/ncloud_clouddb/api/v2_api.py | KidongSohn/ncloud-sdk-py | 1c62471a9bd320d77164ed3193a0ebb9f64229ff | [
"MIT"
] | null | null | null | lib/services/clouddb/ncloud_clouddb/api/v2_api.py | KidongSohn/ncloud-sdk-py | 1c62471a9bd320d77164ed3193a0ebb9f64229ff | [
"MIT"
] | null | null | null | # coding: utf-8
"""
clouddb
OpenAPI spec version: 2018-06-21T02:28:05Z
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from ncloud_clouddb.api_client import ApiClient
class V2Api(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_cloud_db_instance(self, create_cloud_db_instance_request, **kwargs): # noqa: E501
"""create_cloud_db_instance # noqa: E501
Create a CloudDB instance # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.create_cloud_db_instance(create_cloud_db_instance_request, async=True)
>>> result = thread.get()
:param async bool
:param CreateCloudDBInstanceRequest create_cloud_db_instance_request: createCloudDBInstanceRequest (required)
:return: CreateCloudDBInstanceResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.create_cloud_db_instance_with_http_info(create_cloud_db_instance_request, **kwargs) # noqa: E501
else:
(data) = self.create_cloud_db_instance_with_http_info(create_cloud_db_instance_request, **kwargs) # noqa: E501
return data
def create_cloud_db_instance_with_http_info(self, create_cloud_db_instance_request, **kwargs): # noqa: E501
"""create_cloud_db_instance # noqa: E501
Create a CloudDB instance # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.create_cloud_db_instance_with_http_info(create_cloud_db_instance_request, async=True)
>>> result = thread.get()
:param async bool
:param CreateCloudDBInstanceRequest create_cloud_db_instance_request: createCloudDBInstanceRequest (required)
:return: CreateCloudDBInstanceResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['create_cloud_db_instance_request'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_cloud_db_instance" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'create_cloud_db_instance_request' is set
if ('create_cloud_db_instance_request' not in params or
params['create_cloud_db_instance_request'] is None):
raise ValueError("Missing the required parameter `create_cloud_db_instance_request` when calling `create_cloud_db_instance`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
query_params.append(('responseFormatType', 'json')) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'create_cloud_db_instance_request' in params:
body_params = params['create_cloud_db_instance_request']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/x-www-form-urlencoded']) # noqa: E501
# Authentication setting
auth_settings = ['x-ncp-iam'] # noqa: E501
return self.api_client.call_api(
'/createCloudDBInstance', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CreateCloudDBInstanceResponse', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_cloud_db_server_instance(self, delete_cloud_db_server_instance_request, **kwargs): # noqa: E501
"""delete_cloud_db_server_instance # noqa: E501
Delete a CloudDB server instance # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.delete_cloud_db_server_instance(delete_cloud_db_server_instance_request, async=True)
>>> result = thread.get()
:param async bool
:param DeleteCloudDBServerInstanceRequest delete_cloud_db_server_instance_request: deleteCloudDBServerInstanceRequest (required)
:return: DeleteCloudDBServerInstanceResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.delete_cloud_db_server_instance_with_http_info(delete_cloud_db_server_instance_request, **kwargs) # noqa: E501
else:
(data) = self.delete_cloud_db_server_instance_with_http_info(delete_cloud_db_server_instance_request, **kwargs) # noqa: E501
return data
def delete_cloud_db_server_instance_with_http_info(self, delete_cloud_db_server_instance_request, **kwargs): # noqa: E501
"""delete_cloud_db_server_instance # noqa: E501
Delete a CloudDB server instance # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.delete_cloud_db_server_instance_with_http_info(delete_cloud_db_server_instance_request, async=True)
>>> result = thread.get()
:param async bool
:param DeleteCloudDBServerInstanceRequest delete_cloud_db_server_instance_request: deleteCloudDBServerInstanceRequest (required)
:return: DeleteCloudDBServerInstanceResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['delete_cloud_db_server_instance_request'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_cloud_db_server_instance" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'delete_cloud_db_server_instance_request' is set
if ('delete_cloud_db_server_instance_request' not in params or
params['delete_cloud_db_server_instance_request'] is None):
raise ValueError("Missing the required parameter `delete_cloud_db_server_instance_request` when calling `delete_cloud_db_server_instance`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
query_params.append(('responseFormatType', 'json')) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'delete_cloud_db_server_instance_request' in params:
body_params = params['delete_cloud_db_server_instance_request']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/x-www-form-urlencoded']) # noqa: E501
# Authentication setting
auth_settings = ['x-ncp-iam'] # noqa: E501
return self.api_client.call_api(
'/deleteCloudDBServerInstance', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeleteCloudDBServerInstanceResponse', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_cloud_db_config_group_list(self, get_cloud_db_config_group_list_request, **kwargs): # noqa: E501
"""get_cloud_db_config_group_list # noqa: E501
Get the CloudDB config group list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_cloud_db_config_group_list(get_cloud_db_config_group_list_request, async=True)
>>> result = thread.get()
:param async bool
:param GetCloudDBConfigGroupListRequest get_cloud_db_config_group_list_request: getCloudDBConfigGroupListRequest (required)
:return: GetCloudDBConfigGroupListResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.get_cloud_db_config_group_list_with_http_info(get_cloud_db_config_group_list_request, **kwargs) # noqa: E501
else:
(data) = self.get_cloud_db_config_group_list_with_http_info(get_cloud_db_config_group_list_request, **kwargs) # noqa: E501
return data
def get_cloud_db_config_group_list_with_http_info(self, get_cloud_db_config_group_list_request, **kwargs): # noqa: E501
"""get_cloud_db_config_group_list # noqa: E501
Get the CloudDB config group list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_cloud_db_config_group_list_with_http_info(get_cloud_db_config_group_list_request, async=True)
>>> result = thread.get()
:param async bool
:param GetCloudDBConfigGroupListRequest get_cloud_db_config_group_list_request: getCloudDBConfigGroupListRequest (required)
:return: GetCloudDBConfigGroupListResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['get_cloud_db_config_group_list_request'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_cloud_db_config_group_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'get_cloud_db_config_group_list_request' is set
if ('get_cloud_db_config_group_list_request' not in params or
params['get_cloud_db_config_group_list_request'] is None):
raise ValueError("Missing the required parameter `get_cloud_db_config_group_list_request` when calling `get_cloud_db_config_group_list`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
query_params.append(('responseFormatType', 'json')) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'get_cloud_db_config_group_list_request' in params:
body_params = params['get_cloud_db_config_group_list_request']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/x-www-form-urlencoded']) # noqa: E501
# Authentication setting
auth_settings = ['x-ncp-iam'] # noqa: E501
return self.api_client.call_api(
'/getCloudDBConfigGroupList', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GetCloudDBConfigGroupListResponse', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_cloud_db_image_product_list(self, get_cloud_db_image_product_list_request, **kwargs): # noqa: E501
"""get_cloud_db_image_product_list # noqa: E501
Get the CloudDB image product list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_cloud_db_image_product_list(get_cloud_db_image_product_list_request, async=True)
>>> result = thread.get()
:param async bool
:param GetCloudDBImageProductListRequest get_cloud_db_image_product_list_request: getCloudDBImageProductListRequest (required)
:return: GetCloudDBImageProductListResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.get_cloud_db_image_product_list_with_http_info(get_cloud_db_image_product_list_request, **kwargs) # noqa: E501
else:
(data) = self.get_cloud_db_image_product_list_with_http_info(get_cloud_db_image_product_list_request, **kwargs) # noqa: E501
return data
def get_cloud_db_image_product_list_with_http_info(self, get_cloud_db_image_product_list_request, **kwargs): # noqa: E501
"""get_cloud_db_image_product_list # noqa: E501
Get the CloudDB image product list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_cloud_db_image_product_list_with_http_info(get_cloud_db_image_product_list_request, async=True)
>>> result = thread.get()
:param async bool
:param GetCloudDBImageProductListRequest get_cloud_db_image_product_list_request: getCloudDBImageProductListRequest (required)
:return: GetCloudDBImageProductListResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['get_cloud_db_image_product_list_request'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_cloud_db_image_product_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'get_cloud_db_image_product_list_request' is set
if ('get_cloud_db_image_product_list_request' not in params or
params['get_cloud_db_image_product_list_request'] is None):
raise ValueError("Missing the required parameter `get_cloud_db_image_product_list_request` when calling `get_cloud_db_image_product_list`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
query_params.append(('responseFormatType', 'json')) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'get_cloud_db_image_product_list_request' in params:
body_params = params['get_cloud_db_image_product_list_request']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/x-www-form-urlencoded']) # noqa: E501
# Authentication setting
auth_settings = ['x-ncp-iam'] # noqa: E501
return self.api_client.call_api(
'/getCloudDBImageProductList', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GetCloudDBImageProductListResponse', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_cloud_db_instance_list(self, get_cloud_db_instance_list_request, **kwargs): # noqa: E501
"""get_cloud_db_instance_list # noqa: E501
Get the CloudDB instance list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_cloud_db_instance_list(get_cloud_db_instance_list_request, async=True)
>>> result = thread.get()
:param async bool
:param GetCloudDBInstanceListRequest get_cloud_db_instance_list_request: getCloudDBInstanceListRequest (required)
:return: GetCloudDBInstanceListResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.get_cloud_db_instance_list_with_http_info(get_cloud_db_instance_list_request, **kwargs) # noqa: E501
else:
(data) = self.get_cloud_db_instance_list_with_http_info(get_cloud_db_instance_list_request, **kwargs) # noqa: E501
return data
def get_cloud_db_instance_list_with_http_info(self, get_cloud_db_instance_list_request, **kwargs): # noqa: E501
"""get_cloud_db_instance_list # noqa: E501
Get the CloudDB instance list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_cloud_db_instance_list_with_http_info(get_cloud_db_instance_list_request, async=True)
>>> result = thread.get()
:param async bool
:param GetCloudDBInstanceListRequest get_cloud_db_instance_list_request: getCloudDBInstanceListRequest (required)
:return: GetCloudDBInstanceListResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['get_cloud_db_instance_list_request'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_cloud_db_instance_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'get_cloud_db_instance_list_request' is set
if ('get_cloud_db_instance_list_request' not in params or
params['get_cloud_db_instance_list_request'] is None):
raise ValueError("Missing the required parameter `get_cloud_db_instance_list_request` when calling `get_cloud_db_instance_list`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
query_params.append(('responseFormatType', 'json')) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'get_cloud_db_instance_list_request' in params:
body_params = params['get_cloud_db_instance_list_request']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/x-www-form-urlencoded']) # noqa: E501
# Authentication setting
auth_settings = ['x-ncp-iam'] # noqa: E501
return self.api_client.call_api(
'/getCloudDBInstanceList', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GetCloudDBInstanceListResponse', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_cloud_db_product_list(self, get_cloud_db_product_list_request, **kwargs): # noqa: E501
"""get_cloud_db_product_list # noqa: E501
Get the CloudDB product list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_cloud_db_product_list(get_cloud_db_product_list_request, async=True)
>>> result = thread.get()
:param async bool
:param GetCloudDBProductListRequest get_cloud_db_product_list_request: getCloudDBProductListRequest (required)
:return: GetCloudDBProductListResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.get_cloud_db_product_list_with_http_info(get_cloud_db_product_list_request, **kwargs) # noqa: E501
else:
(data) = self.get_cloud_db_product_list_with_http_info(get_cloud_db_product_list_request, **kwargs) # noqa: E501
return data
def get_cloud_db_product_list_with_http_info(self, get_cloud_db_product_list_request, **kwargs): # noqa: E501
"""get_cloud_db_product_list # noqa: E501
Get the CloudDB product list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_cloud_db_product_list_with_http_info(get_cloud_db_product_list_request, async=True)
>>> result = thread.get()
:param async bool
:param GetCloudDBProductListRequest get_cloud_db_product_list_request: getCloudDBProductListRequest (required)
:return: GetCloudDBProductListResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['get_cloud_db_product_list_request'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_cloud_db_product_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'get_cloud_db_product_list_request' is set
if ('get_cloud_db_product_list_request' not in params or
params['get_cloud_db_product_list_request'] is None):
raise ValueError("Missing the required parameter `get_cloud_db_product_list_request` when calling `get_cloud_db_product_list`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
query_params.append(('responseFormatType', 'json')) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'get_cloud_db_product_list_request' in params:
body_params = params['get_cloud_db_product_list_request']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/x-www-form-urlencoded']) # noqa: E501
# Authentication setting
auth_settings = ['x-ncp-iam'] # noqa: E501
return self.api_client.call_api(
'/getCloudDBProductList', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GetCloudDBProductListResponse', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def reboot_cloud_db_server_instance(self, reboot_cloud_db_server_instance_request, **kwargs): # noqa: E501
"""reboot_cloud_db_server_instance # noqa: E501
Reboot a CloudDB server instance # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.reboot_cloud_db_server_instance(reboot_cloud_db_server_instance_request, async=True)
>>> result = thread.get()
:param async bool
:param RebootCloudDBServerInstanceRequest reboot_cloud_db_server_instance_request: rebootCloudDBServerInstanceRequest (required)
:return: RebootCloudDBServerInstanceResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.reboot_cloud_db_server_instance_with_http_info(reboot_cloud_db_server_instance_request, **kwargs) # noqa: E501
else:
(data) = self.reboot_cloud_db_server_instance_with_http_info(reboot_cloud_db_server_instance_request, **kwargs) # noqa: E501
return data
def reboot_cloud_db_server_instance_with_http_info(self, reboot_cloud_db_server_instance_request, **kwargs): # noqa: E501
"""reboot_cloud_db_server_instance # noqa: E501
Reboot a CloudDB server instance # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.reboot_cloud_db_server_instance_with_http_info(reboot_cloud_db_server_instance_request, async=True)
>>> result = thread.get()
:param async bool
:param RebootCloudDBServerInstanceRequest reboot_cloud_db_server_instance_request: rebootCloudDBServerInstanceRequest (required)
:return: RebootCloudDBServerInstanceResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['reboot_cloud_db_server_instance_request'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method reboot_cloud_db_server_instance" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'reboot_cloud_db_server_instance_request' is set
if ('reboot_cloud_db_server_instance_request' not in params or
params['reboot_cloud_db_server_instance_request'] is None):
raise ValueError("Missing the required parameter `reboot_cloud_db_server_instance_request` when calling `reboot_cloud_db_server_instance`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
query_params.append(('responseFormatType', 'json')) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'reboot_cloud_db_server_instance_request' in params:
body_params = params['reboot_cloud_db_server_instance_request']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/x-www-form-urlencoded']) # noqa: E501
# Authentication setting
auth_settings = ['x-ncp-iam'] # noqa: E501
return self.api_client.call_api(
'/rebootCloudDBServerInstance', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RebootCloudDBServerInstanceResponse', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
| 44.412766 | 165 | 0.665134 | 3,578 | 31,311 | 5.433482 | 0.051984 | 0.063011 | 0.051438 | 0.05401 | 0.958798 | 0.956381 | 0.947379 | 0.927473 | 0.908184 | 0.891621 | 0 | 0.014455 | 0.257609 | 31,311 | 704 | 166 | 44.475852 | 0.821897 | 0.057456 | 0 | 0.718421 | 1 | 0 | 0.239457 | 0.15588 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.010526 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
bba50bb7583bf31d55861efa1e21933f032d4dbb | 7,311 | py | Python | mayan/apps/duplicates/tests/test_views.py | UbuhingaVizion/mayan-edms | 69b3292e80fe067ebf5d46c4c9ceb1d44962e07a | [
"Apache-2.0"
] | null | null | null | mayan/apps/duplicates/tests/test_views.py | UbuhingaVizion/mayan-edms | 69b3292e80fe067ebf5d46c4c9ceb1d44962e07a | [
"Apache-2.0"
] | null | null | null | mayan/apps/duplicates/tests/test_views.py | UbuhingaVizion/mayan-edms | 69b3292e80fe067ebf5d46c4c9ceb1d44962e07a | [
"Apache-2.0"
] | null | null | null | from mayan.apps.documents.permissions import permission_document_view
from mayan.apps.documents.tests.base import GenericDocumentViewTestCase
from .mixins import (
DuplicatedDocumentTestMixin, DuplicatedDocumentViewTestMixin
)
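# Permission and trashed-document handling tests for the per-document duplicates list view.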
class DocumentsDuplicateListViewsTestCase(
DuplicatedDocumentTestMixin, DuplicatedDocumentViewTestMixin,
GenericDocumentViewTestCase
):
def test_document_duplicates_list_no_permission(self):
self._upload_duplicate_document()
response = self._request_test_document_duplicates_list_view()
self.assertEqual(response.status_code, 404)
def test_document_duplicates_list_with_source_access(self):
self._upload_duplicate_document()
self.grant_access(
obj=self.test_documents[0],
permission=permission_document_view
)
response = self._request_test_document_duplicates_list_view()
self.assertContains(
response=response, status_code=200,
text=self.test_documents[0].label
)
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[1].label
)
def test_document_duplicates_list_with_target_access(self):
self._upload_duplicate_document()
self.grant_access(
obj=self.test_documents[1],
permission=permission_document_view
)
response = self._request_test_document_duplicates_list_view()
self.assertEqual(response.status_code, 404)
def test_document_duplicates_list_with_full_access(self):
self._upload_duplicate_document()
self.grant_access(
obj=self.test_documents[0],
permission=permission_document_view
)
self.grant_access(
obj=self.test_documents[1],
permission=permission_document_view
)
response = self._request_test_document_duplicates_list_view()
self.assertContains(
response=response, status_code=200,
text=self.test_documents[0].label
)
self.assertContains(
response=response, status_code=200,
text=self.test_documents[1].label
)
def test_document_duplicates_list_trashed_source_with_full_access(self):
self._upload_duplicate_document()
self.grant_access(
obj=self.test_documents[0],
permission=permission_document_view
)
self.grant_access(
obj=self.test_documents[1],
permission=permission_document_view
)
self.test_documents[0].delete()
response = self._request_test_document_duplicates_list_view()
self.assertEqual(response.status_code, 404)
def test_document_duplicates_list_trashed_target_with_full_access(self):
self._upload_duplicate_document()
self.grant_access(
obj=self.test_documents[0],
permission=permission_document_view
)
self.grant_access(
obj=self.test_documents[1],
permission=permission_document_view
)
self.test_documents[1].delete()
response = self._request_test_document_duplicates_list_view()
self.assertContains(
response=response, status_code=200,
text=self.test_documents[0].label
)
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[1].label
)
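# Permission and trashed-document handling tests for the global duplicated documents list view.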
class DuplicatedDocumentListViewsTestCase(
DuplicatedDocumentTestMixin, DuplicatedDocumentViewTestMixin,
GenericDocumentViewTestCase
):
def test_duplicated_document_list_no_permission(self):
self._upload_duplicate_document()
response = self._request_test_duplicated_document_list_view()
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[0].label
)
def test_duplicated_document_list_with_source_access(self):
self._upload_duplicate_document()
self.grant_access(
obj=self.test_documents[0],
permission=permission_document_view
)
response = self._request_test_duplicated_document_list_view()
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[0].label
)
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[1].label
)
def test_duplicated_document_list_with_target_access(self):
self._upload_duplicate_document()
self.grant_access(
obj=self.test_documents[1],
permission=permission_document_view
)
response = self._request_test_duplicated_document_list_view()
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[0].label
)
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[1].label
)
def test_duplicated_document_list_with_full_access(self):
self._upload_duplicate_document()
self.grant_access(
obj=self.test_documents[0],
permission=permission_document_view
)
self.grant_access(
obj=self.test_documents[1],
permission=permission_document_view
)
response = self._request_test_duplicated_document_list_view()
self.assertContains(
response=response, status_code=200,
text=self.test_documents[0].label
)
self.assertContains(
response=response, status_code=200,
text=self.test_documents[1].label
)
def test_duplicated_document_list_trashed_source_with_full_access(self):
self._upload_duplicate_document()
self.grant_access(
obj=self.test_documents[0],
permission=permission_document_view
)
self.grant_access(
obj=self.test_documents[1],
permission=permission_document_view
)
self.test_documents[0].delete()
response = self._request_test_duplicated_document_list_view()
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[0].label
)
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[1].label
)
def test_duplicated_document_list_trashed_target_with_full_access(self):
self._upload_duplicate_document()
self.grant_access(
obj=self.test_documents[0],
permission=permission_document_view
)
self.grant_access(
obj=self.test_documents[1],
permission=permission_document_view
)
self.test_documents[1].delete()
response = self._request_test_duplicated_document_list_view()
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[0].label
)
self.assertNotContains(
response=response, status_code=200,
text=self.test_documents[1].label
)
| 33.081448 | 76 | 0.664341 | 741 | 7,311 | 6.159244 | 0.066127 | 0.064855 | 0.137818 | 0.074934 | 0.938869 | 0.890666 | 0.890666 | 0.885188 | 0.885188 | 0.885188 | 0 | 0.018013 | 0.263439 | 7,311 | 220 | 77 | 33.231818 | 0.829526 | 0 | 0 | 0.723404 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106383 | 1 | 0.06383 | false | 0 | 0.015957 | 0 | 0.090426 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a559d863d9d5ae0c125424e9d9e44c571a4cdb87 | 1,788 | py | Python | Python/challenges/challenge_008.py | David29595/ProjectEuler | b2d1b60bdfd71193c2fb449801a8b4954bfc95ac | [
"MIT"
] | null | null | null | Python/challenges/challenge_008.py | David29595/ProjectEuler | b2d1b60bdfd71193c2fb449801a8b4954bfc95ac | [
"MIT"
] | null | null | null | Python/challenges/challenge_008.py | David29595/ProjectEuler | b2d1b60bdfd71193c2fb449801a8b4954bfc95ac | [
"MIT"
] | null | null | null | # Largest product in a series
# The four adjacent digits in the 1000-digit number (below) that have the greatest product are 9 × 9 × 8 × 9 = 5832.
# Find the thirteen adjacent digits in the 1000-digit number that have the greatest product. What is the value of this product?
# adj: number of adjacent digits to create each series from
def main(adj):
NUM = "7316717653133062491922511967442657474235534919493496983520312774506326239578318016984801869478851843858615607891129494954595017379583319528532088055111254069874715852386305071569329096329522744304355766896648950445244523161731856403098711121722383113622298934233803081353362766142828064444866452387493035890729629049156044077239071381051585930796086670172427121883998797908792274921901699720888093776657273330010533678812202354218097512545405947522435258490771167055601360483958644670632441572215539753697817977846174064955149290862569321978468622482839722413756570560574902614079729686524145351004748216637048440319989000889524345065854122758866688116427171479924442928230863465674813919123162824586178664583591245665294765456828489128831426076900422421902267105562632111110937054421750694165896040807198403850962455444362981230987879927244284909188845801561660979191338754992005240636899125607176060588611646710940507754100225698315520005593572972571636269561882670428252483600823257530420752963450"
max_product = 0
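# slide a window of adj consecutive digits across NUM and keep the largest product found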
for i in range(1000 - adj + 1):
if(product(NUM[i : i + adj]) > max_product):
max_product = product(NUM[i : i + adj])
return max_product
# get the product of the numbers in a given series of numbers
def product(series):
ans = 1
for num in series:
num = int(num)
ans *= num
return ans
if __name__ == '__main__':
ans = main(13)
print(str(ans))
| 63.857143 | 1,012 | 0.844519 | 130 | 1,788 | 11.546154 | 0.407692 | 0.026649 | 0.021319 | 0.025316 | 0.099933 | 0.079947 | 0 | 0 | 0 | 0 | 0 | 0.649556 | 0.11745 | 1,788 | 27 | 1,013 | 66.222222 | 0.299747 | 0.212528 | 0 | 0 | 0 | 0 | 0.718973 | 0.713267 | 0 | 1 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0 | 0 | 0.25 | 0.0625 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a56a12a345ec67640a72083e483f714a199a3a3a | 26,625 | py | Python | post_optimization_studies/mad_analyses/ma100MeV_L1pt8-2pt4TeV_deta3pt6/Output/Histos/MadAnalysis5job_0/selection_14.py | sheride/axion_pheno | 7d3fc08f5ae5b17a3500eba19a2e43f87f076ce5 | [
"MIT"
] | null | null | null | post_optimization_studies/mad_analyses/ma100MeV_L1pt8-2pt4TeV_deta3pt6/Output/Histos/MadAnalysis5job_0/selection_14.py | sheride/axion_pheno | 7d3fc08f5ae5b17a3500eba19a2e43f87f076ce5 | [
"MIT"
] | null | null | null | post_optimization_studies/mad_analyses/ma100MeV_L1pt8-2pt4TeV_deta3pt6/Output/Histos/MadAnalysis5job_0/selection_14.py | sheride/axion_pheno | 7d3fc08f5ae5b17a3500eba19a2e43f87f076ce5 | [
"MIT"
] | null | null | null | def selection_14():
# Library import
import numpy
import matplotlib
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
# Library version
matplotlib_version = matplotlib.__version__
numpy_version = numpy.__version__
# Histo binning
xBinning = numpy.linspace(0.0,8000.0,81,endpoint=True)
# Creating data sequence: middle of each bin
xData = numpy.array([50.0,150.0,250.0,350.0,450.0,550.0,650.0,750.0,850.0,950.0,1050.0,1150.0,1250.0,1350.0,1450.0,1550.0,1650.0,1750.0,1850.0,1950.0,2050.0,2150.0,2250.0,2350.0,2450.0,2550.0,2650.0,2750.0,2850.0,2950.0,3050.0,3150.0,3250.0,3350.0,3450.0,3550.0,3650.0,3750.0,3850.0,3950.0,4050.0,4150.0,4250.0,4350.0,4450.0,4550.0,4650.0,4750.0,4850.0,4950.0,5050.0,5150.0,5250.0,5350.0,5450.0,5550.0,5650.0,5750.0,5850.0,5950.0,6050.0,6150.0,6250.0,6350.0,6450.0,6550.0,6650.0,6750.0,6850.0,6950.0,7050.0,7150.0,7250.0,7350.0,7450.0,7550.0,7650.0,7750.0,7850.0,7950.0])
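    # Each y15_TET_<i>_weights array below holds the per-bin event weights for one contributing sample;
    # the step histograms further down are drawn from cumulative sums of these arrays to emulate a stacked plot.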
# Creating weights for histo: y15_TET_0
y15_TET_0_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.541322753874,1.82033104489,2.78622096259,3.2461685234,3.42660970802,3.47791130365,3.26916612144,3.18779052837,2.77383776364,2.60931817766,2.44479819168,2.147601417,1.94593223419,1.70180625499,1.5514386678,1.36038348408,1.23832029448,1.06495550926,0.932278320561,0.831443529153,0.730609137745,0.633312346036,0.59793194905,0.486483158547,0.40687676533,0.403338765632,0.332577491661,0.306042053922,0.242357019349,0.219359621308,0.196362223268,0.145060387639,0.14329134779,0.132677188695,0.0972965917094,0.0672231142719,0.0760682735183,0.0672231142719,0.0689921541212,0.0495328357793,0.0424567163823,0.0459947560808,0.0353805849852,0.0265354377389,0.0229973780404,0.0247664098897,0.0247664098897,0.00176902904926,0.00530708754779,0.0123832029448,0.00353805849852,0.0106141750956,0.00176902904926,0.00530708754779,0.00530708754779,0.00353805849852,0.0,0.0,0.00176902904926,0.00176902904926,0.00176902904926,0.00176902904926,0.00176902904926,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_1
y15_TET_1_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.353672468828,1.2086320588,1.82300748865,2.17453114628,2.21399656217,2.18842689595,2.08907790132,2.01961873737,1.80692148074,1.70763723461,1.5216543339,1.36567637235,1.2983187372,1.09537812179,1.00122499524,0.906134614022,0.790763560615,0.688158755097,0.62082709929,0.571677384647,0.479818034014,0.437032058731,0.387875829269,0.325924175121,0.316275488052,0.255423359527,0.211562039391,0.20303206269,0.192365744872,0.152820712292,0.122872248228,0.0929587963201,0.0908125031182,0.0673181065813,0.0715961445546,0.0555761641333,0.03526097939,0.0512779821792,0.0267102791685,0.0235078658256,0.0288442022076,0.0160319109338,0.0245782186483,0.00961097306007,0.0160218269529,0.0149630888938,0.00855036848529,0.0106847231811,0.00748340896416,0.00748077905554,0.00320504445056,0.00320690816848,0.00320690816848,0.00427014025378,0.00427386928834,0.0,0.0010695906288,0.00427090764353,0.00213731753968,0.0010695906288,0.0,0.0,0.0,0.00106772691088,0.0,0.0,0.0,0.0010695906288,0.0,0.00213468803074,0.00106772691088,0.0,0.00106696111986,0.0])
# Creating weights for histo: y15_TET_2
y15_TET_2_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.252089855593,0.87780059358,1.23128187153,1.50837233264,1.49517752973,1.53545633861,1.42017551319,1.32433989205,1.2347542723,1.15211305407,1.03683222865,0.931968605525,0.830576983166,0.761130967851,0.66182254595,0.604876933392,0.525708115933,0.484040106744,0.42570529388,0.381954324232,0.33750871443,0.2757015808,0.265284618503,0.207644245791,0.201394084413,0.165282036449,0.155559594305,0.131947869098,0.113891825116,0.093752420675,0.0784742573058,0.0569459325582,0.0451400499547,0.0500012910267,0.051390211333,0.0381954324232,0.0368065081169,0.0263895698196,0.0256951096665,0.0250006455133,0.0159726355224,0.0145837112161,0.00833354983778,0.0152781713693,0.00972247414408,0.00486123707204,0.00902800999093,0.00416677291889,0.00277784941259,0.00347231196574,0.00486123707204,0.00347231196574,0.00208338725945,0.000694462553149,0.0013889247063,0.00277784941259,0.000694462553149,0.000694462553149,0.0,0.0,0.0,0.0,0.0,0.0,0.000694462553149,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_3
y15_TET_3_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.186817284884,0.603125744905,0.919387820889,1.00568384162,1.07822985905,1.09292866258,0.991459438204,0.953527029091,0.853480205054,0.781408587738,0.731622175777,0.712181771106,0.560926134766,0.497863319615,0.499285719956,0.413937699451,0.356565005667,0.321003317123,0.285441668579,0.244664258782,0.216214931947,0.195826247048,0.185869004656,0.156471357593,0.124702909961,0.126125390302,0.100046824037,0.0938828225559,0.0744424178852,0.0697008567461,0.0635368552651,0.0507346521893,0.0365099807717,0.0365099807717,0.0365099807717,0.0265527103795,0.0218111572403,0.0222853133542,0.0142246674176,0.0184920684428,0.00853480205054,0.00995726639229,0.0109055786201,0.00900895416446,0.00616402148094,0.00521571325311,0.00711233370878,0.00331908919743,0.00426740102527,0.00379324451135,0.00426740102527,0.00237077776959,0.00142246674176,0.000474155713919,0.0,0.00189662245568,0.000474155713919,0.000948311027838,0.000948311027838,0.0,0.0,0.000474155713919,0.0,0.000948311027838,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_4
y15_TET_4_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_5
y15_TET_5_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,2.10674221742,0.0,1.0529581672,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_6
y15_TET_6_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.691829993515,5.2978182267,5.06681435217,5.06814008347,1.38077450405,0.690627997143,0.461033240486,0.23012854603,0.0,0.0,0.230673171817,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_7
y15_TET_7_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.332435765982,1.60580251853,2.13237520421,2.51968650854,2.63087350198,1.38477604996,0.719744800729,0.332138681087,0.249261343098,0.138402554805,0.110813435051,0.0277263950555,0.0276696169506,0.0554236624816,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_8
y15_TET_8_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0604818175633,0.151143961499,0.403213391549,0.453828545279,0.746194426083,0.645139387342,0.534350744552,0.231751792407,0.171464208753,0.0705768526515,0.0706294696799,0.0201750558134,0.0100953567379,0.0100429581886,0.0,0.0,0.0100805669227,0.0,0.0,0.0,0.0,0.0100975172526,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_9
y15_TET_9_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.011317582905,0.0509394803948,0.0622634731433,0.0989753324578,0.101844295309,0.152781313339,0.198036424467,0.251867683062,0.141452626708,0.118830765363,0.107487496914,0.0622151877075,0.0452703470099,0.0424225451063,0.0113235810719,0.00848724455472,0.0,0.00283014097637,0.00282347950995,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_10
y15_TET_10_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00305829576001,0.00304355754346,0.0,0.00612419346139,0.00457451903961,0.00304894934329,0.0,0.00611716001098,0.00304400548124,0.0075950710189,0.0106366087057,0.0182773386486,0.00606803813798,0.0106546916819,0.00455394344924,0.00306899308143,0.00304183197841,0.00610962543756,0.00151115650058,0.00153821477883,0.0,0.0,0.00151265396011,0.0,0.0,0.00152449653668,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_11
y15_TET_11_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000180553030001,0.0,0.000721759378433,0.000180626138525,0.000180734319121,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000180547139741,0.00252864058845,0.00126248642138,0.000180657129763,0.000361173239767,0.00108292126116,0.00054161443182,0.000360506985889,0.000361704556638,0.000361583017085,0.00036167183297,0.000361395760708,0.0,0.0,0.00018020300225,0.0,0.0,0.000180766965792,0.0,0.0,0.0,0.0,0.000180755031278,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_12
y15_TET_12_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0121704927804,0.0242994195771,0.0,0.0,0.0121313836425,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_13
y15_TET_13_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.220879096585,0.441761829362,0.230809248523,0.120544402743,0.0502051137366,0.0501796190701,0.0100546543364,0.0301461580631,0.0100696701578,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_14
y15_TET_14_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.511511761888,1.38040517552,1.56756688076,1.17169068658,0.676676643777,0.330056636725,0.192540760371,0.0659724728581,0.0770204073513,0.0274921778746,0.00550370717489,0.0109833557686,0.00549442824894,0.0,0.010972805256,0.0,0.00549598015337,0.0,0.0,0.0,0.0,0.00551421706168,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_15
y15_TET_15_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0917632079944,0.633559011637,0.989790504888,1.05102433758,0.744094801675,0.434199933413,0.213154808046,0.115462739059,0.0631728425952,0.0365086204957,0.0157838573059,0.0197451252806,0.0078951112346,0.00493792772495,0.00493571915759,0.000984219383707,0.00197349803006,0.000983908741293,0.00098817275936,0.0,0.00197004649222,0.000985428684564,0.000988894652248,0.0,0.0,0.0,0.000986794308699,0.0,0.0,0.000986290466745,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_16
y15_TET_16_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0151205420752,0.111653684494,0.261657168041,0.343094295266,0.331480637049,0.238964801687,0.16257873154,0.0804142402529,0.0380674438393,0.022441684025,0.0138683267656,0.00731159527149,0.00352862875846,0.0030259033186,0.0022693930793,0.000756678686561,0.000755416732519,0.00100723257957,0.000756493034476,0.0,0.000504272673936,0.00025201786373,0.0,0.0,0.000251958166982,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_17
y15_TET_17_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00372534281898,0.0183210275661,0.0360822271564,0.0684229371749,0.0853200768397,0.0930575474769,0.0821456755683,0.0710181160836,0.0538331497211,0.0357861528456,0.0266283032915,0.0197457052388,0.0137463515593,0.00743379452655,0.00229388610822,0.00200115782904,0.0014346567846,0.000573726553247,0.000572941480461,0.0,0.000572185899139,0.0,0.000286183773284,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_18
y15_TET_18_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000151233502217,0.00110138422607,0.00220321730341,0.00414651744629,0.00401822686611,0.00382183410569,0.00369232395761,0.00379994810059,0.00488093704425,0.00507276962655,0.00556959057563,0.00427551587877,0.00410439541507,0.00282801073919,0.00200744564252,0.00125294843654,0.000840630970017,0.00049662690786,0.000216079142796,0.000151277549154,0.000108084058385,2.15933968115e-05,2.16292043309e-05,2.16220838984e-05,2.16751874889e-05,0.0,4.31049358904e-05,0.0,0.0,0.0,0.0,0.0,0.0,2.15827559768e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_19
y15_TET_19_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,5.6858529886e-05,8.51343800459e-05,0.000226838278621,8.52280444334e-05,0.000283801015335,0.000312017486312,0.000227213054959,0.000113571381062,0.000141943835613,0.000198131509915,0.000283985582288,0.000396740102387,0.000510742936541,0.000312537184087,0.000566361289355,0.000284116249157,0.000340548894141,0.00028235298885,0.000170249130364,0.000310712005502,0.000140637552984,5.67977549434e-05,8.48283374507e-05,2.84489095189e-05,5.67791349145e-05,0.0,5.67987794904e-05,0.000113523850988,5.67183451234e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,2.79727327207e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating a new Canvas
fig = plt.figure(figsize=(12,6),dpi=80)
frame = gridspec.GridSpec(1,1,right=0.7)
pad = fig.add_subplot(frame[0])
# Creating a new Stack
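# The stack is emulated with cumulative weights: each pad.hist() call below is
# given the sum of its own sample's weights plus those of every sample beneath
# it in the stack, so the full total is drawn first and progressively smaller
# partial sums are drawn on top of it.
# Note: the 'normed' keyword used in these calls was removed in Matplotlib 3.1,
# where 'density' is its replacement.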
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights+y15_TET_16_weights+y15_TET_17_weights+y15_TET_18_weights+y15_TET_19_weights,\
label="$bg\_vbf\_1600\_inf$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#ccc6aa", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights+y15_TET_16_weights+y15_TET_17_weights+y15_TET_18_weights,\
label="$bg\_vbf\_1200\_1600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#c1bfa8", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights+y15_TET_16_weights+y15_TET_17_weights,\
label="$bg\_vbf\_800\_1200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#bab5a3", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights+y15_TET_16_weights,\
label="$bg\_vbf\_600\_800$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#b2a596", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights,\
label="$bg\_vbf\_400\_600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#b7a39b", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights,\
label="$bg\_vbf\_200\_400$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#ad998c", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights,\
label="$bg\_vbf\_100\_200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#9b8e82", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights,\
label="$bg\_vbf\_0\_100$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#876656", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights,\
label="$bg\_dip\_1600\_inf$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#afcec6", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights,\
label="$bg\_dip\_1200\_1600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#84c1a3", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights,\
label="$bg\_dip\_800\_1200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#89a8a0", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights,\
label="$bg\_dip\_600\_800$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#829e8c", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights,\
label="$bg\_dip\_400\_600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#adbcc6", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights,\
label="$bg\_dip\_200\_400$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#7a8e99", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights,\
label="$bg\_dip\_100\_200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#758991", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights,\
label="$bg\_dip\_0\_100$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#688296", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights,\
label="$signal\_2pt4TeVL$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#6d7a84", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights,\
label="$signal\_2pt2TeVL$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#7c99d1", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights,\
label="$signal\_2TeVL$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#7f7f9b", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights,\
label="$signal\_1pt8TeVL$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#aaa5bf", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
# Axis
plt.rc('text',usetex=False)
plt.xlabel(r"TET",\
fontsize=16,color="black")
plt.ylabel(r"$\mathrm{Events}$ $(\mathcal{L}_{\mathrm{int}} = 40.0\ \mathrm{fb}^{-1})$ ",\
fontsize=16,color="black")
# Boundary of y-axis
ymax=(y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights+y15_TET_16_weights+y15_TET_17_weights+y15_TET_18_weights+y15_TET_19_weights).max()*1.1
ymin=0 # linear scale
#ymin=min([x for x in (y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights+y15_TET_16_weights+y15_TET_17_weights+y15_TET_18_weights+y15_TET_19_weights) if x])/100. # log scale
plt.gca().set_ylim(ymin,ymax)
# Log/Linear scale for X-axis
plt.gca().set_xscale("linear")
#plt.gca().set_xscale("log",nonposx="clip")
# Log/Linear scale for Y-axis
plt.gca().set_yscale("linear")
#plt.gca().set_yscale("log",nonposy="clip")
# Legend
plt.legend(bbox_to_anchor=(1.05,1), loc=2, borderaxespad=0.)
# Saving the image
plt.savefig('../../HTML/MadAnalysis5job_0/selection_14.png')
plt.savefig('../../PDF/MadAnalysis5job_0/selection_14.png')
plt.savefig('../../DVI/MadAnalysis5job_0/selection_14.eps')
# Running!
if __name__ == '__main__':
    selection_14()
| 122.133028 | 1,089 | 0.73893 | 5,552 | 26,625 | 3.37482 | 0.130223 | 0.233549 | 0.337674 | 0.434435 | 0.555372 | 0.541015 | 0.541015 | 0.532049 | 0.522282 | 0.516838 | 0 | 0.432924 | 0.070498 | 26,625 | 217 | 1,090 | 122.695853 | 0.324188 | 0.057277 | 0 | 0.171875 | 0 | 0.007813 | 0.046327 | 0.0081 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007813 | false | 0 | 0.03125 | 0 | 0.039063 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3c633a2305c54e6e6ec0dc8e2cb8b213ef41e891 | 219 | py | Python | utils/xmlparser/__init__.py | ansvver/pylufia | 0076b4baef1de5371476910c12c1829d694fa2f3 | [
"MIT"
] | null | null | null | utils/xmlparser/__init__.py | ansvver/pylufia | 0076b4baef1de5371476910c12c1829d694fa2f3 | [
"MIT"
] | null | null | null | utils/xmlparser/__init__.py | ansvver/pylufia | 0076b4baef1de5371476910c12c1829d694fa2f3 | [
"MIT"
] | 1 | 2021-04-08T03:15:08.000Z | 2021-04-08T03:15:08.000Z | # -*- coding: utf-8 -*-
"""
====================================================================
xmlparser __init__.py
====================================================================
"""
from .xmlparser import *
| 21.9 | 68 | 0.219178 | 9 | 219 | 4.888889 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004951 | 0.077626 | 219 | 9 | 69 | 24.333333 | 0.212871 | 0.83105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
3c6aa1e1f2e6f5621904878dd50ce141c92e55e4 | 42 | py | Python | src/kapidox/depdiagram/__init__.py | KDE/kapidox | 0c590019ffbf4768232a941bea13bb9c981086a0 | [
"BSD-2-Clause"
] | 7 | 2015-12-14T09:18:09.000Z | 2020-07-30T17:39:46.000Z | src/kapidox/depdiagram/__init__.py | KDE/kapidox | 0c590019ffbf4768232a941bea13bb9c981086a0 | [
"BSD-2-Clause"
] | null | null | null | src/kapidox/depdiagram/__init__.py | KDE/kapidox | 0c590019ffbf4768232a941bea13bb9c981086a0 | [
"BSD-2-Clause"
] | 1 | 2020-04-13T18:04:03.000Z | 2020-04-13T18:04:03.000Z | from kapidox.depdiagram.generate import *
| 21 | 41 | 0.833333 | 5 | 42 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 42 | 1 | 42 | 42 | 0.921053 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
b1e14df40713cadc4f83c807f436356ebf38c448 | 14,998 | py | Python | test/core/derivatives/convolution_settings.py | jabader97/backpack | 089daafa0d611e13901fd7ecf8a0d708ce7a5928 | [
"MIT"
] | 395 | 2019-10-04T09:37:52.000Z | 2022-03-29T18:00:56.000Z | test/core/derivatives/convolution_settings.py | jabader97/backpack | 089daafa0d611e13901fd7ecf8a0d708ce7a5928 | [
"MIT"
] | 78 | 2019-10-11T18:56:43.000Z | 2022-03-23T01:49:54.000Z | test/core/derivatives/convolution_settings.py | jabader97/backpack | 089daafa0d611e13901fd7ecf8a0d708ce7a5928 | [
"MIT"
] | 50 | 2019-10-03T16:31:10.000Z | 2022-03-15T19:36:14.000Z | """Test configurations for `backpack.core.derivatives` for CONVOLUTIONal layers
Required entries:
"module_fn" (callable): Contains a model constructed from `torch.nn` layers
"input_fn" (callable): Used for specifying input function
Optional entries:
"target_fn" (callable): Fetches the groundtruth/target classes
of regression/classification task
"loss_function_fn" (callable): Loss function used in the model
"device" [list(torch.device)]: List of devices to run the test on.
"id_prefix" (str): Prefix to be included in the test name.
"seed" (int): seed for the random number for torch.rand
"""
import torch
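# Each entry below constructs its module and input lazily via lambdas, so every
# test case receives a freshly initialized module and a new random input tensor.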
CONVOLUTION_SETTINGS = []
###############################################################################
# examples #
###############################################################################
example = {
"module_fn": lambda: torch.nn.Conv2d(
in_channels=2,
out_channels=3,
kernel_size=2,
bias=False,
padding=1,
stride=2,
dilation=2,
),
"input_fn": lambda: torch.rand(size=(3, 2, 7, 7)),
"device": [torch.device("cpu")], # optional
"seed": 0, # optional
"id_prefix": "conv-example", # optional
}
CONVOLUTION_SETTINGS.append(example)
###############################################################################
# test settings #
###############################################################################
CONVOLUTION_SETTINGS += [
{
"module_fn": lambda: torch.nn.Conv1d(
in_channels=3,
out_channels=9,
kernel_size=2,
padding=1,
bias=False,
groups=3,
),
"input_fn": lambda: torch.rand(size=(3, 3, 7)),
},
{
"module_fn": lambda: torch.nn.Conv1d(
in_channels=6,
out_channels=9,
kernel_size=2,
padding=1,
bias=False,
groups=3,
),
"input_fn": lambda: torch.rand(size=(3, 6, 7)),
},
{
"module_fn": lambda: torch.nn.Conv1d(
in_channels=4,
out_channels=8,
kernel_size=2,
padding=1,
bias=False,
groups=4,
),
"input_fn": lambda: torch.rand(size=(3, 4, 7)),
},
{
"module_fn": lambda: torch.nn.Conv1d(
in_channels=2, out_channels=3, kernel_size=2, padding=1, bias=False
),
"input_fn": lambda: torch.rand(size=(3, 2, 7)),
},
{
"module_fn": lambda: torch.nn.Conv1d(
in_channels=3,
out_channels=6,
kernel_size=2,
padding=2,
padding_mode="zeros",
stride=4,
dilation=3,
),
"input_fn": lambda: torch.rand(size=(1, 3, 8)),
},
{
"module_fn": lambda: torch.nn.Conv1d(
in_channels=3,
out_channels=6,
kernel_size=2,
padding=2,
padding_mode="zeros",
stride=4,
dilation=3,
groups=3,
),
"input_fn": lambda: torch.rand(size=(1, 3, 8)),
},
{
"module_fn": lambda: torch.nn.Conv1d(
in_channels=2, out_channels=3, kernel_size=2, padding=1, groups=1
),
"input_fn": lambda: torch.rand(size=(3, 2, 11)),
},
{
"module_fn": lambda: torch.nn.Conv2d(
in_channels=2, out_channels=4, kernel_size=2, padding=1, groups=2
),
"input_fn": lambda: torch.rand(size=(3, 2, 7, 7)),
},
{
"module_fn": lambda: torch.nn.Conv2d(
in_channels=4, out_channels=4, kernel_size=2, padding=1, groups=2
),
"input_fn": lambda: torch.rand(size=(3, 4, 7, 7)),
},
{
"module_fn": lambda: torch.nn.Conv2d(
in_channels=6, out_channels=9, kernel_size=2, padding=1, groups=3
),
"input_fn": lambda: torch.rand(size=(3, 6, 7, 7)),
},
{
"module_fn": lambda: torch.nn.Conv2d(
in_channels=4, out_channels=12, kernel_size=2, padding=1, groups=4
),
"input_fn": lambda: torch.rand(size=(3, 4, 7, 7)),
},
{
"module_fn": lambda: torch.nn.Conv2d(
in_channels=2,
out_channels=4,
kernel_size=2,
padding=2,
padding_mode="zeros",
stride=4,
dilation=3,
groups=2,
),
"input_fn": lambda: torch.rand(size=(1, 2, 8, 8)),
},
{
"module_fn": lambda: torch.nn.Conv2d(
in_channels=3,
out_channels=6,
kernel_size=2,
padding=2,
padding_mode="zeros",
stride=4,
dilation=3,
),
"input_fn": lambda: torch.rand(size=(1, 3, 8, 8)),
},
{
"module_fn": lambda: torch.nn.Conv2d(
in_channels=2, out_channels=3, kernel_size=2, padding=1, groups=1
),
"input_fn": lambda: torch.rand(size=(3, 2, 7, 7)),
},
{
"module_fn": lambda: torch.nn.Conv3d(
in_channels=2,
out_channels=4,
kernel_size=2,
groups=2,
),
"input_fn": lambda: torch.rand(size=(3, 2, 5, 7, 7)),
},
{
"module_fn": lambda: torch.nn.Conv3d(
in_channels=3,
out_channels=6,
kernel_size=2,
groups=3,
),
"input_fn": lambda: torch.rand(size=(3, 3, 5, 7, 7)),
},
{
"module_fn": lambda: torch.nn.Conv3d(
in_channels=2,
out_channels=3,
kernel_size=2,
padding=2,
bias=False,
dilation=2,
stride=2,
),
"input_fn": lambda: torch.rand(size=(3, 2, 5, 7, 7)),
},
{
"module_fn": lambda: torch.nn.Conv3d(
in_channels=3,
out_channels=6,
kernel_size=2,
padding=2,
padding_mode="zeros",
stride=4,
),
"input_fn": lambda: torch.rand(size=(1, 3, 3, 4, 4)),
},
{
"module_fn": lambda: torch.nn.Conv3d(
in_channels=2, out_channels=3, kernel_size=2, padding=1, groups=1
),
"input_fn": lambda: torch.rand(size=(3, 2, 3, 7, 7)),
},
{
"module_fn": lambda: torch.nn.ConvTranspose1d(
in_channels=2, out_channels=3, kernel_size=2, padding=1, bias=False
),
"input_fn": lambda: torch.rand(size=(3, 2, 7)),
},
{
"module_fn": lambda: torch.nn.ConvTranspose1d(
in_channels=3,
out_channels=6,
kernel_size=2,
padding=2,
padding_mode="zeros",
stride=4,
dilation=3,
),
"input_fn": lambda: torch.rand(size=(1, 3, 8)),
},
{
"module_fn": lambda: torch.nn.ConvTranspose1d(
in_channels=2, out_channels=3, kernel_size=2, padding=1, groups=1
),
"input_fn": lambda: torch.rand(size=(3, 2, 11)),
},
{
"module_fn": lambda: torch.nn.ConvTranspose2d(
in_channels=2,
out_channels=3,
kernel_size=2,
bias=False,
padding=1,
stride=2,
dilation=2,
),
"input_fn": lambda: torch.rand(size=(3, 2, 7, 7)),
},
{
"module_fn": lambda: torch.nn.ConvTranspose2d(
in_channels=3,
out_channels=6,
kernel_size=2,
padding=2,
padding_mode="zeros",
stride=4,
dilation=3,
),
"input_fn": lambda: torch.rand(size=(1, 3, 8, 8)),
},
{
"module_fn": lambda: torch.nn.ConvTranspose2d(
in_channels=2, out_channels=3, kernel_size=2, padding=1, groups=1
),
"input_fn": lambda: torch.rand(size=(3, 2, 7, 7)),
},
{
"module_fn": lambda: torch.nn.ConvTranspose3d(
in_channels=2,
out_channels=3,
kernel_size=2,
padding=2,
bias=False,
dilation=2,
stride=2,
),
"input_fn": lambda: torch.rand(size=(3, 2, 5, 7, 7)),
},
{
"module_fn": lambda: torch.nn.ConvTranspose3d(
in_channels=3,
out_channels=6,
kernel_size=2,
padding=2,
padding_mode="zeros",
stride=4,
),
"input_fn": lambda: torch.rand(size=(1, 3, 3, 4, 4)),
},
{
"module_fn": lambda: torch.nn.ConvTranspose3d(
in_channels=2, out_channels=3, kernel_size=2, padding=1, groups=1
),
"input_fn": lambda: torch.rand(size=(3, 2, 3, 7, 7)),
},
]
# non-default hyperparameters
CONVOLUTION_SETTINGS += [
{
"module_fn": lambda: torch.nn.Conv1d(
in_channels=2,
out_channels=3,
kernel_size=2,
padding=0,
dilation=2,
groups=1,
),
"input_fn": lambda: torch.rand(size=(3, 2, 7)),
"id_prefix": "non-default-conv",
},
{
"module_fn": lambda: torch.nn.Conv2d(
in_channels=2,
out_channels=3,
kernel_size=2,
padding=0,
dilation=2,
groups=1,
),
"input_fn": lambda: torch.rand(size=(3, 2, 7, 7)),
"id_prefix": "non-default-conv",
},
{
"module_fn": lambda: torch.nn.Conv3d(
in_channels=2,
out_channels=3,
kernel_size=2,
padding=0,
dilation=2,
groups=1,
),
"input_fn": lambda: torch.rand(size=(3, 2, 3, 7, 7)),
"id_prefix": "non-default-conv",
},
{
"module_fn": lambda: torch.nn.ConvTranspose1d(
in_channels=2,
out_channels=3,
kernel_size=2,
padding=0,
stride=3,
dilation=5,
groups=1,
),
"input_fn": lambda: torch.rand(size=(3, 2, 20)),
"id_prefix": "non-default-conv",
},
{
"module_fn": lambda: torch.nn.ConvTranspose2d(
in_channels=2,
out_channels=4,
kernel_size=2,
padding=0,
dilation=2,
groups=1,
),
"input_fn": lambda: torch.rand(size=(3, 2, 9, 9)),
"id_prefix": "non-default-conv",
},
{
"module_fn": lambda: torch.nn.ConvTranspose3d(
in_channels=2,
out_channels=4,
kernel_size=2,
padding=0,
dilation=2,
groups=1,
),
"input_fn": lambda: torch.rand(size=(3, 2, 9, 9, 9)),
"id_prefix": "non-default-conv",
},
]
_CONVOLUTION_GROUP_SETTINGS = [
# groups - 2
{
"module_fn": lambda: torch.nn.Conv1d(
in_channels=4,
out_channels=6,
kernel_size=2,
padding=0,
dilation=2,
groups=2,
),
"input_fn": lambda: torch.rand(size=(3, 4, 7)),
"id_prefix": "groups-2",
},
{
"module_fn": lambda: torch.nn.Conv2d(
in_channels=4,
out_channels=6,
kernel_size=2,
padding=0,
dilation=2,
groups=2,
),
"input_fn": lambda: torch.rand(size=(3, 4, 7, 7)),
"id_prefix": "groups-2",
},
{
"module_fn": lambda: torch.nn.Conv3d(
in_channels=4,
out_channels=6,
kernel_size=2,
padding=0,
dilation=2,
groups=2,
),
"input_fn": lambda: torch.rand(size=(3, 4, 3, 7, 7)),
"id_prefix": "groups-2",
},
# groups - 3
{
"module_fn": lambda: torch.nn.Conv1d(
in_channels=6,
out_channels=9,
kernel_size=2,
padding=0,
dilation=2,
groups=3,
),
"input_fn": lambda: torch.rand(size=(3, 6, 7)),
"id_prefix": "groups-3",
},
{
"module_fn": lambda: torch.nn.Conv2d(
in_channels=6,
out_channels=9,
kernel_size=2,
padding=0,
dilation=2,
groups=3,
),
"input_fn": lambda: torch.rand(size=(3, 6, 7, 7)),
"id_prefix": "groups-3",
},
{
"module_fn": lambda: torch.nn.Conv3d(
in_channels=6,
out_channels=9,
kernel_size=2,
padding=0,
dilation=2,
groups=3,
),
"input_fn": lambda: torch.rand(size=(3, 6, 3, 7, 7)),
"id_prefix": "groups-3",
},
]
CONVOLUTION_SETTINGS += _CONVOLUTION_GROUP_SETTINGS
_CONVOLUTION_TRANSPOSED_GROUP_SETTINGS = [
{
"module_fn": lambda: torch.nn.ConvTranspose1d(
in_channels=6,
out_channels=9,
kernel_size=2,
padding=0,
dilation=2,
groups=3,
),
"input_fn": lambda: torch.rand(size=(3, 6, 7)),
"id_prefix": "groups-3",
},
{
"module_fn": lambda: torch.nn.ConvTranspose2d(
in_channels=6,
out_channels=9,
kernel_size=2,
padding=0,
dilation=2,
groups=3,
),
"input_fn": lambda: torch.rand(size=(3, 6, 7, 7)),
"id_prefix": "groups-3",
},
{
"module_fn": lambda: torch.nn.ConvTranspose3d(
in_channels=6,
out_channels=9,
kernel_size=2,
padding=0,
dilation=2,
groups=3,
),
"input_fn": lambda: torch.rand(size=(3, 6, 3, 7, 7)),
"id_prefix": "groups-3",
},
{
"module_fn": lambda: torch.nn.ConvTranspose1d(
in_channels=4,
out_channels=6,
kernel_size=2,
padding=0,
dilation=2,
groups=2,
),
"input_fn": lambda: torch.rand(size=(3, 4, 7)),
"id_prefix": "groups-2",
},
{
"module_fn": lambda: torch.nn.ConvTranspose2d(
in_channels=4,
out_channels=6,
kernel_size=2,
padding=0,
dilation=2,
groups=2,
),
"input_fn": lambda: torch.rand(size=(3, 4, 7, 7)),
"id_prefix": "groups-2",
},
{
"module_fn": lambda: torch.nn.ConvTranspose3d(
in_channels=4,
out_channels=6,
kernel_size=2,
padding=0,
dilation=2,
groups=2,
),
"input_fn": lambda: torch.rand(size=(3, 4, 3, 7, 7)),
"id_prefix": "groups-2",
},
]
CONVOLUTION_SETTINGS += _CONVOLUTION_TRANSPOSED_GROUP_SETTINGS
| 28.086142 | 79 | 0.469663 | 1,676 | 14,998 | 4.031026 | 0.058473 | 0.111308 | 0.180876 | 0.132179 | 0.891652 | 0.880403 | 0.876406 | 0.870782 | 0.842214 | 0.838218 | 0 | 0.054733 | 0.37385 | 14,998 | 533 | 80 | 28.138837 | 0.664679 | 0.058674 | 0 | 0.756 | 0 | 0 | 0.089003 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.002 | 0 | 0.002 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
3cc71bf2cd260c051376543fdee0f228a4e38841 | 10,738 | py | Python | lipsync/models.py | snehitvaddi/pseudo-visual-speech-denoising | 9247319ef967fa5c40902aeb7c04a117b1387f3a | [
"MIT"
] | 84 | 2020-12-22T10:50:06.000Z | 2022-03-22T06:17:06.000Z | lipsync/models.py | snehitvaddi/pseudo-visual-speech-denoising | 9247319ef967fa5c40902aeb7c04a117b1387f3a | [
"MIT"
] | 6 | 2020-12-23T03:49:47.000Z | 2022-03-30T11:41:26.000Z | lipsync/models.py | snehitvaddi/pseudo-visual-speech-denoising | 9247319ef967fa5c40902aeb7c04a117b1387f3a | [
"MIT"
] | 17 | 2020-12-23T15:52:01.000Z | 2022-02-07T11:58:02.000Z | import torch
from torch import nn
from torch.nn import functional as F
class Conv2d(nn.Module):
def __init__(self, cin, cout, kernel_size, stride, padding, residual=False, *args, **kwargs):
super().__init__(*args, **kwargs)
self.conv_block = nn.Sequential(
nn.Conv2d(cin, cout, kernel_size, stride, padding),
nn.BatchNorm2d(cout)
)
self.act = nn.ReLU()
self.residual = residual
def forward(self, x):
out = self.conv_block(x)
if self.residual:
out += x
return self.act(out)
class Conv2dTranspose(nn.Module):
def __init__(self, cin, cout, kernel_size, stride, padding, output_padding=0, *args, **kwargs):
super().__init__(*args, **kwargs)
self.conv_block = nn.Sequential(
nn.ConvTranspose2d(cin, cout, kernel_size, stride, padding, output_padding),
nn.BatchNorm2d(cout)
)
self.act = nn.ReLU()
def forward(self, x):
out = self.conv_block(x)
return self.act(out)
class Wav2Lip_Teacher(nn.Module):
def __init__(self):
super(Wav2Lip_Teacher, self).__init__()
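        # Face encoder: consumes 6-channel 96x96 inputs (presumably two RGB face
        # crops concatenated channel-wise, e.g. a masked target frame plus a
        # reference frame); each block's output is kept for skip connections.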
self.face_encoder_blocks = nn.ModuleList([
nn.Sequential(Conv2d(6, 16, kernel_size=7, stride=1, padding=3)), # 96,96
nn.Sequential(Conv2d(16, 32, kernel_size=3, stride=2, padding=1), # 48,48
Conv2d(32, 32, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(32, 32, kernel_size=3, stride=1, padding=1, residual=True)),
nn.Sequential(Conv2d(32, 64, kernel_size=3, stride=2, padding=1), # 24,24
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True)),
nn.Sequential(Conv2d(64, 128, kernel_size=3, stride=2, padding=1), # 12,12
Conv2d(128, 128, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(128, 128, kernel_size=3, stride=1, padding=1, residual=True)),
nn.Sequential(Conv2d(128, 256, kernel_size=3, stride=2, padding=1), # 6,6
Conv2d(256, 256, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(256, 256, kernel_size=3, stride=1, padding=1, residual=True)),
nn.Sequential(Conv2d(256, 512, kernel_size=3, stride=2, padding=1), # 3,3
Conv2d(512, 512, kernel_size=3, stride=1, padding=1, residual=True),),
nn.Sequential(Conv2d(512, 512, kernel_size=3, stride=1, padding=0), # 1, 1
Conv2d(512, 512, kernel_size=1, stride=1, padding=0)),])
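        # Audio encoder: reduces a (1, 80, 16) mel-spectrogram chunk to a single
        # 512-dimensional embedding of shape (B, 512, 1, 1).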
self.audio_encoder = nn.Sequential(
Conv2d(1, 32, kernel_size=3, stride=1, padding=1),
Conv2d(32, 32, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(32, 32, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(32, 64, kernel_size=3, stride=(3, 1), padding=1),
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(64, 128, kernel_size=3, stride=3, padding=1),
Conv2d(128, 128, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(128, 128, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(128, 256, kernel_size=3, stride=(3, 2), padding=1),
Conv2d(256, 256, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(256, 512, kernel_size=3, stride=1, padding=0),
Conv2d(512, 512, kernel_size=1, stride=1, padding=0),)
self.face_decoder_blocks = nn.ModuleList([
nn.Sequential(Conv2d(512, 512, kernel_size=1, stride=1, padding=0),),
nn.Sequential(Conv2dTranspose(1024, 512, kernel_size=3, stride=1, padding=0), # 3,3
Conv2d(512, 512, kernel_size=3, stride=1, padding=1, residual=True),),
nn.Sequential(Conv2dTranspose(1024, 512, kernel_size=3, stride=2, padding=1, output_padding=1),
Conv2d(512, 512, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(512, 512, kernel_size=3, stride=1, padding=1, residual=True),), # 6, 6
nn.Sequential(Conv2dTranspose(768, 384, kernel_size=3, stride=2, padding=1, output_padding=1),
Conv2d(384, 384, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(384, 384, kernel_size=3, stride=1, padding=1, residual=True),), # 12, 12
nn.Sequential(Conv2dTranspose(512, 256, kernel_size=3, stride=2, padding=1, output_padding=1),
Conv2d(256, 256, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(256, 256, kernel_size=3, stride=1, padding=1, residual=True),), # 24, 24
nn.Sequential(Conv2dTranspose(320, 128, kernel_size=3, stride=2, padding=1, output_padding=1),
Conv2d(128, 128, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(128, 128, kernel_size=3, stride=1, padding=1, residual=True),), # 48, 48
nn.Sequential(Conv2dTranspose(160, 64, kernel_size=3, stride=2, padding=1, output_padding=1),
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True),),]) # 96,96
self.output_block = nn.Sequential(Conv2d(80, 32, kernel_size=3, stride=1, padding=1),
nn.Conv2d(32, 3, kernel_size=1, stride=1, padding=0),
nn.Sigmoid())
def forward(self, audio_sequences, face_sequences):
# audio_sequences = (B, T, 1, 80, 16)
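        # A 5-D input is a window of T frames; time is folded into the batch
        # dimension below, and the outputs are re-stacked to (B, C, T, H, W)
        # before returning.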
B = audio_sequences.size(0)
input_dim_size = len(face_sequences.size())
if input_dim_size > 4:
audio_sequences = torch.cat([audio_sequences[:, i] for i in range(audio_sequences.size(1))], dim=0)
face_sequences = torch.cat([face_sequences[:, :, i] for i in range(face_sequences.size(2))], dim=0)
audio_embedding = self.audio_encoder(audio_sequences) # B, 512, 1, 1
feats = []
x = face_sequences
for f in self.face_encoder_blocks:
x = f(x)
feats.append(x)
x = audio_embedding
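        # U-Net style decoding: after each decoder block, the matching face
        # encoder feature map (taken from the end of `feats`) is concatenated
        # as a skip connection before the next block runs.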
for f in self.face_decoder_blocks:
x = f(x)
try:
x = torch.cat((x, feats[-1]), dim=1)
except Exception as e:
print(x.size())
print(feats[-1].size())
raise e
feats.pop()
x = self.output_block(x)
if input_dim_size > 4:
x = torch.split(x, B, dim=0) # [(B, C, H, W)]
outputs = torch.stack(x, dim=2) # (B, C, T, H, W)
else:
outputs = x
return outputs
class Lipsync_Student(nn.Module):
def __init__(self):
super(Lipsync_Student, self).__init__()
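        # Unlike the teacher, the student has no face encoder: frames are
        # generated from the audio embedding alone, so the decoder below has no
        # skip connections.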
self.audio_encoder = nn.Sequential(
Conv2d(1, 64, kernel_size=3, stride=1, padding=1),
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(64, 128, kernel_size=3, stride=(3, 1), padding=1),
Conv2d(128, 128, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(128, 128, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(128, 256, kernel_size=3, stride=3, padding=1),
Conv2d(256, 256, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(256, 256, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(256, 512, kernel_size=3, stride=(3, 2), padding=1),
Conv2d(512, 512, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(512, 512, kernel_size=3, stride=1, padding=0),
Conv2d(512, 512, kernel_size=1, stride=1, padding=0),)
self.face_decoder_blocks = nn.ModuleList([
nn.Sequential(Conv2d(512, 512, kernel_size=1, stride=1, padding=0),),
nn.Sequential(Conv2dTranspose(512, 512, kernel_size=3, stride=1, padding=0), # 3,3
Conv2d(512, 512, kernel_size=3, stride=1, padding=1, residual=True),),
nn.Sequential(Conv2dTranspose(512, 512, kernel_size=3, stride=2, padding=1, output_padding=1),
Conv2d(512, 512, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(512, 512, kernel_size=3, stride=1, padding=1, residual=True),), # 6, 6
nn.Sequential(Conv2dTranspose(512, 256, kernel_size=3, stride=2, padding=1, output_padding=1),
Conv2d(256, 256, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(256, 256, kernel_size=3, stride=1, padding=1, residual=True),), # 12, 12
nn.Sequential(Conv2dTranspose(256, 128, kernel_size=3, stride=2, padding=1, output_padding=1),
Conv2d(128, 128, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(128, 128, kernel_size=3, stride=1, padding=1, residual=True),), # 24, 24
nn.Sequential(Conv2dTranspose(128, 64, kernel_size=3, stride=2, padding=1, output_padding=1),
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(64, 64, kernel_size=3, stride=1, padding=1, residual=True),), # 48, 48
nn.Sequential(Conv2dTranspose(64, 32, kernel_size=3, stride=(1, 2), padding=1, output_padding=(0, 1)),
Conv2d(32, 32, kernel_size=3, stride=1, padding=1, residual=True),
Conv2d(32, 32, kernel_size=3, stride=1, padding=1, residual=True),),]) # 48,96
self.output_block = nn.Sequential(Conv2d(32, 16, kernel_size=3, stride=1, padding=1),
nn.Conv2d(16, 3, kernel_size=1, stride=1, padding=0),
nn.Sigmoid())
def forward(self, audio_sequences):
# audio_sequences = (B, T, 1, 80, 16)
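        # As in the teacher, a 5-D input folds the T frames into the batch
        # dimension; outputs are re-stacked to (B, C, T, H, W) below.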
B = audio_sequences.size(0)
input_dim_size = len(audio_sequences.size())
if input_dim_size > 4:
audio_sequences = torch.cat([audio_sequences[:, i] for i in range(audio_sequences.size(1))], dim=0)
x = self.audio_encoder(audio_sequences) # B, 512, 1, 1
for f in self.face_decoder_blocks:
x = f(x)
x = self.output_block(x)
if input_dim_size > 4:
x = torch.split(x, B, dim=0) # [(B, C, H, W)]
outputs = torch.stack(x, dim=2) # (B, C, T, H, W)
else:
outputs = x
return outputs | 46.284483 | 114 | 0.594804 | 1,520 | 10,738 | 4.073684 | 0.066447 | 0.142119 | 0.135013 | 0.208656 | 0.909076 | 0.902455 | 0.871447 | 0.843346 | 0.797158 | 0.758398 | 0 | 0.113903 | 0.262526 | 10,738 | 232 | 115 | 46.284483 | 0.668014 | 0.024958 | 0 | 0.532544 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047337 | false | 0 | 0.017751 | 0 | 0.112426 | 0.011834 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3ceeab27d751f4613ec1a9c665737c2eabbcf1a6 | 2,188 | py | Python | tests/conic_sections/utils/circle/test_get_circle_formula.py | EderVs/Voronoi-Diagrams | 6e69f9b6eb516dee12d66f187cf267a7b527da5f | [
"MIT"
] | 3 | 2021-11-12T17:43:08.000Z | 2022-01-03T02:47:34.000Z | tests/conic_sections/utils/circle/test_get_circle_formula.py | EderVs/Voronoi-Diagrams | 6e69f9b6eb516dee12d66f187cf267a7b527da5f | [
"MIT"
] | 3 | 2021-11-19T20:12:31.000Z | 2021-11-19T20:14:39.000Z | tests/conic_sections/utils/circle/test_get_circle_formula.py | EderVs/Voronoi-Diagrams | 6e69f9b6eb516dee12d66f187cf267a7b527da5f | [
"MIT"
] | null | null | null | """Test get circle formula."""
# Conic Sections
from conic_sections.utils.circle import get_circle_formula_x, get_circle_formula_y
# Math
from decimal import Decimal
class TestGetCircleXFormula:
"""Test get circle formula x."""
def test_get_circle_formula_inside(self):
"""Test get circle x formula with y inside the range."""
h = Decimal(0)
k = Decimal(0)
r = Decimal(2)
y = Decimal(0)
points = get_circle_formula_x(h, k, r, y)
assert points is not None
assert points == (Decimal("2"), Decimal("-2"),)
y = Decimal(1)
points = get_circle_formula_x(h, k, r, y)
assert points is not None
assert points == (
Decimal("1.732050807568877293527446342"),
Decimal("-1.732050807568877293527446342"),
)
def test_get_circle_formula_frontier(self):
"""Test get circle x formula with y inside the range."""
h = Decimal(0)
k = Decimal(0)
r = Decimal(2)
y = Decimal(2)
points = get_circle_formula_x(h, k, r, y)
assert points is not None
assert points == (Decimal("0"), Decimal("0"),)
class TestGetCircleYFormula:
"""Test get circle formula y."""
def test_get_circle_formula_inside(self):
"""Test get circle y formula with x inside the range."""
h = Decimal(0)
k = Decimal(0)
r = Decimal(2)
x = Decimal(0)
points = get_circle_formula_y(h, k, r, x)
assert points is not None
assert points == (Decimal("2"), Decimal("-2"),)
x = Decimal(1)
points = get_circle_formula_y(h, k, r, x)
assert points is not None
assert points == (
Decimal("1.732050807568877293527446342"),
Decimal("-1.732050807568877293527446342"),
)
def test_get_circle_formula_frontier(self):
"""Test get circle x formula with y inside the range."""
h = Decimal(0)
k = Decimal(0)
r = Decimal(2)
x = Decimal(2)
points = get_circle_formula_y(h, k, r, x)
assert points is not None
assert points == (Decimal("0"), Decimal("0"),)
| 31.257143 | 82 | 0.587751 | 285 | 2,188 | 4.368421 | 0.122807 | 0.137349 | 0.192771 | 0.11245 | 0.808835 | 0.808835 | 0.792771 | 0.792771 | 0.792771 | 0.792771 | 0 | 0.089844 | 0.297989 | 2,188 | 69 | 83 | 31.710145 | 0.720703 | 0.138483 | 0 | 0.76 | 0 | 0 | 0.069264 | 0.063853 | 0 | 0 | 0 | 0 | 0.24 | 1 | 0.08 | false | 0 | 0.04 | 0 | 0.16 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
a72dfa6d22ad98344f9139a5f3c84f87ec8b2445 | 23,732 | py | Python | sdk/python/pulumi_aws/ec2/vpc_ipam.py | chivandikwa/pulumi-aws | 19c08bf9dcb90544450ffa4eec7bf6751058fde2 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_aws/ec2/vpc_ipam.py | chivandikwa/pulumi-aws | 19c08bf9dcb90544450ffa4eec7bf6751058fde2 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_aws/ec2/vpc_ipam.py | chivandikwa/pulumi-aws | 19c08bf9dcb90544450ffa4eec7bf6751058fde2 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['VpcIpamArgs', 'VpcIpam']
@pulumi.input_type
class VpcIpamArgs:
def __init__(__self__, *,
operating_regions: pulumi.Input[Sequence[pulumi.Input['VpcIpamOperatingRegionArgs']]],
description: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a VpcIpam resource.
:param pulumi.Input[Sequence[pulumi.Input['VpcIpamOperatingRegionArgs']]] operating_regions: Determines which locales can be chosen when you create pools. Locale is the Region where you want to make an IPAM pool available for allocations. You can only create pools with locales that match the operating Regions of the IPAM. You can only create VPCs from a pool whose locale matches the VPC's Region. You specify a region using the region_name parameter. You **must** set your provider block region as an operating_region.
:param pulumi.Input[str] description: A description for the IPAM.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A map of tags to assign to the resource. If configured with a provider `default_tags` configuration block present, tags with matching keys will overwrite those defined at the provider-level.
"""
pulumi.set(__self__, "operating_regions", operating_regions)
if description is not None:
pulumi.set(__self__, "description", description)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="operatingRegions")
def operating_regions(self) -> pulumi.Input[Sequence[pulumi.Input['VpcIpamOperatingRegionArgs']]]:
"""
Determines which locales can be chosen when you create pools. Locale is the Region where you want to make an IPAM pool available for allocations. You can only create pools with locales that match the operating Regions of the IPAM. You can only create VPCs from a pool whose locale matches the VPC's Region. You specify a region using the region_name parameter. You **must** set your provider block region as an operating_region.
"""
return pulumi.get(self, "operating_regions")
@operating_regions.setter
def operating_regions(self, value: pulumi.Input[Sequence[pulumi.Input['VpcIpamOperatingRegionArgs']]]):
pulumi.set(self, "operating_regions", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
A description for the IPAM.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A map of tags to assign to the resource. If configured with a provider `default_tags` configuration block present, tags with matching keys will overwrite those defined at the provider-level.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _VpcIpamState:
def __init__(__self__, *,
arn: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
operating_regions: Optional[pulumi.Input[Sequence[pulumi.Input['VpcIpamOperatingRegionArgs']]]] = None,
private_default_scope_id: Optional[pulumi.Input[str]] = None,
public_default_scope_id: Optional[pulumi.Input[str]] = None,
scope_count: Optional[pulumi.Input[int]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
tags_all: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
Input properties used for looking up and filtering VpcIpam resources.
:param pulumi.Input[str] arn: Amazon Resource Name (ARN) of IPAM
:param pulumi.Input[str] description: A description for the IPAM.
:param pulumi.Input[Sequence[pulumi.Input['VpcIpamOperatingRegionArgs']]] operating_regions: Determines which locales can be chosen when you create pools. Locale is the Region where you want to make an IPAM pool available for allocations. You can only create pools with locales that match the operating Regions of the IPAM. You can only create VPCs from a pool whose locale matches the VPC's Region. You specify a region using the region_name parameter. You **must** set your provider block region as an operating_region.
:param pulumi.Input[str] private_default_scope_id: The ID of the IPAM's private scope. A scope is a top-level container in IPAM. Each scope represents an IP-independent network. Scopes enable you to represent networks where you have overlapping IP space. When you create an IPAM, IPAM automatically creates two scopes: public and private. The private scope is intended for private IP space. The public scope is intended for all internet-routable IP space.
:param pulumi.Input[str] public_default_scope_id: The ID of the IPAM's public scope. A scope is a top-level container in IPAM. Each scope represents an IP-independent network. Scopes enable you to represent networks where you have overlapping IP space. When you create an IPAM, IPAM automatically creates two scopes: public and private. The private scope is intended for private
IP space. The public scope is intended for all internet-routable IP space.
:param pulumi.Input[int] scope_count: The number of scopes in the IPAM.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A map of tags to assign to the resource. If configured with a provider `default_tags` configuration block present, tags with matching keys will overwrite those defined at the provider-level.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags_all: A map of tags assigned to the resource, including those inherited from the provider `default_tags` configuration block.
"""
if arn is not None:
pulumi.set(__self__, "arn", arn)
if description is not None:
pulumi.set(__self__, "description", description)
if operating_regions is not None:
pulumi.set(__self__, "operating_regions", operating_regions)
if private_default_scope_id is not None:
pulumi.set(__self__, "private_default_scope_id", private_default_scope_id)
if public_default_scope_id is not None:
pulumi.set(__self__, "public_default_scope_id", public_default_scope_id)
if scope_count is not None:
pulumi.set(__self__, "scope_count", scope_count)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if tags_all is not None:
pulumi.set(__self__, "tags_all", tags_all)
@property
@pulumi.getter
def arn(self) -> Optional[pulumi.Input[str]]:
"""
Amazon Resource Name (ARN) of IPAM
"""
return pulumi.get(self, "arn")
@arn.setter
def arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "arn", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
A description for the IPAM.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="operatingRegions")
def operating_regions(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['VpcIpamOperatingRegionArgs']]]]:
"""
Determines which locales can be chosen when you create pools. Locale is the Region where you want to make an IPAM pool available for allocations. You can only create pools with locales that match the operating Regions of the IPAM. You can only create VPCs from a pool whose locale matches the VPC's Region. You specify a region using the region_name parameter. You **must** set your provider block region as an operating_region.
"""
return pulumi.get(self, "operating_regions")
@operating_regions.setter
def operating_regions(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['VpcIpamOperatingRegionArgs']]]]):
pulumi.set(self, "operating_regions", value)
@property
@pulumi.getter(name="privateDefaultScopeId")
def private_default_scope_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the IPAM's private scope. A scope is a top-level container in IPAM. Each scope represents an IP-independent network. Scopes enable you to represent networks where you have overlapping IP space. When you create an IPAM, IPAM automatically creates two scopes: public and private. The private scope is intended for private IP space. The public scope is intended for all internet-routable IP space.
"""
return pulumi.get(self, "private_default_scope_id")
@private_default_scope_id.setter
def private_default_scope_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "private_default_scope_id", value)
@property
@pulumi.getter(name="publicDefaultScopeId")
def public_default_scope_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the IPAM's public scope. A scope is a top-level container in IPAM. Each scope represents an IP-independent network. Scopes enable you to represent networks where you have overlapping IP space. When you create an IPAM, IPAM automatically creates two scopes: public and private. The private scope is intended for private
IP space. The public scope is intended for all internet-routable IP space.
"""
return pulumi.get(self, "public_default_scope_id")
@public_default_scope_id.setter
def public_default_scope_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_default_scope_id", value)
@property
@pulumi.getter(name="scopeCount")
def scope_count(self) -> Optional[pulumi.Input[int]]:
"""
The number of scopes in the IPAM.
"""
return pulumi.get(self, "scope_count")
@scope_count.setter
def scope_count(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "scope_count", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A map of tags to assign to the resource. If configured with a provider `default_tags` configuration block present, tags with matching keys will overwrite those defined at the provider-level.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="tagsAll")
def tags_all(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A map of tags assigned to the resource, including those inherited from the provider `default_tags` configuration block.
"""
return pulumi.get(self, "tags_all")
@tags_all.setter
def tags_all(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags_all", value)
class VpcIpam(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
description: Optional[pulumi.Input[str]] = None,
operating_regions: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['VpcIpamOperatingRegionArgs']]]]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
"""
        Provides an IPAM resource.
## Import
IPAMs can be imported using the `ipam id`, e.g.
```sh
$ pulumi import aws:ec2/vpcIpam:VpcIpam example ipam-0178368ad2146a492
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] description: A description for the IPAM.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['VpcIpamOperatingRegionArgs']]]] operating_regions: Determines which locales can be chosen when you create pools. Locale is the Region where you want to make an IPAM pool available for allocations. You can only create pools with locales that match the operating Regions of the IPAM. You can only create VPCs from a pool whose locale matches the VPC's Region. You specify a region using the region_name parameter. You **must** set your provider block region as an operating_region.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A map of tags to assign to the resource. If configured with a provider `default_tags` configuration block present, tags with matching keys will overwrite those defined at the provider-level.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: VpcIpamArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
        Provides an IPAM resource.
## Import
IPAMs can be imported using the `ipam id`, e.g.
```sh
$ pulumi import aws:ec2/vpcIpam:VpcIpam example ipam-0178368ad2146a492
```
:param str resource_name: The name of the resource.
:param VpcIpamArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(VpcIpamArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
description: Optional[pulumi.Input[str]] = None,
operating_regions: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['VpcIpamOperatingRegionArgs']]]]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = VpcIpamArgs.__new__(VpcIpamArgs)
__props__.__dict__["description"] = description
if operating_regions is None and not opts.urn:
raise TypeError("Missing required property 'operating_regions'")
__props__.__dict__["operating_regions"] = operating_regions
__props__.__dict__["tags"] = tags
__props__.__dict__["arn"] = None
__props__.__dict__["private_default_scope_id"] = None
__props__.__dict__["public_default_scope_id"] = None
__props__.__dict__["scope_count"] = None
__props__.__dict__["tags_all"] = None
super(VpcIpam, __self__).__init__(
'aws:ec2/vpcIpam:VpcIpam',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
arn: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
operating_regions: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['VpcIpamOperatingRegionArgs']]]]] = None,
private_default_scope_id: Optional[pulumi.Input[str]] = None,
public_default_scope_id: Optional[pulumi.Input[str]] = None,
scope_count: Optional[pulumi.Input[int]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
tags_all: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None) -> 'VpcIpam':
"""
Get an existing VpcIpam resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] arn: Amazon Resource Name (ARN) of IPAM
:param pulumi.Input[str] description: A description for the IPAM.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['VpcIpamOperatingRegionArgs']]]] operating_regions: Determines which locales can be chosen when you create pools. Locale is the Region where you want to make an IPAM pool available for allocations. You can only create pools with locales that match the operating Regions of the IPAM. You can only create VPCs from a pool whose locale matches the VPC's Region. You specify a region using the region_name parameter. You **must** set your provider block region as an operating_region.
:param pulumi.Input[str] private_default_scope_id: The ID of the IPAM's private scope. A scope is a top-level container in IPAM. Each scope represents an IP-independent network. Scopes enable you to represent networks where you have overlapping IP space. When you create an IPAM, IPAM automatically creates two scopes: public and private. The private scope is intended for private IP space. The public scope is intended for all internet-routable IP space.
:param pulumi.Input[str] public_default_scope_id: The ID of the IPAM's public scope. A scope is a top-level container in IPAM. Each scope represents an IP-independent network. Scopes enable you to represent networks where you have overlapping IP space. When you create an IPAM, IPAM automatically creates two scopes: public and private. The private scope is intended for private
IP space. The public scope is intended for all internet-routable IP space.
:param pulumi.Input[int] scope_count: The number of scopes in the IPAM.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A map of tags to assign to the resource. If configured with a provider `default_tags` configuration block present, tags with matching keys will overwrite those defined at the provider-level.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags_all: A map of tags assigned to the resource, including those inherited from the provider `default_tags` configuration block.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _VpcIpamState.__new__(_VpcIpamState)
__props__.__dict__["arn"] = arn
__props__.__dict__["description"] = description
__props__.__dict__["operating_regions"] = operating_regions
__props__.__dict__["private_default_scope_id"] = private_default_scope_id
__props__.__dict__["public_default_scope_id"] = public_default_scope_id
__props__.__dict__["scope_count"] = scope_count
__props__.__dict__["tags"] = tags
__props__.__dict__["tags_all"] = tags_all
return VpcIpam(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def arn(self) -> pulumi.Output[str]:
"""
Amazon Resource Name (ARN) of IPAM
"""
return pulumi.get(self, "arn")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
A description for the IPAM.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter(name="operatingRegions")
def operating_regions(self) -> pulumi.Output[Sequence['outputs.VpcIpamOperatingRegion']]:
"""
Determines which locales can be chosen when you create pools. Locale is the Region where you want to make an IPAM pool available for allocations. You can only create pools with locales that match the operating Regions of the IPAM. You can only create VPCs from a pool whose locale matches the VPC's Region. You specify a region using the region_name parameter. You **must** set your provider block region as an operating_region.
"""
return pulumi.get(self, "operating_regions")
@property
@pulumi.getter(name="privateDefaultScopeId")
def private_default_scope_id(self) -> pulumi.Output[str]:
"""
The ID of the IPAM's private scope. A scope is a top-level container in IPAM. Each scope represents an IP-independent network. Scopes enable you to represent networks where you have overlapping IP space. When you create an IPAM, IPAM automatically creates two scopes: public and private. The private scope is intended for private IP space. The public scope is intended for all internet-routable IP space.
"""
return pulumi.get(self, "private_default_scope_id")
@property
@pulumi.getter(name="publicDefaultScopeId")
def public_default_scope_id(self) -> pulumi.Output[str]:
"""
The ID of the IPAM's public scope. A scope is a top-level container in IPAM. Each scope represents an IP-independent network. Scopes enable you to represent networks where you have overlapping IP space. When you create an IPAM, IPAM automatically creates two scopes: public and private. The private scope is intended for private
IP space. The public scope is intended for all internet-routable IP space.
"""
return pulumi.get(self, "public_default_scope_id")
@property
@pulumi.getter(name="scopeCount")
def scope_count(self) -> pulumi.Output[int]:
"""
The number of scopes in the IPAM.
"""
return pulumi.get(self, "scope_count")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
A map of tags to assign to the resource. If configured with a provider `default_tags` configuration block present, tags with matching keys will overwrite those defined at the provider-level.
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter(name="tagsAll")
def tags_all(self) -> pulumi.Output[Mapping[str, str]]:
"""
A map of tags assigned to the resource, including those inherited from the provider `default_tags` configuration block.
"""
return pulumi.get(self, "tags_all")
| 58.597531 | 547 | 0.694252 | 3,107 | 23,732 | 5.138397 | 0.070808 | 0.072346 | 0.0456 | 0.028938 | 0.876041 | 0.858503 | 0.826182 | 0.816536 | 0.793799 | 0.76436 | 0 | 0.001727 | 0.219324 | 23,732 | 404 | 548 | 58.742574 | 0.859988 | 0.467596 | 0 | 0.564655 | 1 | 0 | 0.117757 | 0.052138 | 0 | 0 | 0 | 0 | 0 | 1 | 0.159483 | false | 0.00431 | 0.030172 | 0 | 0.288793 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
596535754261018b3fe97941d11447c11388b739 | 896 | py | Python | models/twitter.py | leirbag95/dyor | fd054a5e00b809ed17788c8073bb177a07f42b0e | [
"MIT"
] | null | null | null | models/twitter.py | leirbag95/dyor | fd054a5e00b809ed17788c8073bb177a07f42b0e | [
"MIT"
] | null | null | null | models/twitter.py | leirbag95/dyor | fd054a5e00b809ed17788c8073bb177a07f42b0e | [
"MIT"
] | null | null | null | from lightdb import LightDB
from datetime import datetime as dt

class Queue:
    def __init__(self, path):
        self.db = LightDB(path)

    def update(self, name, twitter_id):
        self.db.set(name, twitter_id)

    def get(self, name):
        return self.db.get(name)

    def is_exist(self, name):
        return self.get(name) is not None

    def get_all(self):
        return dict(self.db)

    def delete(self, name):
        self.db.pop(name)


class Account:
    def __init__(self, path):
        self.db = LightDB(path)

    def update(self, name):
        self.db.set(name, str(dt.now()))

    def get(self, name):
        return self.db.get(name)

    def is_exist(self, name):
        return self.get(name) is not None

    def get_all(self):
        return dict(self.db)

    def delete(self, name):
        self.db.pop(name) | 21.333333 | 41 | 0.571429 | 127 | 896 | 3.92126 | 0.244094 | 0.120482 | 0.11245 | 0.084337 | 0.742972 | 0.742972 | 0.742972 | 0.742972 | 0.742972 | 0.742972 | 0 | 0 | 0.316964 | 896 | 42 | 42 | 21.333333 | 0.813725 | 0 | 0 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0.071429 | 0.214286 | 0.785714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 9
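A minimal usage sketch for the Queue and Account wrappers above. Only the methods defined in the file are used; the import path, the database file paths, and the example name/id values are assumptions for illustration.

from models.twitter import Queue, Account  # import path assumed from the repo layout

queue = Queue('queue.db')             # hypothetical LightDB file path
queue.update('some_account', 12345)   # store a name -> twitter_id mapping
if queue.is_exist('some_account'):
    print(queue.get('some_account'))  # -> 12345
print(queue.get_all())                # the whole mapping as a dict
queue.delete('some_account')

accounts = Account('accounts.db')     # hypothetical LightDB file path
accounts.update('some_account')       # records the current timestamp for the name
print(accounts.get('some_account'))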
597be821eb35e561dd05ce39f779c7c987011ec9 | 7,385 | py | Python | mozillians/dino_park/tests/test_views.py | divyamoncy/mozillians | d53d1d05d1f05b74f8533541e37083dcb89b29a8 | [
"BSD-3-Clause"
] | 202 | 2015-01-14T10:19:55.000Z | 2021-12-11T06:04:16.000Z | mozillians/dino_park/tests/test_views.py | divyamoncy/mozillians | d53d1d05d1f05b74f8533541e37083dcb89b29a8 | [
"BSD-3-Clause"
] | 2,924 | 2015-01-07T11:27:32.000Z | 2021-01-19T14:05:17.000Z | mozillians/dino_park/tests/test_views.py | divyamoncy/mozillians | d53d1d05d1f05b74f8533541e37083dcb89b29a8 | [
"BSD-3-Clause"
] | 270 | 2015-01-02T18:31:01.000Z | 2021-02-17T20:57:44.000Z | import mock
from django.test import RequestFactory
from django.test.utils import override_settings
from mozillians.common.tests import TestCase
from mozillians.dino_park import views
from mozillians.users.tests import UserFactory
class TestAPIEndpoints(TestCase):
def setUp(self):
self.factory = RequestFactory()
@mock.patch('mozillians.dino_park.views.requests.get')
@mock.patch('mozillians.dino_park.views.UserAccessLevel')
@override_settings(DINO_PARK_ORGCHART_SVC='orgchart-svc')
def test_orgchart_allowed_as_staff(self, mock_scope, mock_get):
response = mock.Mock()
response.json.return_value = {'foo': 'bar'}
# Mock UserAccessLevel class and attributes
mock_scope.get_privacy.return_value = 'staff'
mock_scope.STAFF = 'staff'
mock_get.return_value = response
request = self.factory.get('/')
request.user = UserFactory.create()
resp = views.orgchart(request)
mock_get.assert_called_with('http://orgchart-svc/orgchart')
self.assertEqual(resp.content, '{"foo": "bar"}')
@mock.patch('mozillians.dino_park.views.requests.get')
@mock.patch('mozillians.dino_park.views.UserAccessLevel')
@override_settings(DINO_PARK_ORGCHART_SVC='orgchart-svc')
def test_orgchart_allowed_as_private(self, mock_scope, mock_get):
response = mock.Mock()
response.json.return_value = {'foo': 'bar'}
# Mock UserAccessLevel class and attributes
mock_scope.get_privacy.return_value = 'private'
mock_scope.PRIVATE = 'private'
mock_get.return_value = response
request = self.factory.get('/')
request.user = UserFactory.create()
resp = views.orgchart(request)
mock_get.assert_called_with('http://orgchart-svc/orgchart')
self.assertEqual(resp.content, '{"foo": "bar"}')
@mock.patch('mozillians.dino_park.views.UserAccessLevel')
@override_settings(DINO_PARK_ORGCHART_SVC='orgchart-svc')
def test_orgchart_forbidden(self, mock_scope):
mock_scope.get_privacy.return_value = 'dummy'
request = self.factory.get('/')
request.user = UserFactory.create()
resp = views.orgchart(request)
self.assertEqual(resp.status_code, 403)
@mock.patch('mozillians.dino_park.views.requests.get')
@mock.patch('mozillians.dino_park.views.UserAccessLevel')
@override_settings(DINO_PARK_ORGCHART_SVC='orgchart-svc')
def test_orgchart_get_related_by_username(self, mock_scope, mock_get):
mock_scope.get_privacy.return_value = 'staff'
mock_scope.STAFF = 'staff'
response = mock.Mock()
response.json.return_value = {'foo': 'bar'}
mock_get.return_value = response
request = self.factory.get('/')
request.user = UserFactory.create()
resp = views.orgchart_get_by_username(request, 'related', 'asdf')
mock_get.assert_called_with('http://orgchart-svc/orgchart/related/asdf')
self.assertEqual(resp.content, '{"foo": "bar"}')
@mock.patch('mozillians.dino_park.views.requests.get')
@mock.patch('mozillians.dino_park.views.UserAccessLevel')
@override_settings(DINO_PARK_ORGCHART_SVC='orgchart-svc')
def test_orgchart_get_trace_by_username(self, mock_scope, mock_get):
mock_scope.get_privacy.return_value = 'staff'
mock_scope.STAFF = 'staff'
response = mock.Mock()
response.json.return_value = {'foo': 'bar'}
mock_get.return_value = response
request = self.factory.get('/')
request.user = UserFactory.create()
resp = views.orgchart_get_by_username(request, 'trace', 'asdf')
mock_get.assert_called_with('http://orgchart-svc/orgchart/trace/asdf')
self.assertEqual(resp.content, '{"foo": "bar"}')
@mock.patch('mozillians.dino_park.views.requests.get')
@mock.patch('mozillians.dino_park.views.UserAccessLevel')
@override_settings(DINO_PARK_ORGCHART_SVC='orgchart-svc')
def test_orgchart_get_trace_by_username_no_staff_profile(self, mock_scope, mock_get):
user_non_staff = UserFactory.create()
mock_scope.get_privacy.return_value = 'staff'
mock_scope.STAFF = 'staff'
response = mock.Mock()
response.json.return_value = {'foo': 'bar'}
mock_get.return_value = response
request = self.factory.get('/')
request.user = UserFactory.create()
resp = views.orgchart_get_by_username(request, 'trace', user_non_staff.username)
mock_get.assert_not_called()
self.assertEqual(resp.content, 'null')
@mock.patch('mozillians.dino_park.views.requests.get')
@mock.patch('mozillians.dino_park.views.UserAccessLevel')
@override_settings(DINO_PARK_ORGCHART_SVC='orgchart-svc')
def test_orgchart_get_trace_by_username_staff_profile(self, mock_scope, mock_get):
user_staff = UserFactory.create()
user_staff.userprofile.is_staff = True
user_staff.userprofile.save()
mock_scope.get_privacy.return_value = 'staff'
mock_scope.STAFF = 'staff'
response = mock.Mock()
response.json.return_value = {'foo': 'bar'}
mock_get.return_value = response
request = self.factory.get('/')
request.user = UserFactory.create()
resp = views.orgchart_get_by_username(request, 'trace', user_staff.username)
mock_get.assert_called_with(
'http://orgchart-svc/orgchart/trace/{0}'.format(user_staff.username))
self.assertEqual(resp.content, '{"foo": "bar"}')
@mock.patch('mozillians.dino_park.views.UserAccessLevel')
@override_settings(DINO_PARK_ORGCHART_SVC='orgchart-svc')
def test_orgchart_related_forbidden(self, mock_scope):
mock_scope.get_privacy.return_value = 'dummy'
request = self.factory.get('/')
request.user = UserFactory.create()
resp = views.orgchart_get_by_username(request, 'related', 'abc')
self.assertEqual(resp.status_code, 403)
@mock.patch('mozillians.dino_park.views.requests.get')
@mock.patch('mozillians.dino_park.views.UserAccessLevel')
@override_settings(DINO_PARK_SEARCH_SVC='search-svc')
def test_search_simple(self, mock_scope, mock_get):
mock_scope.get_privacy.return_value = 'dummy'
response = mock.Mock()
response.json.return_value = {'foo': 'bar'}
mock_get.return_value = response
request = self.factory.get('/', {'q': 'asdf'})
request.user = UserFactory.create()
resp = views.search_simple(request)
mock_get.assert_called_with('http://search-svc/search/simple/dummy?q=asdf')
self.assertEqual(resp.content, '{"foo": "bar"}')
@mock.patch('mozillians.dino_park.views.requests.get')
@mock.patch('mozillians.dino_park.views.UserAccessLevel')
@override_settings(DINO_PARK_SEARCH_SVC='search-svc')
def test_search_get_profile(self, mock_scope, mock_get):
mock_scope.get_privacy.return_value = 'dummy'
response = mock.Mock()
response.json.return_value = {'foo': 'bar'}
mock_get.return_value = response
request = self.factory.get('/')
request.user = UserFactory.create()
resp = views.search_get_profile(request, 'asdf')
mock_get.assert_called_with('http://search-svc/search/get/dummy/asdf')
self.assertEqual(resp.content, '{"foo": "bar"}')
| 45.869565 | 89 | 0.692485 | 899 | 7,385 | 5.427141 | 0.087875 | 0.047551 | 0.070096 | 0.084853 | 0.885837 | 0.877844 | 0.876409 | 0.856733 | 0.841976 | 0.824759 | 0 | 0.001156 | 0.180095 | 7,385 | 160 | 90 | 46.15625 | 0.804624 | 0.011239 | 0 | 0.707143 | 0 | 0 | 0.191533 | 0.100288 | 0 | 0 | 0 | 0 | 0.128571 | 1 | 0.078571 | false | 0 | 0.042857 | 0 | 0.128571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
59c222ef2b9a82498cda253c9b2b528a110e4052 | 5,370 | py | Python | test/utils/metadata_validation_test.py | imperial-genomics-facility/data-management-python | 7b867d8d4562a49173d0b823bdc4bf374a3688f0 | [
"Apache-2.0"
] | 7 | 2018-05-08T07:28:08.000Z | 2022-02-21T14:56:49.000Z | test/utils/metadata_validation_test.py | imperial-genomics-facility/data-management-python | 7b867d8d4562a49173d0b823bdc4bf374a3688f0 | [
"Apache-2.0"
] | 15 | 2021-08-19T12:32:20.000Z | 2022-02-09T19:52:51.000Z | test/utils/metadata_validation_test.py | imperial-genomics-facility/data-management-python | 7b867d8d4562a49173d0b823bdc4bf374a3688f0 | [
"Apache-2.0"
] | 2 | 2017-05-12T15:20:10.000Z | 2020-05-07T16:25:11.000Z | import os, unittest
import pandas as pd
from igf_data.utils.validation_check.metadata_validation import Validate_project_and_samplesheet_metadata
class Validate_project_and_samplesheet_metadata_test1(unittest.TestCase):
def setUp(self):
self.metadata_file='data/metadata_validation/metadata_file.csv'
self.samplesheet_file='data/metadata_validation/SampleSheet.csv'
self.samplesheet_schema='data/validation_schema/samplesheet_validation.json'
self.metadata_schema='data/validation_schema/metadata_validation.json'
def test_get_samplesheet_validation_report(self):
va = \
Validate_project_and_samplesheet_metadata(
samplesheet_file=self.samplesheet_file,
metadata_files=self.metadata_file,
samplesheet_schema=self.samplesheet_schema,
metadata_schema=self.metadata_schema)
errors = va.get_samplesheet_validation_report()
err_lines = [
val
for err_line in errors
for key,val in err_line.items()
if key=='line']
self.assertTrue(22 in err_lines)
for err_line in errors:
error = err_line.get('error')
column_name = err_line.get('column')
if 'line' in err_line and \
err_line.get('line')==22:
if column_name=='index':
self.assertTrue(' TAAGGCGA' in error)
elif column_name=='Sample_Name':
self.assertTrue('KDSC_76' in error)
def test_get_metadata_validation_report(self):
va = \
Validate_project_and_samplesheet_metadata(
samplesheet_file=self.samplesheet_file,
metadata_files=[self.metadata_file],
samplesheet_schema=self.samplesheet_schema,
metadata_schema=self.metadata_schema)
errors = va.get_metadata_validation_report()
for err_line in errors:
error = err_line.get('error')
column_name = err_line.get('column')
if column_name=='sample_submitter_id' and \
error.startswith('\'KDSC_77'):
self.assertTrue('KDSC_77' in error)
class Validate_project_and_samplesheet_metadata_test2(unittest.TestCase):
def setUp(self):
self.metadata_file = 'data/metadata_validation/metadata_file2.csv'
self.samplesheet_file = 'data/metadata_validation/SampleSheet2.csv'
self.samplesheet_schema = 'data/validation_schema/samplesheet_validation.json'
self.metadata_schema = 'data/validation_schema/metadata_validation.json'
def test_get_samplesheet_validation_report2(self):
va = \
Validate_project_and_samplesheet_metadata(
samplesheet_file=self.samplesheet_file,
metadata_files=self.metadata_file,
samplesheet_schema=self.samplesheet_schema,
metadata_schema=self.metadata_schema)
errors = va.get_samplesheet_validation_report()
err_lines = [
val
for err_line in errors
for key,val in err_line.items()
if key=='line']
self.assertTrue(22 in err_lines)
for err_line in errors:
error = err_line.get('error')
column_name = err_line.get('column')
if 'line' in err_line and \
err_line.get('line')==22:
if column_name=='index':
self.assertTrue(' TAAGGCGA' in error)
elif column_name=='Sample_Name':
self.assertTrue('KDSC_76' in error)
def test_get_metadata_validation_report2(self):
va = \
Validate_project_and_samplesheet_metadata(
samplesheet_file=self.samplesheet_file,
metadata_files=[self.metadata_file],
samplesheet_schema=self.samplesheet_schema,
metadata_schema=self.metadata_schema)
errors = va.get_metadata_validation_report()
for err_line in errors:
error = err_line.get('error')
column_name = err_line.get('column')
if column_name=='sample_submitter_id' and \
error.startswith('\'KDSC_77'):
self.assertTrue('KDSC_77' in error)
def test_check_metadata_library_by_row(self):
data = \
pd.Series(
dict(
sample_igf_id='SampleA',
library_source='GENOMIC',
library_strategy='WGS',
experiment_type='WGS'))
err = \
Validate_project_and_samplesheet_metadata.\
check_metadata_library_by_row(data)
self.assertIsNone(err)
data = \
pd.Series(
dict(
library_source='GENOMIC',
library_strategy='WGS',
experiment_type='WGS'))
err = \
Validate_project_and_samplesheet_metadata.\
check_metadata_library_by_row(data)
self.assertEqual(err,'Sample igf id not found')
data = \
pd.Series(
dict(
sample_igf_id='SampleA',
library_source='GENOMIC',
library_strategy='WGSA',
experiment_type='WGS'))
err = \
Validate_project_and_samplesheet_metadata.\
check_metadata_library_by_row(data)
self.assertTrue(err.startswith('SampleA'))
data = \
pd.Series(
dict(
sample_igf_id='SampleA',
library_source='GENOMIC',
library_strategy='WGS',
experiment_type='WGSA'))
err = \
Validate_project_and_samplesheet_metadata.\
check_metadata_library_by_row(data)
self.assertTrue(err.startswith('SampleA'))
if __name__=='__main__':
unittest.main() | 37.816901 | 106 | 0.672626 | 625 | 5,370 | 5.4208 | 0.1264 | 0.041322 | 0.058442 | 0.094156 | 0.930933 | 0.912633 | 0.887839 | 0.861865 | 0.861865 | 0.861865 | 0 | 0.006341 | 0.236499 | 5,370 | 142 | 107 | 37.816901 | 0.82 | 0 | 0 | 0.851852 | 0 | 0 | 0.127342 | 0.068834 | 0 | 0 | 0 | 0 | 0.088889 | 1 | 0.051852 | false | 0 | 0.022222 | 0 | 0.088889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
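For reference, a condensed sketch of how the validator exercised by these tests is driven outside of unittest; the CSV/JSON paths are the same example fixtures used in setUp and stand in for real files.

from igf_data.utils.validation_check.metadata_validation import Validate_project_and_samplesheet_metadata

va = Validate_project_and_samplesheet_metadata(
    samplesheet_file='data/metadata_validation/SampleSheet.csv',
    metadata_files=['data/metadata_validation/metadata_file.csv'],
    samplesheet_schema='data/validation_schema/samplesheet_validation.json',
    metadata_schema='data/validation_schema/metadata_validation.json')

# both reports are lists of dicts with keys such as 'line', 'column' and 'error'
for err in va.get_samplesheet_validation_report():
    print(err.get('line'), err.get('column'), err.get('error'))
for err in va.get_metadata_validation_report():
    print(err.get('column'), err.get('error'))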
ab7dcd453c255ab3738741c88a000dcfcda31d7d | 251 | py | Python | src/onepasswordconnectsdk/__init__.py | moyue83/connect-sdk-python | c809bfa0f9e520467323a572a48d14bd76f8871a | [
"MIT"
] | 93 | 2021-04-13T15:11:41.000Z | 2022-03-19T21:28:44.000Z | src/onepasswordconnectsdk/__init__.py | moyue83/connect-sdk-python | c809bfa0f9e520467323a572a48d14bd76f8871a | [
"MIT"
] | 20 | 2021-04-14T08:27:31.000Z | 2022-03-30T06:32:08.000Z | src/onepasswordconnectsdk/__init__.py | moyue83/connect-sdk-python | c809bfa0f9e520467323a572a48d14bd76f8871a | [
"MIT"
] | 14 | 2021-04-13T13:57:36.000Z | 2022-03-30T07:23:19.000Z | # coding: utf-8
# flake8: noqa
from onepasswordconnectsdk.config import load
from onepasswordconnectsdk.config import load_dict
from onepasswordconnectsdk.client import new_client
from onepasswordconnectsdk.client import new_client_from_environment
| 27.888889 | 68 | 0.868526 | 30 | 251 | 7.1 | 0.466667 | 0.469484 | 0.29108 | 0.347418 | 0.835681 | 0.450704 | 0.450704 | 0 | 0 | 0 | 0 | 0.008811 | 0.095618 | 251 | 8 | 69 | 31.375 | 0.929515 | 0.103586 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 9 |
abe8da9e0974128d3e69eba6c0fd736119856a4f | 7,559 | py | Python | tests/push/test_status_tracker.py | cfogg/python-client | 40e6891c8240e6b2acd5df538e622e9f15de43d6 | [
"Apache-2.0"
] | 13 | 2017-03-17T15:15:20.000Z | 2022-03-14T22:24:10.000Z | tests/push/test_status_tracker.py | cfogg/python-client | 40e6891c8240e6b2acd5df538e622e9f15de43d6 | [
"Apache-2.0"
] | 81 | 2017-01-12T23:06:48.000Z | 2022-02-21T18:20:23.000Z | tests/push/test_status_tracker.py | cfogg/python-client | 40e6891c8240e6b2acd5df538e622e9f15de43d6 | [
"Apache-2.0"
] | 14 | 2017-05-25T10:49:13.000Z | 2021-12-27T16:39:20.000Z | """SSE Status tracker unit tests."""
#pylint:disable=protected-access,no-self-use,line-too-long
from splitio.push.status_tracker import PushStatusTracker, Status
from splitio.push.parser import ControlType, AblyError, OccupancyMessage, ControlMessage
class StatusTrackerTests(object):
"""Parser tests."""
def test_initial_status_and_reset(self):
"""Test the initial status is ok and reset() works as expected."""
tracker = PushStatusTracker()
assert tracker._occupancy_ok()
assert tracker._last_control_message == ControlType.STREAMING_ENABLED
assert tracker._last_status_propagated == Status.PUSH_SUBSYSTEM_UP
assert not tracker._shutdown_expected
tracker._last_control_message = ControlType.STREAMING_PAUSED
tracker._publishers['control_pri'] = 0
tracker._publishers['control_sec'] = 1
tracker._last_status_propagated = Status.PUSH_NONRETRYABLE_ERROR
tracker.reset()
assert tracker._occupancy_ok()
assert tracker._last_control_message == ControlType.STREAMING_ENABLED
assert tracker._last_status_propagated == Status.PUSH_SUBSYSTEM_UP
assert not tracker._shutdown_expected
def test_handling_occupancy(self):
"""Test handling occupancy works properly."""
tracker = PushStatusTracker()
assert tracker._occupancy_ok()
message = OccupancyMessage('[?occupancy=metrics.publishers]control_sec', 123, 0)
assert tracker.handle_occupancy(message) is None
# old message
message = OccupancyMessage('[?occupancy=metrics.publishers]control_pri', 122, 0)
assert tracker.handle_occupancy(message) is None
message = OccupancyMessage('[?occupancy=metrics.publishers]control_pri', 124, 0)
assert tracker.handle_occupancy(message) is Status.PUSH_SUBSYSTEM_DOWN
message = OccupancyMessage('[?occupancy=metrics.publishers]control_pri', 125, 1)
assert tracker.handle_occupancy(message) is Status.PUSH_SUBSYSTEM_UP
message = OccupancyMessage('[?occupancy=metrics.publishers]control_sec', 125, 2)
assert tracker.handle_occupancy(message) is None
def test_handling_control(self):
"""Test handling incoming control messages."""
tracker = PushStatusTracker()
assert tracker._last_control_message == ControlType.STREAMING_ENABLED
assert tracker._last_status_propagated == Status.PUSH_SUBSYSTEM_UP
message = ControlMessage('control_pri', 123, ControlType.STREAMING_ENABLED)
assert tracker.handle_control_message(message) is None
# old message
message = ControlMessage('control_pri', 122, ControlType.STREAMING_PAUSED)
assert tracker.handle_control_message(message) is None
message = ControlMessage('control_pri', 124, ControlType.STREAMING_PAUSED)
assert tracker.handle_control_message(message) is Status.PUSH_SUBSYSTEM_DOWN
message = ControlMessage('control_pri', 125, ControlType.STREAMING_ENABLED)
assert tracker.handle_control_message(message) is Status.PUSH_SUBSYSTEM_UP
message = ControlMessage('control_pri', 126, ControlType.STREAMING_DISABLED)
assert tracker.handle_control_message(message) is Status.PUSH_NONRETRYABLE_ERROR
# test that disabling works as well with streaming paused
tracker = PushStatusTracker()
assert tracker._last_control_message == ControlType.STREAMING_ENABLED
assert tracker._last_status_propagated == Status.PUSH_SUBSYSTEM_UP
message = ControlMessage('control_pri', 124, ControlType.STREAMING_PAUSED)
assert tracker.handle_control_message(message) is Status.PUSH_SUBSYSTEM_DOWN
message = ControlMessage('control_pri', 126, ControlType.STREAMING_DISABLED)
assert tracker.handle_control_message(message) is Status.PUSH_NONRETRYABLE_ERROR
def test_control_occupancy_overlap(self):
"""Test control and occupancy messages together."""
tracker = PushStatusTracker()
assert tracker._last_control_message == ControlType.STREAMING_ENABLED
assert tracker._last_status_propagated == Status.PUSH_SUBSYSTEM_UP
message = ControlMessage('control_pri', 122, ControlType.STREAMING_PAUSED)
assert tracker.handle_control_message(message) is Status.PUSH_SUBSYSTEM_DOWN
message = OccupancyMessage('[?occupancy=metrics.publishers]control_sec', 123, 0)
assert tracker.handle_occupancy(message) is None
message = OccupancyMessage('[?occupancy=metrics.publishers]control_pri', 124, 0)
assert tracker.handle_occupancy(message) is None
message = ControlMessage('control_pri', 125, ControlType.STREAMING_ENABLED)
assert tracker.handle_control_message(message) is None
message = OccupancyMessage('[?occupancy=metrics.publishers]control_pri', 126, 1)
assert tracker.handle_occupancy(message) is Status.PUSH_SUBSYSTEM_UP
def test_ably_error(self):
"""Test the status tracker reacts appropriately to an ably error."""
tracker = PushStatusTracker()
assert tracker._last_control_message == ControlType.STREAMING_ENABLED
assert tracker._last_status_propagated == Status.PUSH_SUBSYSTEM_UP
message = AblyError(39999, 100, 'some message', 'http://somewhere')
assert tracker.handle_ably_error(message) is None
message = AblyError(50000, 100, 'some message', 'http://somewhere')
assert tracker.handle_ably_error(message) is None
tracker.reset()
message = AblyError(40140, 100, 'some message', 'http://somewhere')
assert tracker.handle_ably_error(message) is Status.PUSH_RETRYABLE_ERROR
tracker.reset()
message = AblyError(40149, 100, 'some message', 'http://somewhere')
assert tracker.handle_ably_error(message) is Status.PUSH_RETRYABLE_ERROR
tracker.reset()
message = AblyError(40150, 100, 'some message', 'http://somewhere')
assert tracker.handle_ably_error(message) is Status.PUSH_NONRETRYABLE_ERROR
tracker.reset()
message = AblyError(40139, 100, 'some message', 'http://somewhere')
assert tracker.handle_ably_error(message) is Status.PUSH_NONRETRYABLE_ERROR
def test_disconnect_expected(self):
"""Test that no error is propagated when a disconnect is expected."""
tracker = PushStatusTracker()
assert tracker._last_control_message == ControlType.STREAMING_ENABLED
assert tracker._last_status_propagated == Status.PUSH_SUBSYSTEM_UP
tracker.notify_sse_shutdown_expected()
assert tracker.handle_ably_error(AblyError(40139, 100, 'some message', 'http://somewhere')) is None
assert tracker.handle_ably_error(AblyError(40149, 100, 'some message', 'http://somewhere')) is None
assert tracker.handle_ably_error(AblyError(39999, 100, 'some message', 'http://somewhere')) is None
assert tracker.handle_control_message(ControlMessage('control_pri', 123, ControlType.STREAMING_ENABLED)) is None
assert tracker.handle_control_message(ControlMessage('control_pri', 124, ControlType.STREAMING_PAUSED)) is None
assert tracker.handle_control_message(ControlMessage('control_pri', 125, ControlType.STREAMING_DISABLED)) is None
assert tracker.handle_occupancy(OccupancyMessage('[?occupancy=metrics.publishers]control_sec', 123, 0)) is None
assert tracker.handle_occupancy(OccupancyMessage('[?occupancy=metrics.publishers]control_sec', 124, 1)) is None
| 51.074324 | 121 | 0.73687 | 845 | 7,559 | 6.334911 | 0.119527 | 0.11657 | 0.110032 | 0.046142 | 0.836353 | 0.832057 | 0.786662 | 0.741453 | 0.726135 | 0.72464 | 0 | 0.024096 | 0.176478 | 7,559 | 147 | 122 | 51.421769 | 0.835823 | 0.065749 | 0 | 0.646465 | 0 | 0 | 0.117697 | 0.059846 | 0 | 0 | 0 | 0 | 0.505051 | 1 | 0.060606 | false | 0 | 0.020202 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
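The tests above define the tracker's public surface; the sketch below strings the same calls together outside of a test case. The channel names, timestamps, and error codes mirror values used in the tests.

from splitio.push.status_tracker import PushStatusTracker, Status
from splitio.push.parser import AblyError, ControlMessage, ControlType, OccupancyMessage

tracker = PushStatusTracker()
# a STREAMING_PAUSED control message takes the push subsystem down ...
assert tracker.handle_control_message(
    ControlMessage('control_pri', 122, ControlType.STREAMING_PAUSED)) is Status.PUSH_SUBSYSTEM_DOWN
# ... and a newer STREAMING_ENABLED message brings it back up
assert tracker.handle_control_message(
    ControlMessage('control_pri', 123, ControlType.STREAMING_ENABLED)) is Status.PUSH_SUBSYSTEM_UP
# occupancy updates only propagate a status change when publishers drop to / return from zero
tracker.handle_occupancy(OccupancyMessage('[?occupancy=metrics.publishers]control_pri', 124, 0))
# once a shutdown has been announced, ably errors are no longer propagated
tracker.notify_sse_shutdown_expected()
assert tracker.handle_ably_error(AblyError(40139, 100, 'some message', 'http://somewhere')) is None
tracker.reset()  # back to the initial PUSH_SUBSYSTEM_UP state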
e63c6c43d2522e1c7d3a15b9825e793ba95c2298 | 92 | py | Python | web2py/parameters_8001.py | seadsystem/website | d7e8110c012d94c494895fde9951eef728266798 | [
"MIT"
] | 5 | 2016-03-20T07:07:17.000Z | 2018-04-12T16:34:12.000Z | web2py/parameters_8001.py | grant523/website | 1b5351e638a15a1c0c68e55f9162d1b8e62974bb | [
"MIT"
] | 22 | 2015-11-04T23:11:41.000Z | 2016-05-19T00:08:54.000Z | web2py/parameters_8001.py | grant523/website | 1b5351e638a15a1c0c68e55f9162d1b8e62974bb | [
"MIT"
] | 4 | 2015-10-29T01:46:44.000Z | 2018-04-09T22:05:28.000Z | password="pbkdf2(1000,20,sha512)$b6a8b74fe1d3e5c7$ebc55f586425e9cdf0e49fe3969e77f4d48c0422"
| 46 | 91 | 0.891304 | 7 | 92 | 11.714286 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.472527 | 0.01087 | 92 | 1 | 92 | 92 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0.869565 | 0.869565 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
e666319dc3f78b1b51da9ce3390f469f21744313 | 25,112 | py | Python | src/common/models/inception/model.py | pfnet-research/chainer-stylegan | 9bb2f5ac9d68958e594d03662ca791f403a13574 | [
"MIT"
] | 84 | 2019-02-28T12:57:37.000Z | 2021-12-05T16:54:36.000Z | src/common/models/inception/model.py | pfnet-research/chainer-stylegan | 9bb2f5ac9d68958e594d03662ca791f403a13574 | [
"MIT"
] | 6 | 2019-03-02T17:43:29.000Z | 2019-11-18T07:56:20.000Z | src/common/models/inception/model.py | pfnet-research/chainer-stylegan | 9bb2f5ac9d68958e594d03662ca791f403a13574 | [
"MIT"
] | 23 | 2019-03-01T17:59:19.000Z | 2021-08-12T18:08:36.000Z | #Modified from https://github.com/hvy/chainer-inception-score
import math
import chainer
from chainer import Chain
from chainer import functions as F
from chainer import links as L
from chainer import Variable
from chainer.functions.activation.relu import ReLU
from chainer.functions.pooling.average_pooling_2d import AveragePooling2D
from chainer.functions.pooling.max_pooling_2d import MaxPooling2D
class Mixed(Chain):
def __init__(self, trunk):
super().__init__()
for name, link in trunk:
self.add_link(name, link)
self.trunk = trunk
def __call__(self, x):
hs = []
#print(type(x))
for name, f in self.trunk:
if not name.startswith('_'):
if 'bn' in name:
h = getattr(self, name)(x)
else:
h = getattr(self, name)(x)
else:
h = f.apply((x,))[0]
hs.append(h)
return F.concat(hs)
class Tower(Chain):
def __init__(self, trunk):
super().__init__()
for name, link in trunk:
if not name.startswith('_'):
self.add_link(name, link)
self.trunk = trunk
def __call__(self, x):
h = x
for name, f in self.trunk:
if not name.startswith('_'): # Link
if 'bn' in name:
h = getattr(self, name)(h)
else:
h = getattr(self, name)(h)
else: # AveragePooling2D, MaxPooling2D or ReLU
h = f.apply((h,))[0]
return h
class Inception(Chain):
def __init__(self):
super().__init__(
conv=L.Convolution2D(3, 32, 3, stride=2, pad=0),
conv_1=L.Convolution2D(32, 32, 3, stride=1, pad=0),
conv_2=L.Convolution2D(32, 64, 3, stride=1, pad=1),
conv_3=L.Convolution2D(64, 80, 1, stride=1, pad=0),
conv_4=L.Convolution2D(80, 192, 3, stride=1, pad=0),
bn_conv=L.BatchNormalization(32),
bn_conv_1=L.BatchNormalization(32),
bn_conv_2=L.BatchNormalization(64),
bn_conv_3=L.BatchNormalization(80),
bn_conv_4=L.BatchNormalization(192),
mixed=Mixed([
('conv', Tower([
('conv', L.Convolution2D(192, 64, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(64)),
('_relu', ReLU())
])),
('tower', Tower([
('conv', L.Convolution2D(192, 48, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(48)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(48, 64, 5, stride=1, pad=2)),
('bn_conv_1', L.BatchNormalization(64)),
('_relu_1', ReLU())
])),
('tower_1', Tower([
('conv', L.Convolution2D(192, 64, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(64)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(64, 96, 3, stride=1, pad=1)),
('bn_conv_1', L.BatchNormalization(96)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(96, 96, 3, stride=1, pad=1)),
('bn_conv_2', L.BatchNormalization(96)),
('_relu_2', ReLU())
])),
('tower_2', Tower([
('_pooling', AveragePooling2D(3,1,pad=1)),
('conv', L.Convolution2D(192, 32, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(32)),
('_relu', ReLU())
]))
]),
mixed_1=Mixed([
('conv', Tower([
('conv', L.Convolution2D(256, 64, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(64)),
('_relu', ReLU())
])),
('tower', Tower([
('conv', L.Convolution2D(256, 48, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(48)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(48, 64, 5, stride=1, pad=2)),
('bn_conv_1', L.BatchNormalization(64)),
('_relu_1', ReLU())
])),
('tower_1', Tower([
('conv', L.Convolution2D(256, 64, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(64)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(64, 96, 3, stride=1, pad=1)),
('bn_conv_1', L.BatchNormalization(96)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(96, 96, 3, stride=1, pad=1)),
('bn_conv_2', L.BatchNormalization(96)),
('_relu_2', ReLU())
])),
('tower_2', Tower([
('_pooling', AveragePooling2D(3,1,pad=1)),
('conv', L.Convolution2D(256, 64, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(64)),
('_relu', ReLU())
]))
]),
mixed_2=Mixed([
('conv', Tower([
('conv', L.Convolution2D(288, 64, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(64)),
('_relu', ReLU())
])),
('tower', Tower([
('conv', L.Convolution2D(288, 48, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(48)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(48, 64, 5, stride=1, pad=2)),
('bn_conv_1', L.BatchNormalization(64)),
('_relu_1', ReLU())
])),
('tower_1', Tower([
('conv', L.Convolution2D(288, 64, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(64)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(64, 96, 3, stride=1, pad=1)),
('bn_conv_1', L.BatchNormalization(96)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(96, 96, 3, stride=1, pad=1)),
('bn_conv_2', L.BatchNormalization(96)),
('_relu_2', ReLU())
])),
('tower_2', Tower([
('_pooling', AveragePooling2D(3,1,pad=1)),
('conv', L.Convolution2D(288, 64, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(64)),
('_relu', ReLU())
]))
]),
mixed_3=Mixed([
('conv', Tower([
('conv', L.Convolution2D(288, 384, 3, stride=2, pad=0)),
('bn_conv', L.BatchNormalization(384)),
('_relu', ReLU())
])),
('tower', Tower([
('conv', L.Convolution2D(288, 64, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(64)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(64, 96, 3, stride=1, pad=1)),
('bn_conv_1', L.BatchNormalization(96)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(96, 96, 3, stride=2, pad=0)),
('bn_conv_2', L.BatchNormalization(96)),
('_relu_2', ReLU())
])),
('pool', Tower([
('_pooling', MaxPooling2D(3, 2, pad=0))
]))
]),
mixed_4=Mixed([
('conv', Tower([
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU())
])),
('tower', Tower([
('conv', L.Convolution2D(768, 128, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(128)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(128, 128, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_1', L.BatchNormalization(128)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(128, 192, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_2', L.BatchNormalization(192)),
('_relu_2', ReLU())
])),
('tower_1', Tower([
('conv', L.Convolution2D(768, 128, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(128)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(128, 128, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_1', L.BatchNormalization(128)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(128, 128, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_2', L.BatchNormalization(128)),
('_relu_2', ReLU()),
('conv_3', L.Convolution2D(128, 128, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_3', L.BatchNormalization(128)),
('_relu_3', ReLU()),
('conv_4', L.Convolution2D(128, 192, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_4', L.BatchNormalization(192)),
('_relu_4', ReLU())
])),
('tower_2', Tower([
('_pooling', AveragePooling2D(3,1,pad=1)),
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU())
]))
]),
mixed_5=Mixed([
('conv', Tower([
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU())
])),
('tower', Tower([
('conv', L.Convolution2D(768, 160, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(160)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(160, 160, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_1', L.BatchNormalization(160)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(160, 192, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_2', L.BatchNormalization(192)),
('_relu_2', ReLU())
])),
('tower_1', Tower([
('conv', L.Convolution2D(768, 160, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(160)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(160, 160, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_1', L.BatchNormalization(160)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(160, 160, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_2', L.BatchNormalization(160)),
('_relu_2', ReLU()),
('conv_3', L.Convolution2D(160, 160, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_3', L.BatchNormalization(160)),
('_relu_3', ReLU()),
('conv_4', L.Convolution2D(160, 192, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_4', L.BatchNormalization(192)),
('_relu_4', ReLU())
])),
('tower_2', Tower([
('_pooling', AveragePooling2D(3,1,pad=1)),
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU())
]))
]),
mixed_6=Mixed([
('conv', Tower([
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU())
])),
('tower', Tower([
('conv', L.Convolution2D(768, 160, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(160)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(160, 160, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_1', L.BatchNormalization(160)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(160, 192, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_2', L.BatchNormalization(192)),
('_relu_2', ReLU())
])),
('tower_1', Tower([
('conv', L.Convolution2D(768, 160, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(160)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(160, 160, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_1', L.BatchNormalization(160)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(160, 160, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_2', L.BatchNormalization(160)),
('_relu_2', ReLU()),
('conv_3', L.Convolution2D(160, 160, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_3', L.BatchNormalization(160)),
('_relu_3', ReLU()),
('conv_4', L.Convolution2D(160, 192, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_4', L.BatchNormalization(192)),
('_relu_4', ReLU())
])),
('tower_2', Tower([
('_pooling', AveragePooling2D(3,1,pad=1)),
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU())
]))
]),
mixed_7=Mixed([
('conv', Tower([
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU())
])),
('tower', Tower([
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(192, 192, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_1', L.BatchNormalization(192)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(192, 192, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_2', L.BatchNormalization(192)),
('_relu_2', ReLU())
])),
('tower_1', Tower([
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(192, 192, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_1', L.BatchNormalization(192)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(192, 192, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_2', L.BatchNormalization(192)),
('_relu_2', ReLU()),
('conv_3', L.Convolution2D(192, 192, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_3', L.BatchNormalization(192)),
('_relu_3', ReLU()),
('conv_4', L.Convolution2D(192, 192, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_4', L.BatchNormalization(192)),
('_relu_4', ReLU())
])),
('tower_2', Tower([
('_pooling', AveragePooling2D(3,1,pad=1)),
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU())
]))
]),
mixed_8=Mixed([
('tower', Tower([
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(192, 320, 3, stride=2, pad=0)),
('bn_conv_1', L.BatchNormalization(320)),
('_relu_1', ReLU())
])),
('tower_1', Tower([
('conv', L.Convolution2D(768, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(192, 192, (1, 7), stride=1, pad=(0, 3))),
('bn_conv_1', L.BatchNormalization(192)),
('_relu_1', ReLU()),
('conv_2', L.Convolution2D(192, 192, (7, 1), stride=1, pad=(3, 0))),
('bn_conv_2', L.BatchNormalization(192)),
('_relu_2', ReLU()),
('conv_3', L.Convolution2D(192, 192, 3, stride=2, pad=0)),
('bn_conv_3', L.BatchNormalization(192)),
('_relu_3', ReLU())
])),
('pool', Tower([
('_pooling', MaxPooling2D(3, 2, pad=0))
]))
]),
mixed_9=Mixed([
('conv', Tower([
('conv', L.Convolution2D(1280, 320, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(320)),
('_relu', ReLU()),
])),
('tower', Tower([
('conv', L.Convolution2D(1280, 384, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(384)),
('_relu', ReLU()),
('mixed', Mixed([
('conv', Tower([
('conv', L.Convolution2D(384, 384, (1, 3), stride=1, pad=(0, 1))),
('bn_conv', L.BatchNormalization(384)),
('_relu', ReLU()),
])),
('conv_1', Tower([
('conv_1', L.Convolution2D(384, 384, (3, 1), stride=1, pad=(1, 0))),
('bn_conv_1', L.BatchNormalization(384)),
('_relu_1', ReLU()),
]))
]))
])),
('tower_1', Tower([
('conv', L.Convolution2D(1280, 448, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(448)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(448, 384, 3, stride=1, pad=1)),
('bn_conv_1', L.BatchNormalization(384)),
('_relu_1', ReLU()),
('mixed', Mixed([
('conv', Tower([
('conv', L.Convolution2D(384, 384, (1, 3), stride=1, pad=(0, 1))),
('bn_conv', L.BatchNormalization(384)),
('_relu', ReLU()),
])),
('conv_1', Tower([
('conv_1', L.Convolution2D(384, 384, (3, 1), stride=1, pad=(1, 0))),
('bn_conv_1', L.BatchNormalization(384)),
('_relu_1', ReLU()),
]))
]))
])),
('tower_2', Tower([
('_pooling', AveragePooling2D(3,1,pad=1)),
('conv', L.Convolution2D(1280, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU())
]))
]),
mixed_10=Mixed([
('conv', Tower([
('conv', L.Convolution2D(2048, 320, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(320)),
('_relu', ReLU()),
])),
('tower', Tower([
('conv', L.Convolution2D(2048, 384, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(384)),
('_relu', ReLU()),
('mixed', Mixed([
('conv', Tower([
('conv', L.Convolution2D(384, 384, (1, 3), stride=1, pad=(0, 1))),
('bn_conv', L.BatchNormalization(384)),
('_relu', ReLU()),
])),
('conv_1', Tower([
('conv_1', L.Convolution2D(384, 384, (3, 1), stride=1, pad=(1, 0))),
('bn_conv_1', L.BatchNormalization(384)),
('_relu_1', ReLU()),
]))
]))
])),
('tower_1', Tower([
('conv', L.Convolution2D(2048, 448, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(448)),
('_relu', ReLU()),
('conv_1', L.Convolution2D(448, 384, 3, stride=1, pad=1)),
('bn_conv_1', L.BatchNormalization(384)),
('_relu_1', ReLU()),
('mixed', Mixed([
('conv', Tower([
('conv', L.Convolution2D(384, 384, (1, 3), stride=1, pad=(0, 1))),
('bn_conv', L.BatchNormalization(384)),
('_relu', ReLU())
])),
('conv_1', Tower([
('conv_1', L.Convolution2D(384, 384, (3, 1), stride=1, pad=(1, 0))),
('bn_conv_1', L.BatchNormalization(384)),
('_relu_1', ReLU())
]))
]))
])),
('tower_2', Tower([
('_pooling', MaxPooling2D(3, 1, pad=1)),
('conv', L.Convolution2D(2048, 192, 1, stride=1, pad=0)),
('bn_conv', L.BatchNormalization(192)),
('_relu', ReLU())
]))
]),
logit=L.Linear(2048, 1008)
)
def __call__(self, x, get_feature=False, scaled=False, resize=False):
"""Input dims are (batch_size, 3, 299, 299)."""
if resize:
x = F.resize_images(x, (299, 299))
if scaled:
x = (x+1)*127.5
# assert x.shape[1:] == (3, 299, 299)
x -= 128.0
x *= 0.0078125
h = F.relu(self.bn_conv(self.conv(x)))
# assert h.shape[1:] == (32, 149, 149)
h = F.relu(self.bn_conv_1(self.conv_1(h)))
# assert h.shape[1:] == (32, 147, 147)
h = F.relu(self.bn_conv_2(self.conv_2(h)))
# assert h.shape[1:] == (64, 147, 147)
h = F.max_pooling_2d(h, 3, stride=2, pad=0)
# assert h.shape[1:] == (64, 73, 73)
h = F.relu(self.bn_conv_3(self.conv_3(h)))
# assert h.shape[1:] == (80, 73, 73)
h = F.relu(self.bn_conv_4(self.conv_4(h)))
# assert h.shape[1:] == (192, 71, 71)
h = F.max_pooling_2d(h, 3, stride=2, pad=0)
# assert h.shape[1:] == (192, 35, 35)
h = self.mixed(h)
# assert h.shape[1:] == (256, 35, 35)
h = self.mixed_1(h)
# assert h.shape[1:] == (288, 35, 35)
h = self.mixed_2(h)
# assert h.shape[1:] == (288, 35, 35)
h = self.mixed_3(h)
# assert h.shape[1:] == (768, 17, 17)
h = self.mixed_4(h)
# assert h.shape[1:] == (768, 17, 17)
h = self.mixed_5(h)
# assert h.shape[1:] == (768, 17, 17)
h = self.mixed_6(h)
# assert h.shape[1:] == (768, 17, 17)
h = self.mixed_7(h)
# assert h.shape[1:] == (768, 17, 17)
h = self.mixed_8(h)
# assert h.shape[1:] == (1280, 8, 8)
h = self.mixed_9(h)
# assert h.shape[1:] == (2048, 8, 8)
h = self.mixed_10(h)
# assert h.shape[1:] == (2048, 8, 8)
h = F.average_pooling_2d(h, 8, 1)
# assert h.shape[1:] == (2048, 1, 1)
h = F.reshape(h, (-1, 2048))
if get_feature:
return h
else:
h = self.logit(h)
h = F.softmax(h)
# assert h.shape[1:] == (1008,)
return h
| 44.603908 | 96 | 0.404309 | 2,582 | 25,112 | 3.758714 | 0.044926 | 0.061206 | 0.091705 | 0.066873 | 0.894075 | 0.860999 | 0.834209 | 0.813807 | 0.803503 | 0.783205 | 0 | 0.116146 | 0.420914 | 25,112 | 562 | 97 | 44.683274 | 0.551231 | 0.036118 | 0 | 0.794411 | 0 | 0 | 0.086246 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011976 | false | 0 | 0.017964 | 0 | 0.043912 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
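A short sketch of how this chainer Inception module is typically invoked for scoring. Loading the pretrained parameters is omitted (the serializer call and file name shown in the comment are assumptions), and the dummy input simply exercises the documented (batch, 3, 299, 299) shape.

import numpy as np
import chainer

model = Inception()
# chainer.serializers.load_npz('inception_score.npz', model)  # pretrained weights; path is an assumption
x = np.random.uniform(-1, 1, size=(2, 3, 299, 299)).astype(np.float32)  # dummy batch scaled to [-1, 1]
with chainer.using_config('train', False), chainer.using_config('enable_backprop', False):
    probs = model(x, scaled=True)                     # (2, 1008) softmax class probabilities
    feats = model(x, scaled=True, get_feature=True)   # (2, 2048) pooled features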
056ede9d32d0ffe8b87fca0a98b3f951d4af798a | 25 | py | Python | Test/module.py | twodulls/pythonsample | f576f79be75d96df61e26ef79a0e36a6feedfec1 | [
"Apache-2.0"
] | null | null | null | Test/module.py | twodulls/pythonsample | f576f79be75d96df61e26ef79a0e36a6feedfec1 | [
"Apache-2.0"
] | null | null | null | Test/module.py | twodulls/pythonsample | f576f79be75d96df61e26ef79a0e36a6feedfec1 | [
"Apache-2.0"
] | null | null | null | def sum(a,b):
return a+b | 12.5 | 13 | 0.64 | 7 | 25 | 2.285714 | 0.714286 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 25 | 2 | 14 | 12.5 | 0.761905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
0580e92ac6009b13445da23ce444b9cfbfa98c7d | 104 | py | Python | hello.py | tlaudele/penguins_and_pandas | 83d35cf9546cef4d570340d6dc89c273e24ce933 | [
"MIT"
] | null | null | null | hello.py | tlaudele/penguins_and_pandas | 83d35cf9546cef4d570340d6dc89c273e24ce933 | [
"MIT"
] | null | null | null | hello.py | tlaudele/penguins_and_pandas | 83d35cf9546cef4d570340d6dc89c273e24ce933 | [
"MIT"
] | null | null | null | print('hello world 2!')
print("selber hello world!")
# comment
# new edit
print("selber hello world!")
| 14.857143 | 28 | 0.692308 | 15 | 104 | 4.8 | 0.533333 | 0.416667 | 0.444444 | 0.583333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011236 | 0.144231 | 104 | 6 | 29 | 17.333333 | 0.797753 | 0.153846 | 0 | 0.666667 | 0 | 0 | 0.611765 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
553f35f42274177419415822721a19528e9d395d | 110 | py | Python | src/sage/symbolic/constants_c.py | UCD4IDS/sage | 43474c96d533fd396fe29fe0782d44dc7f5164f7 | [
"BSL-1.0"
] | 1,742 | 2015-01-04T07:06:13.000Z | 2022-03-30T11:32:52.000Z | src/sage/symbolic/constants_c.py | UCD4IDS/sage | 43474c96d533fd396fe29fe0782d44dc7f5164f7 | [
"BSL-1.0"
] | 66 | 2015-03-19T19:17:24.000Z | 2022-03-16T11:59:30.000Z | src/sage/symbolic/constants_c.py | UCD4IDS/sage | 43474c96d533fd396fe29fe0782d44dc7f5164f7 | [
"BSL-1.0"
] | 495 | 2015-01-10T10:23:18.000Z | 2022-03-24T22:06:11.000Z | from sage.misc.lazy_import import lazy_import
lazy_import('sage.symbolic.expression', 'E', deprecation=32386)
| 36.666667 | 63 | 0.818182 | 16 | 110 | 5.4375 | 0.625 | 0.344828 | 0.367816 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048544 | 0.063636 | 110 | 2 | 64 | 55 | 0.796117 | 0 | 0 | 0 | 0 | 0 | 0.227273 | 0.218182 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
5594c87f90ecd47ab4412370edc6506a9c9ae5e3 | 168 | py | Python | src/andushu/training/metrics/__init__.py | VisualJoyce/andushu | adf7f32b89c788734d9dff8e96ff55a35488dd51 | [
"MIT"
] | 1 | 2021-09-13T02:20:53.000Z | 2021-09-13T02:20:53.000Z | src/andushu/training/metrics/__init__.py | VisualJoyce/andushu | adf7f32b89c788734d9dff8e96ff55a35488dd51 | [
"MIT"
] | 1 | 2021-11-10T09:43:12.000Z | 2021-12-07T05:53:20.000Z | src/andushu/training/metrics/__init__.py | VisualJoyce/andushu | adf7f32b89c788734d9dff8e96ff55a35488dd51 | [
"MIT"
] | null | null | null | from andushu.training.metrics.token_sequence_accuracy import TokenSequenceAccuracy
from andushu.training.metrics.equation_answer_accuracy import EquationAnswerAccuracy
| 56 | 84 | 0.916667 | 18 | 168 | 8.333333 | 0.666667 | 0.146667 | 0.253333 | 0.346667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047619 | 168 | 2 | 85 | 84 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
e98bf55b24c350899c58720a6029580d670554cc | 129 | py | Python | news/models/__init__.py | kuc2477/news | 215f87e6ce1a7fc99175596e6fd5b4b50a3179c6 | [
"MIT"
] | 2 | 2016-01-21T04:16:57.000Z | 2016-04-27T04:46:13.000Z | news/models/__init__.py | kuc2477/news | 215f87e6ce1a7fc99175596e6fd5b4b50a3179c6 | [
"MIT"
] | null | null | null | news/models/__init__.py | kuc2477/news | 215f87e6ce1a7fc99175596e6fd5b4b50a3179c6 | [
"MIT"
] | null | null | null | """:mod:`news.models` --- News models
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
News models for schedule and news persistence.
"""
| 18.428571 | 46 | 0.48062 | 12 | 129 | 5.166667 | 0.583333 | 0.483871 | 0.451613 | 0.645161 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116279 | 129 | 6 | 47 | 21.5 | 0.54386 | 0.930233 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e9d185c838fc19d6deaf18cb0a7692809139cc18 | 646 | py | Python | python/testData/highlighting/fStringHashSigns.py | fduminy/intellij-community | fe13dc9ddb7f0f65397325ded25ecb239675eb59 | [
"Apache-2.0"
] | 2 | 2018-12-29T09:53:39.000Z | 2018-12-29T09:53:42.000Z | python/testData/highlighting/fStringHashSigns.py | tnorbye/intellij-community | f01cf262fc196bf4dbb99e20cd937dee3705a7b6 | [
"Apache-2.0"
] | null | null | null | python/testData/highlighting/fStringHashSigns.py | tnorbye/intellij-community | f01cf262fc196bf4dbb99e20cd937dee3705a7b6 | [
"Apache-2.0"
] | 1 | 2018-10-03T12:35:06.000Z | 2018-10-03T12:35:06.000Z | f'<error descr="'}' is expected">{<error descr="Expression fragments inside f-strings cannot include line comments">#</error></error>'
<error descr="Missing closing quote [']">f'<error descr="'}' is expected">{<error descr="Expression fragments inside f-strings cannot include line comments">#</error></error></error>
f'{<error descr="Expression fragments inside f-strings cannot include line comments">#foo#</error>}'
f'{42:#}'
f'{42:{<error descr="Expression fragments inside f-strings cannot include line comments">#</error>}}'
f'{x <error descr="Expression fragments inside f-strings cannot include line comments">### foo</error>}'
f'{"###"}' | 92.285714 | 182 | 0.72291 | 88 | 646 | 5.306818 | 0.215909 | 0.171306 | 0.214133 | 0.310493 | 0.931478 | 0.931478 | 0.931478 | 0.931478 | 0.931478 | 0.931478 | 0 | 0.006803 | 0.089783 | 646 | 7 | 183 | 92.285714 | 0.787415 | 0 | 0 | 0 | 0 | 0.285714 | 0.758887 | 0.111283 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
e9e23899b2e1a8cfa048d4533b0a7934fa076a55 | 191 | py | Python | bentoml/xgboost.py | francoisserra/BentoML | 213e9e9b39e055286f2649c733907df88e6d2503 | [
"Apache-2.0"
] | 1 | 2022-02-13T05:35:47.000Z | 2022-02-13T05:35:47.000Z | bentoml/xgboost.py | francoisserra/BentoML | 213e9e9b39e055286f2649c733907df88e6d2503 | [
"Apache-2.0"
] | 4 | 2021-05-16T08:06:25.000Z | 2021-11-13T08:46:36.000Z | bentoml/xgboost.py | francoisserra/BentoML | 213e9e9b39e055286f2649c733907df88e6d2503 | [
"Apache-2.0"
] | null | null | null | from ._internal.frameworks.xgboost import load
from ._internal.frameworks.xgboost import save
from ._internal.frameworks.xgboost import load_runner
__all__ = ["load", "load_runner", "save"]
| 31.833333 | 53 | 0.801047 | 24 | 191 | 6 | 0.375 | 0.25 | 0.458333 | 0.604167 | 0.784722 | 0.541667 | 0 | 0 | 0 | 0 | 0 | 0 | 0.094241 | 191 | 5 | 54 | 38.2 | 0.83237 | 0 | 0 | 0 | 0 | 0 | 0.099476 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
75dd5ea60df79db26a5b63d8a1f140df19794cf7 | 116,133 | py | Python | frontback/frontBackCSHORE.py | mailemccann/cmtb | 1e6451fa186f092d8ed1b96dbbfd7a8cedd46b17 | [
"MIT"
] | 3 | 2019-06-28T13:35:39.000Z | 2022-03-18T15:43:19.000Z | frontback/frontBackCSHORE.py | mailemccann/cmtb | 1e6451fa186f092d8ed1b96dbbfd7a8cedd46b17 | [
"MIT"
] | 25 | 2018-11-08T19:43:59.000Z | 2022-01-07T15:26:24.000Z | frontback/frontBackCSHORE.py | mailemccann/cmtb | 1e6451fa186f092d8ed1b96dbbfd7a8cedd46b17 | [
"MIT"
] | 14 | 2019-07-30T12:58:05.000Z | 2021-09-14T20:08:16.000Z | import math
from scipy.interpolate import griddata
from prepdata import inputOutput, prepDataLib
import os
import datetime as DT
import netCDF4 as nc
import numpy as np
from getdatatestbed.getDataFRF import getObs, getDataTestBed
from testbedutils.geoprocess import FRFcoord
from testbedutils.sblib import timeMatch, timeMatch_altimeter, makeNCdir
from testbedutils.anglesLib import geo2STWangle, STWangle2geo, vectorRotation
import plotting.operationalPlots as oP
import makenc
from matplotlib import pyplot as plt
from subprocess import check_output
def CSHORE_analysis(startTime, inputDict):
"""
Args:
startTime (str): this is the time that all the CSHORE runs are tagged by (e.g., '2012-12-31T00:30:30Z')
inputDicts (dict): dictionary input
version_prefix - right now we have MOBILE, MOBILE_RESET. FIXED
workingDir - path to the working directory the user wants
pFlag - do you want plots or not?
netCDFdir - directory where the netCDF files will be saved, like a boss
Returns:
None
"""
version_prefix = inputDict['version_prefix']
workingDir = inputDict['workingDirectory']
pFlag = inputDict['pFlag']
if 'netCDFdir' in inputDict.keys():
netCDFdir = inputDict['netCDFdir']
else:
whoami = check_output('whoami')[:-1]
netCDFdir = 'home/%s/thredds_data' % whoami
if 'THREDDS' in inputDict:
server = inputDict['THREDDS']
else:
print('Chosing CHL thredds by Default, this may be slower!')
server = 'CHL'
model='CSHORE'
#initialize the class
cshore_io = inputOutput.cshoreIO()
# get into the directory I need
start_dir = workingDir
path_prefix = os.path.join(model, version_prefix) # data super directiory
d_s = DT.datetime.strptime(startTime, '%Y-%m-%dT%H:%M:%SZ')
date_str = d_s.strftime('%Y-%m-%dT%H%M%SZ') # THE COLONS!!! startTime has COLONS!!!
params, bc, veg, hydro, sed, morpho, meta = cshore_io.load_CSHORE_results(os.path.join(start_dir, path_prefix, date_str))
# params - metadata about the run
# bc - boundary condition data, but for some reason it does not include the initial conditions?
# veg - vegetation information
# hydro - current and wave information
# sed - sediment information
# morpho - bed elevation information
test = np.append(bc['time_offshore'], max(bc['time_offshore']) + (bc['time_offshore'][1] - bc['time_offshore'][0]))
times = np.array([d_s + DT.timedelta(seconds=s) for s in test])
# change my coordinate system back to FRF!!!!!!!
BC_FRFX = meta["BC_FRF_X"]
BC_FRFY = meta["BC_FRF_Y"]
x_n = BC_FRFX - morpho['x'][0]
model_time = times[-1]
# make the plots
if pFlag:
# A - pull all the observations that I need and store as dictionaries
# Altimeter data!!!!!!!!
Alt05 = oP.alt_PlotData('Alt05', model_time, times)
Alt04 = oP.alt_PlotData('Alt04', model_time, times)
Alt03 = oP.alt_PlotData('Alt03', model_time, times)
# go ahead and time match the altimeter data
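# the pattern below repeats for each gage: pick the model node closest to the gage xFRF,
# time-match the observed and modeled series with timeMatch_altimeter, and build plot_ind,
# a 0/1 mask flagging the matched record closest to the final model time.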
if Alt05['TS_toggle']:
# ALT05
obs_zb = Alt05['zb']
obs_time = Alt05['time']
obs_loc = round(Alt05['xFRF'])
mod_zb = morpho['zb'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_n, mod_n = timeMatch_altimeter(obs_time, obs_zb, comp_time, mod_zb)
plot_ind = np.where(abs(comp_time_n - model_time) == min(abs(comp_time_n - model_time)), 1, 0)
# delete and re-assign
del Alt05['zb']
del Alt05['time']
del Alt05['plot_ind']
Alt05['zb'] = obs_n
Alt05['time'] = comp_time_n
Alt05['plot_ind'] = plot_ind
if Alt04['TS_toggle']:
# ALT04
obs_zb = Alt04['zb']
obs_time = Alt04['time']
obs_loc = round(Alt04['xFRF'])
mod_zb = morpho['zb'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_n, mod_n = timeMatch_altimeter(obs_time, obs_zb, comp_time, mod_zb)
plot_ind = np.where(abs(comp_time_n - model_time) == min(abs(comp_time_n - model_time)), 1, 0)
# delete and re-assign
del Alt04['zb']
del Alt04['time']
del Alt04['plot_ind']
Alt04['zb'] = obs_n
Alt04['time'] = comp_time_n
Alt04['plot_ind'] = plot_ind
if Alt03['TS_toggle']:
# ALT03
obs_zb = Alt03['zb']
obs_time = Alt03['time']
obs_loc = round(Alt03['xFRF'])
mod_zb = morpho['zb'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_n, mod_n = timeMatch_altimeter(obs_time, obs_zb, comp_time, mod_zb)
plot_ind = np.where(abs(comp_time_n - model_time) == min(abs(comp_time_n - model_time)), 1, 0)
# delete and re-assign
del Alt03['zb']
del Alt03['time']
del Alt03['plot_ind']
Alt03['zb'] = obs_n
Alt03['time'] = comp_time_n
Alt03['plot_ind'] = plot_ind
# wave data & current data!!!
Adopp_35 = oP.wave_PlotData('adop-3.5m', model_time, times)
AWAC6m = oP.wave_PlotData('awac-6m', model_time, times)
# this checks whether I rounded down when I set my bathymetry,
# in which case the 6m AWAC would not be inside the plot limits.
if AWAC6m['xFRF'] > max(x_n):
# if it is, round it down to nearest 1m - this will move it to the boundary if it IS the boundary gage
AWAC6m['xFRF'] = float(int(AWAC6m['xFRF']))
else:
pass
AWAC8m = oP.wave_PlotData('awac-8m', model_time, times)
# this checks whether I rounded down when I set my bathymetry,
# in which case the 8m AWAC would not be inside the plot limits.
if AWAC8m['xFRF'] > max(x_n):
# if it is, round it down to nearest 1m - this will move it to the boundary if it IS the boundary gage
AWAC8m['xFRF'] = float(int(AWAC8m['xFRF']))
else:
pass
# go ahead and time match the wave and current data!
if Adopp_35['TS_toggle']:
# Adopp_35
# get time-matched data!!!!!! waves
obs_Hs = Adopp_35['Hs']
obs_time = Adopp_35['wave_time']
obs_loc = round(Adopp_35['xFRF'])
mod_Hs = hydro['Hs'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_Hs_n, mod_Hs_n = timeMatch(obs_time, obs_Hs, comp_time, mod_Hs)
plot_ind = np.where(abs(comp_time_n - model_time) == min(abs(comp_time_n - model_time)), 1, 0)
# delete and re-assign
del Adopp_35['Hs']
del Adopp_35['wave_time']
del Adopp_35['plot_ind']
Adopp_35['Hs'] = obs_Hs_n
Adopp_35['wave_time'] = comp_time_n
Adopp_35['plot_ind'] = plot_ind
# get time-matched data!!!!!! currents
# V
obs_V = Adopp_35['V']
obs_time = Adopp_35['cur_time']
obs_loc = round(Adopp_35['xFRF'])
mod_V = hydro['vmean'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_V_n, mod_V_n = timeMatch(obs_time, obs_V, comp_time, mod_V)
plot_ind_V = np.where(abs(comp_time_n - model_time) == min(abs(comp_time_n - model_time)), 1, 0)
# delete and re-assign
del Adopp_35['V']
temp_cur_time = Adopp_35['cur_time']
del Adopp_35['cur_time']
del Adopp_35['plot_ind_V']
Adopp_35['V'] = obs_V_n
Adopp_35['cur_time'] = comp_time_n
Adopp_35['plot_ind_V'] = plot_ind_V
# U
obs_U = Adopp_35['U']
obs_time = temp_cur_time
obs_loc = round(Adopp_35['xFRF'])
mod_U = hydro['umean'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_U_n, mod_U_n = timeMatch(obs_time, obs_U, comp_time, mod_U)
# delete and re-assign
del Adopp_35['U']
Adopp_35['U'] = obs_U_n
if AWAC6m['TS_toggle']:
# AWAC6m
# get time-matched data!!!!!! waves
obs_Hs = AWAC6m['Hs']
obs_time = AWAC6m['wave_time']
obs_loc = round(AWAC6m['xFRF'])
mod_Hs = hydro['Hs'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_Hs_n, mod_Hs_n = timeMatch(obs_time, obs_Hs, comp_time, mod_Hs)
plot_ind = np.where(abs(comp_time_n - model_time) == min(abs(comp_time_n - model_time)), 1, 0)
# delete and re-assign
del AWAC6m['Hs']
del AWAC6m['wave_time']
del AWAC6m['plot_ind']
AWAC6m['Hs'] = obs_Hs_n
AWAC6m['wave_time'] = comp_time_n
AWAC6m['plot_ind'] = plot_ind
# get time-matched data!!!!!! currents
# V
obs_V = AWAC6m['V']
obs_time = AWAC6m['cur_time']
obs_loc = round(AWAC6m['xFRF'])
mod_V = hydro['vmean'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_V_n, mod_V_n = timeMatch(obs_time, obs_V, comp_time, mod_V)
plot_ind_V = np.where(abs(comp_time_n - model_time) == min(abs(comp_time_n - model_time)), 1, 0)
# delete and re-assign
del AWAC6m['V']
temp_cur_time = AWAC6m['cur_time']
del AWAC6m['cur_time']
del AWAC6m['plot_ind_V']
AWAC6m['V'] = obs_V_n
AWAC6m['cur_time'] = comp_time_n
AWAC6m['plot_ind_V'] = plot_ind_V
# U
obs_U = AWAC6m['U']
obs_time = temp_cur_time
obs_loc = round(AWAC6m['xFRF'])
mod_U = hydro['umean'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_U_n, mod_U_n = timeMatch(obs_time, obs_U, comp_time, mod_U)
# delete and re-assign
del AWAC6m['U']
AWAC6m['U'] = obs_U_n
if AWAC8m['TS_toggle']:
# AWAC8m
# get time-matched data!!!!!! waves
obs_Hs = AWAC8m['Hs']
obs_time = AWAC8m['wave_time']
obs_loc = round(AWAC8m['xFRF'])
mod_Hs = hydro['Hs'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_Hs_n, mod_Hs_n = timeMatch(obs_time, obs_Hs, comp_time, mod_Hs)
plot_ind = np.where(abs(comp_time_n - model_time) == min(abs(comp_time_n - model_time)), 1, 0)
# delete and re-assign
del AWAC8m['Hs']
del AWAC8m['wave_time']
del AWAC8m['plot_ind']
AWAC8m['Hs'] = obs_Hs_n
AWAC8m['wave_time'] = comp_time_n
AWAC8m['plot_ind'] = plot_ind
# get time-matched data!!!!!! currents
# V
obs_V = AWAC8m['V']
obs_time = AWAC8m['cur_time']
obs_loc = round(AWAC8m['xFRF'])
mod_V = hydro['vmean'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_V_n, mod_V_n = timeMatch(obs_time, obs_V, comp_time, mod_V)
plot_ind_V = np.where(abs(comp_time_n - model_time) == min(abs(comp_time_n - model_time)), 1, 0)
# delete and re-assign
del AWAC8m['V']
temp_cur_time = AWAC8m['cur_time']
del AWAC8m['cur_time']
del AWAC8m['plot_ind_V']
AWAC8m['V'] = obs_V_n
AWAC8m['cur_time'] = comp_time_n
AWAC8m['plot_ind_V'] = plot_ind_V
# U
obs_U = AWAC8m['U']
obs_time = temp_cur_time
obs_loc = round(AWAC8m['xFRF'])
mod_U = hydro['umean'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_U_n, mod_U_n = timeMatch(obs_time, obs_U, comp_time, mod_U)
# delete and re-assign
del AWAC8m['U']
AWAC8m['U'] = obs_U_n
# LiDAR stuff goes here...
lidar = oP.lidar_PlotData(times)
obs_dict = {'Alt05': Alt05,
'Alt04': Alt04,
'Alt03': Alt03,
'Adopp_35': Adopp_35,
'AWAC6m': AWAC6m,
'AWAC8m': AWAC8m,
'lidar': lidar,
}
# QA/QC plots...!!!!
# 1 - BC Plot
path = os.path.join(start_dir, path_prefix, date_str, 'figures/')
p_dict = {'time': times[1:],
'x': morpho['x'][0],
'zb': morpho['zb'][0],
'init_bathy_stime': meta['bathy_surv_stime'],
'WL': bc['strm_tide_offshore'],
'Hs': bc['Hs_offshore'],
'angle': bc['angle_offshore'],
'Tp': bc['Tp_offshore'],
'p_title': '%s CSHORE %s - Boundary Conditions' % (version_prefix, startTime)}
oP.bc_plot(path + 'bc.png', p_dict)
# 2 - obs V mod bathy
var_name = 'Bathymetry'
p_dict = {'x': x_n,
'obs': morpho['zb'][0],
'obs_time': times[0],
'model': morpho['zb'][-1],
'model_time': model_time,
'Hs': hydro['Hs'][-1],
'sigma_Hs': np.nanstd(hydro['Hs'], 0),
# don't worry about the runtime warning; it just means that x position is all NaNs, so it will return NaNs
'WL': bc['strm_tide_offshore'],
# do I want to include this?!? it is specific to bathymetry plotting!!!
'time': times[1:],
'var_name': var_name,
'units': 'm',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, var_name)}
oP.obs_V_mod_bathy(path+'obsVmodBathy.png', p_dict, obs_dict)
# 3 - general model results
p_dict = {'x': x_n,
'zb_m': morpho['zb'][-1],
'sigma_zbm': np.nanstd(morpho['zb'], 0),
# don't worry about the runtime warning; it just means that x position is all NaNs, so it will return NaNs
'model_time': model_time,
'Hs_m': hydro['Hs'][-1],
'sigma_Hs': np.nanstd(hydro['Hs'], 0),
# don't worry about the runtime warning; it just means that x position is all NaNs, so it will return NaNs
'setup_m': hydro['mwl'][-1] - hydro['swl'][-1],
'sigma_setup': np.nanstd(hydro['mwl'], 0),
# don't worry about the runtime warning; it just means that x position is all NaNs, so it will return NaNs
'p_title': '%s CSHORE %s - Model Results' % (version_prefix, startTime),
'time': times[1:]}
oP.mod_results(path+'modResults.png', p_dict, obs_dict)
# 4 - alongshore current plot
p_dict = {'x': x_n,
'zb_m': morpho['zb'][-1],
'model_time': model_time,
'vmean_m': hydro['vmean'][-1],
'sigma_vm': hydro['vstd'][-1],
'Hs_m': hydro['Hs'][-1],
'sigma_Hs': np.nanstd(hydro['Hs'], 0),
# don't worry about the runtime warning; it just means that x position is all NaNs, so it will return NaNs
'p_title': '%s CSHORE %s - Alongshore Current' % (version_prefix, startTime),
'time': times[1:]}
oP.als_results(path+'als.png', p_dict, obs_dict)
# now we want time-series comparisons at all my gages
# 5 sig wave height at all gages!!!!
# 6 alongshore current at all gages!!!!
# adopp_35
if Adopp_35['TS_toggle']:
# 5 a) adopp_35 Hs
# get time-matched data!!!!!!
obs_Hs = Adopp_35['Hs']
obs_time = Adopp_35['wave_time']
obs_loc = round(Adopp_35['xFRF'])
mod_Hs = hydro['Hs'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_Hs_n, mod_Hs_n = timeMatch(obs_time, obs_Hs, comp_time, mod_Hs)
if len(comp_time_n) <= 1:
pass
else:
p_dict = {'time': comp_time_n,
'obs': obs_Hs_n,
'model': mod_Hs_n,
'var_name': '$H_{s}$',
'units': 'm',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'Adopp 3.5')}
oP.obs_V_mod_TS(path+'adop35_Hs.png', p_dict)
# 6 a) adopp_35 V
# get time-matched data!!!!!!
obs_V = Adopp_35['V']
obs_time = Adopp_35['cur_time']
obs_loc = round(Adopp_35['xFRF'])
mod_V = hydro['vmean'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_V_n, mod_V_n = timeMatch(obs_time, obs_V, comp_time, mod_V)
if len(comp_time_n) <= 1:
pass
else:
p_dict = {'time': comp_time_n,
'obs': obs_V_n,
'model': mod_V_n,
'var_name': '$V$',
'units': 'm/s',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'Adopp 3.5')}
oP.obs_V_mod_TS(path+'adop35_V.png', p_dict)
else:
pass
# AWAC6m
if AWAC6m['TS_toggle']:
# 5 b) AWAC6m Hs
# get time-matched data!!!!!!
obs_Hs = AWAC6m['Hs']
obs_time = AWAC6m['wave_time']
obs_loc = round(AWAC6m['xFRF'])
mod_Hs = hydro['Hs'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_Hs_n, mod_Hs_n = timeMatch(obs_time, obs_Hs, comp_time, mod_Hs)
if len(comp_time_n) <= 1:
pass
else:
p_dict = {'time': comp_time_n,
'obs': obs_Hs_n,
'model': mod_Hs_n,
'var_name': '$H_{s}$',
'units': 'm',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'AWAC 6m')}
oP.obs_V_mod_TS(path + 'AWAC6m_Hs.png', p_dict)
# 6 b) AWAC6m V
# get time-matched data!!!!!!
obs_V = AWAC6m['V']
obs_time = AWAC6m['cur_time']
obs_loc = round(AWAC6m['xFRF'])
mod_V = hydro['vmean'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_V_n, mod_V_n = timeMatch(obs_time, obs_V, comp_time, mod_V)
if len(comp_time_n) <= 1:
pass
else:
p_dict = {'time': comp_time_n,
'obs': obs_V_n,
'model': mod_V_n,
'var_name': '$V$',
'units': 'm/s',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'AWAC 6m')}
oP.obs_V_mod_TS(path+'AWAC6m_V.png', p_dict)
else:
pass
# 5 c) AWAC8m Hs
if AWAC8m['TS_toggle']:
# 5 c) AWAC8m Hs
# get time-matched data!!!!!!
obs_Hs = AWAC8m['Hs']
obs_time = AWAC8m['wave_time']
obs_loc = round(AWAC8m['xFRF'])
mod_Hs = hydro['Hs'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_Hs_n, mod_Hs_n = timeMatch(obs_time, obs_Hs, comp_time, mod_Hs)
if len(comp_time_n) <= 1:
pass
else:
p_dict = {'time': comp_time_n,
'obs': obs_Hs_n,
'model': mod_Hs_n,
'var_name': '$H_{s}$',
'units': 'm',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'AWAC 8m')}
oP.obs_V_mod_TS(path + 'AWAC8m_Hs.png', p_dict)
# 6 c) AWAC8m V
# get time-matched data!!!!!!
obs_V = AWAC8m['V']
obs_time = AWAC8m['cur_time']
obs_loc = round(AWAC8m['xFRF'])
mod_V = hydro['vmean'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
comp_time_n, obs_V_n, mod_V_n = timeMatch(obs_time, obs_V, comp_time, mod_V)
if len(comp_time_n) <= 1:
pass
else:
p_dict = {'time': comp_time_n,
'obs': obs_V_n,
'model': mod_V_n,
'var_name': '$V$',
'units': 'm/s',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'AWAC 8m')}
oP.obs_V_mod_TS(path + 'AWAC8m_V.png', p_dict)
if 'MOBILE' in version_prefix:
# 7 bottom elevation at all gages!!!!
# 7 a) alt 05 bottom elevation
if Alt05['TS_toggle']:
# get time-matched data!!!!!!
obs_zb = Alt05['zb']
obs_time = Alt05['time']
obs_loc = round(Alt05['xFRF'])
mod_zb = morpho['zb'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
prepData = prepDataLib.PrepDataTools()
matchObs = prepData.prep_obs2mod(obs_time, obs_zb, comp_time)
# check to see if we masked anything!!
if np.sum(matchObs['mask']) > 0:
mod_zb = mod_zb[np.where(~matchObs['mask'])]
p_dict = {'time': matchObs['time'],
'obs': matchObs['meanObs'],
'model': mod_zb,
'var_name': 'Bottom Elevation',
'units': 'm',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'Alt-05')}
if np.size(p_dict['obs']) >= 2:
oP.obs_V_mod_TS(path + 'alt05_BE.png', p_dict)
# 7 b) alt 04 bottom elevation
if Alt04['TS_toggle']:
# get time-matched data!!!!!!
obs_zb = Alt04['zb']
obs_time = Alt04['time']
obs_loc = round(Alt04['xFRF'])
mod_zb = morpho['zb'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
prepData = prepDataLib.PrepDataTools()
matchObs = prepData.prep_obs2mod(obs_time, obs_zb, comp_time)
# check to see if we masked anything!!
if np.sum(matchObs['mask']) > 0:
mod_zb = mod_zb[np.where(~matchObs['mask'])]
p_dict = {'time': matchObs['time'],
'obs': matchObs['meanObs'],
'model': mod_zb,
'var_name': 'Bottom Elevation',
'units': 'm',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'Alt-04')}
if np.size(p_dict['obs']) >= 2:
oP.obs_V_mod_TS(path + 'alt04_BE.png', p_dict)
# 7 c) alt 03 bottom elevation
if Alt03['TS_toggle']:
# get time-matched data!!!!!!
obs_zb = Alt03['zb']
obs_time = Alt03['time']
obs_loc = round(Alt03['xFRF'])
mod_zb = morpho['zb'][:, np.where(abs(x_n - obs_loc) == min(abs(x_n - obs_loc)), 1, 0) == 1].squeeze()
comp_time = times[1:]
prepData = prepDataLib.PrepDataTools()
matchObs = prepData.prep_obs2mod(obs_time, obs_zb, comp_time)
# check to see if we masked anything!!
if np.sum(matchObs['mask']) > 0:
mod_zb = mod_zb[np.where(~matchObs['mask'])]
else:
pass
p_dict = {'time': matchObs['time'],
'obs': matchObs['meanObs'],
'model': mod_zb,
'var_name': 'Bottom Elevation',
'units': 'm',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'Alt-03')}
if np.size(p_dict['obs']) >= 2:
oP.obs_V_mod_TS(path + 'alt03_BE.png', p_dict)
# 8 - LiDAR runup comparison plots
if lidar['TS_toggle']:
# get time-matched data!!!!!!
obs_runupMean = lidar['runupMean']
obs_time = lidar['runupTime']
mod_runupMean = hydro['runup_mean']
comp_time = times[1:]
comp_time_n, obs_runupMean_n, mod_runupMean_n = timeMatch(obs_time, obs_runupMean, comp_time, mod_runupMean)
if len(comp_time_n) <= 1:
pass
else:
p_dict = {'time': comp_time_n,
'obs': obs_runupMean_n,
'model': mod_runupMean_n,
'var_name': 'Mean Run-up',
'units': 'm',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'LiDAR')}
oP.obs_V_mod_TS(path + 'runupMean.png', p_dict)
# 8 b) 2% runup
# get time-matched data!!!!!!
obs_runup = lidar['runup2perc']
obs_time = lidar['runupTime']
mod_runup = hydro['runup_2_percent']
comp_time = times[1:]
comp_time_n, obs_runup_n, mod_runup_n = timeMatch(obs_time, obs_runup, comp_time, mod_runup)
if len(comp_time_n) <= 1:
pass
else:
p_dict = {'time': comp_time_n,
'obs': obs_runup_n,
'model': mod_runup_n,
'var_name': r'$2\%$ Exceedance Run-up',
'units': 'm',
'p_title': '%s CSHORE %s - %s' % (version_prefix, startTime, 'LiDAR')}
oP.obs_V_mod_TS(path + 'runup2perc.png', p_dict)
# make the nc files
nc_dict = makeCSHORE_ncdict(startTime=startTime, inputDict=inputDict)
globalYaml = None
yaml_dir = os.path.dirname(os.path.abspath(__file__))
# need to go up one
yaml_dir = os.path.dirname(yaml_dir)
if 'MOBILE' in version_prefix:
globalYaml = os.path.join(yaml_dir, 'yaml_files/CSHORE/CSHORE_mobile_global.yml')
varYaml = os.path.join(yaml_dir, 'yaml_files/CSHORE/CSHORE_mobile_var.yml')
elif 'FIXED' in version_prefix:
globalYaml = os.path.join(yaml_dir, 'yaml_files/CSHORE/CSHORE_fixed_global.yml')
varYaml = os.path.join(yaml_dir, 'yaml_files/CSHORE/CSHORE_fixed_var.yml')
else:
raise NotImplementedError('please check version prefix')
assert globalYaml is not None, 'CSHORE_analysis Error: Version prefix not recognized'
NCpath = makeNCdir(netCDFdir, version_prefix, date_str, model='CSHORE')
# build the name of this nc file
NCname = 'CMTB-morphModels_CSHORE_%s_%s.nc' %(version_prefix, date_str)
makenc.makenc_CSHORErun(os.path.join(NCpath, NCname), nc_dict, globalYaml, varYaml)
t = 1
def makeCSHORE_ncdict(startTime,inputDict):
"""
Args:
startTime (str): this is the time that all the CSHORE runs are tagged by (e.g., '2012-12-31T00:30:30Z')
inputDict (dict): keys are
version_prefix - right now we have MOBILE, MOBILE_RESET, and FIXED
workingDirectory - path to the working directory the user wants
Returns:
nc_dict (dict): the dictionary with keys that you hand to the nc file
(the data has been rotated back to the standard FRF conventions)
"""
version_prefix = inputDict['version_prefix']
workingDir = inputDict['workingDirectory']
model = 'CSHORE'
# initialize the class
cshore_io = inputOutput.cshoreIO()
# get into the directory I need
start_dir = workingDir
path_prefix = "%s/" % version_prefix # data super directiory
d_s = DT.datetime.strptime(startTime, '%Y-%m-%dT%H:%M:%SZ')
date_str = d_s.strftime('%Y-%m-%dT%H%M%SZ')  # note: drop the colons that startTime contains
params, bc, veg, hydro, sed, morpho, meta = cshore_io.load_CSHORE_results(os.path.join(start_dir, model, path_prefix, date_str))
# params - metadata about the run
# bc - boundary condition data, but for some reason it does not include the initial conditions?
# veg - vegetation information
# hydro - current and wave information
# sed - sediment information
# morpho - bed elevation information
# build up the nc_dict
nc_dict = {}
dim_t = len(hydro['umean'])
dim_x = len(hydro['umean'][0])
step = hydro['time_end'][1] - hydro['time_end'][0]
pierang = 71.8
# get my time stuff!!!
times = np.array([d_s + DT.timedelta(seconds=s) for s in np.ravel(hydro['time_end'] + step)])
# note: each time_end value is offset by a full step here (the old comment about adding 1800 s to center the times no longer matches the code)
timeunits = 'seconds since 1970-01-01 00:00:00'
nc_dict['time'] = nc.date2num(times, timeunits)
# get my cross-shore X (and Y!!!) in FRF coords!!!
nc_dict['xFRF'] = meta["BC_FRF_X"] - morpho['x'][0]
nc_dict['yFRF'] = meta["BC_FRF_Y"]
# get my stuff that needs to be rotated
test_fun = lambda x: vectorRotation(x, theta=pierang + (90 - pierang) + pierang)
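# theta simplifies to 90 + pierang (161.8 deg here); presumably this rotates the model
# (cross-shore, alongshore) component pairs into FRF east/north components via vectorRotation.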
nc_dict['aveE'] = np.zeros([dim_t, dim_x])
nc_dict['aveN'] = np.zeros([dim_t, dim_x])
nc_dict['stdE'] = np.zeros([dim_t, dim_x])
nc_dict['stdN'] = np.zeros([dim_t, dim_x])
nc_dict['waveMeanDirection'] = np.zeros([dim_t, dim_x]) + np.nan
nc_dict['qbx'] = np.zeros([dim_t, dim_x])
nc_dict['qby'] = np.zeros([dim_t, dim_x])
nc_dict['qsx'] = np.zeros([dim_t, dim_x])
nc_dict['qsy'] = np.zeros([dim_t, dim_x])
for ii in range(0, len(hydro['umean'])):
# current stuff
newV = [test_fun(x) for x in zip(hydro['umean'][ii][:], hydro['vmean'][ii][:])]
nc_dict['aveE'][ii] = zip(*newV)[0]
nc_dict['aveN'][ii] = zip(*newV)[1]
newStd = [test_fun(x) for x in zip(hydro['ustd'][ii][:], hydro['vstd'][ii][:])]
nc_dict['stdE'][ii] = zip(*newStd)[0]
nc_dict['stdN'][ii] = zip(*newStd)[1]
# wave angle
t1 = 360 / float(2 * np.pi) * hydro['stheta'][ii]
t1[~np.isnan(t1)] = STWangle2geo(t1[~np.isnan(t1)], pierang=pierang)
nc_dict['waveMeanDirection'][ii] = t1
# sediment transport rate
new_qb = [test_fun(x) for x in zip(sed['qbx'][ii][:], sed['qby'][ii][:])]
nc_dict['qbx'][ii] = zip(*new_qb)[0]
nc_dict['qby'][ii] = zip(*new_qb)[1]
new_qs = [test_fun(x) for x in zip(sed['qsx'][ii][:], sed['qsy'][ii][:])]
nc_dict['qsx'][ii] = zip(*new_qs)[0]
nc_dict['qsy'][ii] = zip(*new_qs)[1]
# wave and WL stuff!!!!
nc_dict['waveHs'] = hydro['Hs']
nc_dict['waterLevel'] = hydro['mwl']
nc_dict['stdWaterLevel'] = hydro['sigma']
nc_dict['setup'] = hydro['mwl'] - hydro['swl']
# runup stuff!
nc_dict['runup2perc'] = hydro['runup_2_percent']
nc_dict['runupMean'] = hydro['runup_mean']
# other sediment stuff
nc_dict['probabilitySuspension'] = sed['ps']
nc_dict['probabilityMovement'] = sed['pb']
nc_dict['suspendedSedVolume'] = sed['vs']
# bathymetry
nc_dict['bottomElevation'] = morpho['zb'] # you may have to screw with this with fixed vs. mobile???
# if the fixed bed just copies the same bathy to each time-step, you will need to just take the first one!!!
nc_dict['surveyNumber'] = np.zeros([dim_t]) + meta['bathy_surv_num']
nc_dict['profileNumber'] = np.zeros([dim_t]) + meta['bathy_prof_num']
nc_dict['bathymetryDate'] = nc.date2num(np.array([meta['bathy_surv_stime'] + DT.timedelta(hours=0) for i in xrange(dim_t)]), timeunits)
return nc_dict
def CSHOREsimSetup(startTime, inputDict):
"""Author: David Young, Master of the Universe
Association: USACE CHL Field Research Facility
Project: Coastal Model Test Bed
This function is the master call for the data preparation for the Coastal Model
Test Bed (CMTB). It is designed to pull from GetData and utilize
prep_datalib for development of the FRF CMTB.
NOTE: input to the function is the end of the duration. All Files are labeled by this convention
all time stamps otherwise are top of the data collection
Args:
startTime (str): this is the start time for the simulation (string in format e.g., '2016-06-02T10:00:00Z' )
THIS MAY NOT BE THE SAME AS THE ONE IN INPUT DICT
i.e., if it is looping over a bunch of 24 hour simulations
that is also why it is a separate variable
inputDict (dict): input dictionary with keys
simulationDuration - duration of each simulation in hours
version_prefix - right now we have FIXED, MOBILE, and MOBILE_RESET
profileNumber - this is either the survey profile number or the alongshore location for the integrated bathy
bathyLoc - where are we getting our bathy data (surveys or integrated bathy)
workingDirectory - location where the user wants to keep all the data
"""
# pull the stuff I need out of the dict
timerun = inputDict['simulationDuration']
version_prefix = inputDict['version_prefix']
profile_num = inputDict['profileNumber']
bathy_loc = inputDict['bathyLoc']
workingDir = inputDict['workingDirectory']
if 'THREDDS' in inputDict:
server = inputDict['THREDDS']
else:
print('Choosing CHL THREDDS by default; this may be slower!')
server = 'CHL'
# ____________________GENERAL ASSUMPTION VARIABLES__________
model = 'CSHORE'
path_prefix = os.path.join(workingDir, model, '%s/' % version_prefix)
time_step = 1 # time step for model in hours
dx = 1 # cross-shore grid spacing (FRF coord units - m)
fric_fac = 0.015
# ______________________________________________________________________________
# _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
# Time Stuff!
if type(timerun) == str:
timerun = int(timerun)
start_time = DT.datetime.strptime(startTime, '%Y-%m-%dT%H:%M:%SZ')
# scream at them if the simulation does not start on a whole hour!
assert start_time.minute == 0 and start_time.second == 0 and start_time.microsecond == 0, 'Your simulation must start on the hour!'
end_time = start_time + DT.timedelta(days=0, hours=timerun) # removed for ilab=1 , minutes=1)
date_str = start_time.strftime('%Y-%m-%dT%H%M%SZ')
# start making my metadata dict
meta_dict = {'startTime': startTime,
'timerun': timerun,
'time_step': time_step,
'dx': dx,
'fric_fac': fric_fac,
'version': version_prefix}
ftime = timerun * 3600 # [sec] final time, dictates model duration
dt = time_step * 3600 # time interval (sec) for wave and water level conditions
BC_dict = {'timebc_wave': np.arange(0, ftime + dt, dt)}
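# worked example: for a 24 hour run with time_step = 1, ftime = 86400 s and dt = 3600 s,
# so timebc_wave = [0, 3600, 7200, ..., 86400] (25 points, one per hourly BC record).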
# ______________________________________________________________________________
# __________________Make Diretories_____________________________________________
if not os.path.exists(path_prefix + date_str): # if it doesn't exist
os.makedirs(path_prefix + date_str) # make the directory
if not os.path.exists(path_prefix + date_str + "/figures/"):
os.makedirs(path_prefix + date_str + "/figures/")
print "Model Time Start : %s Model Time End: %s" % (start_time, end_time)
print u"Files will be placed in {0} folder".format(path_prefix + date_str)
# decision time - fixed vs mobile
if version_prefix == 'FIXED':
# if it is fixed, first thing to do is get waves
## _____________WAVES____________________________
frf_Data = getObs(start_time, end_time + DT.timedelta(days=0, hours=0, minutes=1), THREDDS=server)
# Attempt to get 8m array first!!!
try:
wave_data = frf_Data.getWaveSpec(gaugenumber=12)
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere( dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]), np.divide(1, wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]), geo2STWangle(wave_data['waveDp'], zeroAngle=71.8, fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that craps out, try to get the 6m AWAC!!!
try:
wave_data = frf_Data.getWaveSpec(gaugenumber=4)
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (
np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]), np.divide(1, wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]), geo2STWangle(wave_data['waveDp'], zeroAngle=71.8, fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that doesn't work, you are done....
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Wave data are missing for both the 8m array and the 6m AWAC!'
# ____________ BATHY ______________________
print '\n____________________\nGetting Bathymetric Data\n'
# what do I use as my initial bathy?
bathy_loc_List = np.array(['integrated_bathy', 'survey'])
assert bathy_loc in bathy_loc_List, "Please enter a valid source bathymetry location \n Assigned location = %s must be in List %s" % (bathy_loc, bathy_loc_List)
if bathy_loc == 'survey':
# is this profile number in the survey?
prof_nums = frf_Data.getBathyTransectProfNum()
assert profile_num in prof_nums, 'Please begin simulations with a survey that includes profile number %s.' %(str(profile_num))
# go ahead and proceed as normal
bathy_data = frf_Data.getBathyTransectFromNC(profilenumbers=profile_num)
# calculate some stuff about the along-shore variation of your transect!
meta_dict['bathy_surv_num'] = np.unique(bathy_data['surveyNumber']) # tag the survey number!
meta_dict['bathy_surv_stime'] = bathy_data['time'][0] # tag the survey start time!
meta_dict['bathy_surv_etime'] = bathy_data['time'][-1] # tag the survey end time!
meta_dict['bathy_prof_num'] = profile_num # tag the profile number!
meta_dict['bathy_y_max_diff'] = bathy_data['yFRF'].max() - bathy_data['yFRF'].min() # what is the difference between the largest y-position of this transect and the smallest y-position of this transect in FRF coordinates (FRF units)
meta_dict['bathy_y_sdev'] = np.std(bathy_data['yFRF']) # standard deviation of the y-positions of this transect in FRF coordinate (FRF units)
master_bathy = {'xFRF': np.asarray(range(int(math.ceil(min(bathy_data['xFRF']))), int(max(bathy_data['xFRF']) + dx), dx))} # xFRF coordinates of master bathy indices in m
master_bathy['elev'] = np.interp(master_bathy['xFRF'], bathy_data['xFRF'], bathy_data['elevation']) # elevation at master bathy nodes in m
# actually convert the bathy_data to the coordinates of the model!
bc_coords = FRFcoord(wave_data['lon'], wave_data['lat'])
meta_dict['BC_FRF_X'] = int(round(bc_coords['xFRF'])) # this is because I force the gage to be at a grid node
meta_dict['BC_FRF_Y'] = bc_coords['yFRF']
# check that the gage is inside my "master_bathy" x bounds
assert bc_coords['xFRF'] <= max(master_bathy['xFRF']) and bc_coords['xFRF'] >= min(master_bathy['xFRF']), 'The wave gage selected as the boundary condition is outside the known bathymetry.'
# make the shift from "master bathy" to model bathy convention (zero at forcing instrument and positive increasing towards shore)
BC_dict['x'] = np.flipud(int(round(bc_coords['xFRF'])) - master_bathy['xFRF'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['zb'] = np.flipud(master_bathy['elev'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['fw'] = np.flipud(fric_fac * np.ones(BC_dict['x'].size)) # cross-shore values of the bottom friction (just sets it to be fric_fac at every point in the array)
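# worked example (hypothetical numbers): if the boundary gage sits at xFRF = 914 m and
# master_bathy['xFRF'] runs 100..914 m, then 914 - xFRF gives 814..0 and the flipud makes
# BC_dict['x'] = [0, 1, ..., 814], i.e. zero at the gage and increasing toward shore.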
elif bathy_loc == 'integrated_bathy':
# pull the bathymetry from the integrated product - see Spike's getdatatestbed function
cmtb_data = getDataTestBed(start_time, end_time + DT.timedelta(days=0, hours=0, minutes=1), THREDDS=server)
bathy_data = cmtb_data.getBathyIntegratedTransect()
# get my master bathy x-array
master_bathy = {'xFRF': np.asarray(range(int(math.ceil(min(bathy_data['xFRF']))), int(max(bathy_data['xFRF']) + dx), dx))} # xFRF coordinates of master bathy indices in m
elev_mat = bathy_data['elevation']
xFRF_mat = np.matlib.repmat(bathy_data['xFRF'], np.shape(elev_mat)[0], 1)
yFRF_mat = np.matlib.repmat(bathy_data['yFRF'].T, np.shape(elev_mat)[1], 1).T
"""
# did I do this right?
fig_loc = 'C:\Users\dyoung8\Desktop\David Stuff\Projects\CSHORE'
fig_name = 'test_bathy' + '.png'
plt.pcolor(xFRF_mat, yFRF_mat, elev_mat, cmap=plt.cm.jet, vmin=-13, vmax=5)
cbar = plt.colorbar()
cbar.set_label('(m)')
plt.xlabel('xFRF (m)')
plt.ylabel('yFRF (m)')
plt.savefig(os.path.join(fig_loc, fig_name))
plt.close()
"""
# have to do 2D interpolation instead of 1D!!!!!!
points = np.array((xFRF_mat.flatten(), yFRF_mat.flatten())).T
values = elev_mat.flatten()
interp_pts = np.array((master_bathy['xFRF'], profile_num * np.ones(np.shape(master_bathy['xFRF'])))).T
master_bathy['elev'] = griddata(points, values, interp_pts)
""""
# did this work?
fig_loc = 'C:\Users\dyoung8\Desktop\David Stuff\Projects\CSHORE'
fig_name = 'test_transect' + '.png'
plt.plot(master_bathy['xFRF'], master_bathy['elev'])
plt.xlabel('xFRF (m)')
plt.ylabel('elevation (m)')
plt.savefig(os.path.join(fig_loc, fig_name))
plt.close()
"""
# calculate some stuff about the along-shore variation of your transect!
meta_dict['bathy_surv_num'] = np.unique(bathy_data['surveyNumber']) # tag the survey number!
meta_dict['bathy_surv_stime'] = bathy_data['time'] # same for integrated bathy
meta_dict['bathy_surv_etime'] = bathy_data['time'] # same for integrated bathy
meta_dict['bathy_prof_num'] = profile_num # tag the profile number!
meta_dict['bathy_y_max_diff'] = 0  # always zero for integrated bathy
meta_dict['bathy_y_sdev'] = 0  # always zero for integrated bathy
# actually convert the bathy_data to the coordinates of the model!
bc_coords = FRFcoord(wave_data['lon'], wave_data['lat'])
meta_dict['BC_FRF_X'] = int(round(bc_coords['xFRF'])) # this is because I force the gage to be at a grid node
meta_dict['BC_FRF_Y'] = bc_coords['yFRF']
# check that the gage is inside my "master_bathy" x bounds
assert bc_coords['xFRF'] <= max(master_bathy['xFRF']) and bc_coords['xFRF'] >= min(master_bathy['xFRF']), 'The wave gage selected as the boundary condition is outside the known bathymetry.'
# make the shift from "master bathy" to model bathy convention (zero at forcing instrument and positive increasing towards shore)
BC_dict['x'] = np.flipud(int(round(bc_coords['xFRF'])) - master_bathy['xFRF'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['zb'] = np.flipud(master_bathy['elev'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['fw'] = np.flipud(fric_fac * np.ones(BC_dict['x'].size)) # cross-shore values of the bottom friction (just sets it to be fric_fac at every point in the array)
else:
# if you still have no bathy, error out the simulation
raise EnvironmentError('No Bathymetry available')
elif version_prefix == 'MOBILE':
# try to get it from prior cshore run!!!!
try:
Time_O = (DT.datetime.strptime(startTime, '%Y-%m-%dT%H:%M:%SZ') - DT.timedelta(days=1)).strftime('%Y-%m-%dT%H%M%SZ')
# initialize the class
cshore_io_O = inputOutput.cshoreIO()
# get into the directory I need
start_dir_O = workingDir
path_prefix_O = path_prefix
params0, bc0, veg0, hydro0, sed0, morpho0, meta0 = cshore_io_O.load_CSHORE_results(path_prefix_O + Time_O)
# calculate some stuff about the along-shore variation of your transect!
meta_dict['bathy_surv_num'] = meta0['bathy_surv_num']
meta_dict['bathy_surv_stime'] = meta0['bathy_surv_stime']
meta_dict['bathy_surv_etime'] = meta0['bathy_surv_etime']
meta_dict['bathy_prof_num'] = meta0['bathy_prof_num'] # tag the profile number!
meta_dict['bathy_y_max_diff'] = meta0['bathy_y_max_diff'] # what is the difference between the largest y-position of this transect and the smallest y-position of this transect in FRF coordinates (FRF units)
meta_dict['bathy_y_sdev'] = meta0['bathy_y_sdev'] # standard deviation of the y-positions of this transect in FRF coordinate (FRF units)
meta_dict['BC_FRF_X'] = meta0["BC_FRF_X"]
meta_dict['BC_FRF_Y'] = meta0["BC_FRF_Y"]
BC_dict['x'] = morpho0['x'][-1]
BC_dict['zb'] = morpho0['zb'][-1]
BC_dict['fw'] = np.flipud(fric_fac * np.ones(BC_dict['x'].size))
prev_wg = meta0['BC_gage']
# which gage was it?
frf_Data = getObs(start_time, end_time + DT.timedelta(days=0, hours=0, minutes=1), THREDDS=server)
wave_data8m = frf_Data.getWaveSpec(gaugenumber=12)
wave_data6m = frf_Data.getWaveSpec(gaugenumber=4)
if prev_wg == wave_data6m['name']:
# go straight to 6m awac
try:
wave_data = wave_data6m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]),wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]),np.divide(1, wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]), geo2STWangle(wave_data['waveDp'], zeroAngle=71.8, fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that doesn't work, you are done....
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Previous simulation ran from the 6m AWAC and wave data are missing for the 6m AWAC!'
else:
# go through my normal decision tree
# Attempt to get 8m array first!!!
try:
wave_data = wave_data8m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (
np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]),np.divide(1, wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that craps out, try to get the 6m AWAC!!!
try:
wave_data = wave_data6m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), np.divide(1,wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that doesn't work, you are done....
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Wave data are missing for both the 8m array and the 6m AWAC!'
# check to see if we stepped down and I have to adjust my x, zb
if prev_wg == wave_data8m['name'] and meta_dict['BC_gage'] == wave_data6m['name']:
# re-assign this to my new WG
bc_coords = FRFcoord(wave_data['lon'], wave_data['lat'])
meta_dict['BC_FRF_X'] = int(round(bc_coords['xFRF'])) # this is because I force the gage to be at a grid node
meta_dict['BC_FRF_Y'] = bc_coords['yFRF']
# if I was at the 8m but now I'm at the 6, I need to shave off some of my domain!
nBC_x = BC_dict['x']
nBC_zb = BC_dict['zb']
nBC_fw = BC_dict['fw']
# change the bc stuff so that 0 is at the 6m awac now, then drop all points with less than zero!
nBC_x = nBC_x - (max(nBC_x) - meta_dict['BC_FRF_X'])
keep_ind = np.where(nBC_x >= 0)
nBC_zb = nBC_zb[keep_ind]
nBC_fw = nBC_fw[keep_ind]
nBC_x = nBC_x[keep_ind]
# delete the old ones and re-assign
del BC_dict['x']
del BC_dict['zb']
del BC_dict['fw']
BC_dict['x'] = nBC_x
BC_dict['zb'] = nBC_zb
BC_dict['fw'] = nBC_fw
except:
# couldn't find an old simulation - this is the start of a new simulation - proceed as normal
frf_Data = getObs(start_time, end_time + DT.timedelta(days=0, hours=0, minutes=1), THREDDS=server)
# Attempt to get 8m array first!!!
try:
wave_data = frf_Data.getWaveSpec(gaugenumber=12)
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]),wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]),np.divide(1, wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that craps out, try to get the 6m AWAC!!!
try:
wave_data = frf_Data.getWaveSpec(gaugenumber=4)
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (
np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]),wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]), np.divide(1, wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]), geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that doesn't work, you are done....
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Wave data are missing for both the 8m array and the 6m AWAC!'
# now we get the bathy from our chosen location...
if bathy_loc == 'survey':
# is this profile number in the survey?
prof_nums = frf_Data.getBathyTransectProfNum()
assert profile_num in prof_nums, 'Please begin simulations with a survey that includes profile number %s.' % (str(profile_num))
# go ahead and proceed as normal
bathy_data = frf_Data.getBathyTransectFromNC(profilenumbers=profile_num)
# calculate some stuff about the along-shore variation of your transect!
meta_dict['bathy_surv_num'] = np.unique(bathy_data['surveyNumber']) # tag the survey number!
meta_dict['bathy_surv_stime'] = bathy_data['time'][0] # tag the survey start time!
meta_dict['bathy_surv_etime'] = bathy_data['time'][-1] # tag the survey end time!
meta_dict['bathy_prof_num'] = profile_num # tag the profile number!
meta_dict['bathy_y_max_diff'] = bathy_data['yFRF'].max() - bathy_data['yFRF'].min() # what is the difference between the largest y-position of this transect and the smallest y-position of this transect in FRF coordinates (FRF units)
meta_dict['bathy_y_sdev'] = np.std(bathy_data['yFRF']) # standard deviation of the y-positions of this transect in FRF coordinate (FRF units)
master_bathy = {'xFRF': np.asarray(range(int(math.ceil(min(bathy_data['xFRF']))), int(max(bathy_data['xFRF']) + dx),dx))} # xFRF coordinates of master bathy indices in m
master_bathy['elev'] = np.interp(master_bathy['xFRF'], bathy_data['xFRF'], bathy_data['elevation']) # elevation at master bathy nodes in m
# actually convert the bathy_data to the coordinates of the model!
bc_coords = FRFcoord(wave_data['lon'], wave_data['lat'])
meta_dict['BC_FRF_X'] = int(round(bc_coords['xFRF'])) # this is because I force the gage to be at a grid node
meta_dict['BC_FRF_Y'] = bc_coords['yFRF']
# check that the gage is inside my "master_bathy" x bounds
assert bc_coords['xFRF'] <= max(master_bathy['xFRF']) and bc_coords['xFRF'] >= min(master_bathy['xFRF']), 'The wave gage selected as the boundary condition is outside the known bathymetry.'
# make the shift from "master bathy" to model bathy convention (zero at forcing instrument and positive increasing towards shore)
BC_dict['x'] = np.flipud(int(round(bc_coords['xFRF'])) - master_bathy['xFRF'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['zb'] = np.flipud(master_bathy['elev'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['fw'] = np.flipud(fric_fac * np.ones(BC_dict['x'].size)) # cross-shore values of the bottom friction (just sets it to be fric_fac at every point in the array)
elif bathy_loc == 'integrated_bathy':
# pull the bathymetry from the integrated product - see Spike's getdatatestbed function
cmtb_data = getDataTestBed(start_time, end_time + DT.timedelta(days=0, hours=0, minutes=1), THREDDS=server)
bathy_data = cmtb_data.getBathyIntegratedTransect()
# get my master bathy x-array
master_bathy = {'xFRF': np.asarray(range(int(math.ceil(min(bathy_data['xFRF']))), int(max(bathy_data['xFRF']) + dx),dx))} # xFRF coordinates of master bathy indices in m
elev_mat = bathy_data['elevation']
xFRF_mat = np.matlib.repmat(bathy_data['xFRF'], np.shape(elev_mat)[0], 1)
yFRF_mat = np.matlib.repmat(bathy_data['yFRF'].T, np.shape(elev_mat)[1], 1).T
"""
# did I do this right?
fig_loc = 'C:\Users\dyoung8\Desktop\David Stuff\Projects\CSHORE'
fig_name = 'test_bathy' + '.png'
plt.pcolor(xFRF_mat, yFRF_mat, elev_mat, cmap=plt.cm.jet, vmin=-13, vmax=5)
cbar = plt.colorbar()
cbar.set_label('(m)')
plt.xlabel('xFRF (m)')
plt.ylabel('yFRF (m)')
plt.savefig(os.path.join(fig_loc, fig_name))
plt.close()
"""
# have to do 2D interpolation instead of 1D!!!!!!
points = np.array((xFRF_mat.flatten(), yFRF_mat.flatten())).T
values = elev_mat.flatten()
interp_pts = np.array((master_bathy['xFRF'], profile_num * np.ones(np.shape(master_bathy['xFRF'])))).T
master_bathy['elev'] = griddata(points, values, interp_pts)
""""
# did this work?
fig_loc = 'C:\Users\dyoung8\Desktop\David Stuff\Projects\CSHORE'
fig_name = 'test_transect' + '.png'
plt.plot(master_bathy['xFRF'], master_bathy['elev'])
plt.xlabel('xFRF (m)')
plt.ylabel('elevation (m)')
plt.savefig(os.path.join(fig_loc, fig_name))
plt.close()
"""
# calculate some stuff about the along-shore variation of your transect!
meta_dict['bathy_surv_num'] = np.unique(bathy_data['surveyNumber']) # tag the survey number!
meta_dict['bathy_surv_stime'] = bathy_data['time'] # same for integrated bathy
meta_dict['bathy_surv_etime'] = bathy_data['time'] # same for integrated bathy
meta_dict['bathy_prof_num'] = profile_num # tag the profile number!
meta_dict['bathy_y_max_diff'] = 0  # always zero for integrated bathy
meta_dict['bathy_y_sdev'] = 0  # always zero for integrated bathy
# actually convert the bathy_data to the coordinates of the model!
bc_coords = FRFcoord(wave_data['lon'], wave_data['lat'])
meta_dict['BC_FRF_X'] = int(round(bc_coords['xFRF'])) # this is because I force the gage to be at a grid node
meta_dict['BC_FRF_Y'] = bc_coords['yFRF']
# check that the gage is inside my "master_bathy" x bounds
assert bc_coords['xFRF'] <= max(master_bathy['xFRF']) and bc_coords['xFRF'] >= min(master_bathy['xFRF']), 'The wave gage selected as the boundary condition is outside the known bathymetry.'
# make the shift from "master bathy" to model bathy convention (zero at forcing instrument and positive increasing towards shore)
BC_dict['x'] = np.flipud(int(round(bc_coords['xFRF'])) - master_bathy['xFRF'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['zb'] = np.flipud(master_bathy['elev'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['fw'] = np.flipud(fric_fac * np.ones(BC_dict['x'].size)) # cross-shore values of the bottom friction (just sets it to be fric_fac at every point in the array)
else:
raise EnvironmentError('No Bathymetry available')
elif version_prefix == 'MOBILE_RESET':
# do I want the integrated bathy or the survey
if bathy_loc == 'survey':
# pull down the survey dates.
frf_Data = getObs(start_time, end_time + DT.timedelta(days=0, hours=0, minutes=1), THREDDS=server)
# is this profile number in the survey?
prof_nums = frf_Data.getBathyTransectProfNum()
assert profile_num in prof_nums, 'Please begin simulations with a survey that includes profile number %s.' %(str(profile_num))
# what time am I dealing with?
bathy_data = frf_Data.getBathyTransectFromNC(profilenumbers=profile_num)
wave_data8m = frf_Data.getWaveSpec(gaugenumber=12)
wave_data6m = frf_Data.getWaveSpec(gaugenumber=4)
check_time = max(bathy_data['time'])
if DT.timedelta(hours=24) >= start_time - check_time:
# we are resetting today!
# go through waves decision tree
# Attempt to get 8m array first!!!
try:
wave_data = wave_data8m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]),wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'], helper(wave_data['time'] - wave_data['time'][0]),np.divide(1, wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that craps out, try to get the 6m AWAC!!!
try:
wave_data = wave_data6m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), np.divide(1, wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that doesn't work, you are done....
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Wave data are missing for both 8m array and 6m AWAC.!'
# then we are going to use this bathy!
# calculate some stuff about the along-shore variation of your transect!
meta_dict['bathy_surv_num'] = np.unique(bathy_data['surveyNumber']) # tag the survey number!
meta_dict['bathy_surv_stime'] = bathy_data['time'][0] # tag the survey start time!
meta_dict['bathy_surv_etime'] = bathy_data['time'][-1] # tag the survey end time!
meta_dict['bathy_prof_num'] = profile_num # tag the profile number!
meta_dict['bathy_y_max_diff'] = bathy_data['yFRF'].max() - bathy_data['yFRF'].min() # what is the difference between the largest y-position of this transect and the smallest y-position of this transect in FRF coordinates (FRF units)
meta_dict['bathy_y_sdev'] = np.std(bathy_data['yFRF']) # standard deviation of the y-positions of this transect in FRF coordinate (FRF units)
master_bathy = {'xFRF': np.asarray(range(int(math.ceil(min(bathy_data['xFRF']))), int(max(bathy_data['xFRF']) + dx),dx))} # xFRF coordinates of master bathy indices in m
master_bathy['elev'] = np.interp(master_bathy['xFRF'], bathy_data['xFRF'],bathy_data['elevation']) # elevation at master bathy nodes in m
# actually convert the bathy_data to the coordinates of the model!
bc_coords = FRFcoord(wave_data['lon'], wave_data['lat'])
meta_dict['BC_FRF_X'] = int(round(bc_coords['xFRF'])) # this is because I force the gage to be at a grid node
meta_dict['BC_FRF_Y'] = bc_coords['yFRF']
# check that the gage is inside my "master_bathy" x bounds
assert bc_coords['xFRF'] <= max(master_bathy['xFRF']) and bc_coords['xFRF'] >= min(master_bathy['xFRF']), 'The wave gage selected as the boundary condition is outside the known bathymetry.'
# make the shift from "master bathy" to model bathy convention (zero at forcing instrument and positive increasing towards shore)
BC_dict['x'] = np.flipud(int(round(bc_coords['xFRF'])) - master_bathy['xFRF'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['zb'] = np.flipud(master_bathy['elev'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['fw'] = np.flipud(fric_fac * np.ones(BC_dict['x'].size)) # cross-shore values of the bottom friction (just sets it to be fric_fac at every point in the array)
else:
# we are not resetting and have to pull the previously written file.
try:
Time_O = (DT.datetime.strptime(startTime, '%Y-%m-%dT%H:%M:%SZ') - DT.timedelta(days=1)).strftime('%Y-%m-%dT%H%M%SZ')
# initialize the class
cshore_io_O = inputOutput.cshoreIO()
# get into the directory I need
start_dir_O = workingDir
path_prefix_O = path_prefix
params0, bc0, veg0, hydro0, sed0, morpho0, meta0 = cshore_io_O.load_CSHORE_results(path_prefix_O + Time_O)
# calculate some stuff about the along-shore variation of your transect!
meta_dict['bathy_surv_num'] = meta0['bathy_surv_num']
meta_dict['bathy_surv_stime'] = meta0['bathy_surv_stime']
meta_dict['bathy_surv_etime'] = meta0['bathy_surv_etime']
meta_dict['bathy_prof_num'] = meta0['bathy_prof_num'] # tag the profile number!
meta_dict['bathy_y_max_diff'] = meta0['bathy_y_max_diff'] # what is the difference between the largest y-position of this transect and the smallest y-position of this transect in FRF coordinates (FRF units)
meta_dict['bathy_y_sdev'] = meta0['bathy_y_sdev'] # standard deviation of the y-positions of this transect in FRF coordinate (FRF units)
meta_dict['BC_FRF_X'] = meta0["BC_FRF_X"]
meta_dict['BC_FRF_Y'] = meta0["BC_FRF_Y"]
BC_dict['x'] = morpho0['x'][-1]
BC_dict['zb'] = morpho0['zb'][-1]
BC_dict['fw'] = np.flipud(fric_fac * np.ones(BC_dict['x'].size))
prev_wg = meta0['BC_gage']
# which gage was it?
frf_Data = getObs(start_time, end_time + DT.timedelta(days=0, hours=0, minutes=1), THREDDS=server)
wave_data8m = frf_Data.getWaveSpec(gaugenumber=12)
wave_data6m = frf_Data.getWaveSpec(gaugenumber=4)
if prev_wg == wave_data6m['name']:
# go straight to 6m awac
try:
wave_data = wave_data6m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % ( np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), np.divide(1,wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that doesn't work, you are done....
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Previous simulation ran from 6m AWAC and wave data missing 6m AWAC.!'
else:
# go through my normal decision tree
# Attempt to get 8m array first!!!
try:
wave_data = wave_data8m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (
np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), np.divide(1,wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that craps out, try to get the 6m AWAC!!!
try:
wave_data = wave_data6m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), np.divide(1,wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that doesn't work, you are done....
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Wave data are missing for both 8m array and 6m AWAC.!'
# check to see if we stepped down and I have to adjust my x, zb
if prev_wg == wave_data8m['name'] and meta_dict['BC_gage'] == wave_data6m['name']:
# re-assign this to my new WG
bc_coords = FRFcoord(wave_data['lon'], wave_data['lat'])
meta_dict['BC_FRF_X'] = int(round(bc_coords['xFRF'])) # this is because I force the gage to be at a grid node
meta_dict['BC_FRF_Y'] = bc_coords['yFRF']
# if I was at the 8m but now I'm at the 6, I need to shave off some of my domain!
nBC_x = BC_dict['x']
nBC_zb = BC_dict['zb']
nBC_fw = BC_dict['fw']
# change the bc stuff so that 0 is at the 6m awac now, then drop all points with less than zero!
nBC_x = nBC_x - (max(nBC_x) - meta_dict['BC_FRF_X'])
keep_ind = np.where(nBC_x >= 0)
nBC_zb = nBC_zb[keep_ind]
nBC_fw = nBC_fw[keep_ind]
nBC_x = nBC_x[keep_ind]
# delete the old ones and re-assign
del BC_dict['x']
del BC_dict['zb']
del BC_dict['fw']
BC_dict['x'] = nBC_x
BC_dict['zb'] = nBC_zb
BC_dict['fw'] = nBC_fw
except:
raise EnvironmentError('No Bathymetry available')
elif bathy_loc == 'integrated_bathy':
# pull the bathymetry from the integrated product - see Spike's getdatatestbed function
cmtb_data = getDataTestBed(start_time, end_time + DT.timedelta(days=0, hours=0, minutes=1), THREDDS=server)
frf_Data = getObs(start_time, end_time + DT.timedelta(days=0, hours=0, minutes=1), THREDDS=server)
bathy_data = cmtb_data.getBathyIntegratedTransect()
wave_data8m = frf_Data.getWaveSpec(gaugenumber=12)
wave_data6m = frf_Data.getWaveSpec(gaugenumber=4)
check_time = bathy_data['time']
if DT.timedelta(hours=24) >= (start_time - check_time) + DT.timedelta(minutes=1):
# we are resetting today!
# go through waves decision tree
# Attempt to get 8m array first!!!
try:
wave_data = wave_data8m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), np.divide(1,wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that craps out, try to get the 6m AWAC!!!
try:
wave_data = wave_data6m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), np.divide(1,wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that doesn't work, you are done....
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Wave data are missing for both 8m array and 6m AWAC.!'
# then we are going to use this bathy!
# get my master bathy x-array
master_bathy = {'xFRF': np.asarray(range(int(math.ceil(min(bathy_data['xFRF']))), int(max(bathy_data['xFRF']) + dx), dx))} # xFRF coordinates of master bathy indices in m
elev_mat = bathy_data['elevation']
xFRF_mat = np.matlib.repmat(bathy_data['xFRF'], np.shape(elev_mat)[0], 1)
yFRF_mat = np.matlib.repmat(bathy_data['yFRF'].T, np.shape(elev_mat)[1], 1).T
# have to do 2D interpolation instead of 1D!!!!!!
points = np.array((xFRF_mat.flatten(), yFRF_mat.flatten())).T
values = elev_mat.flatten()
interp_pts = np.array((master_bathy['xFRF'], profile_num * np.ones(np.shape(master_bathy['xFRF'])))).T
master_bathy['elev'] = griddata(points, values, interp_pts)
# calculate some stuff about the along-shore variation of your transect!
meta_dict['bathy_surv_num'] = np.unique(bathy_data['surveyNumber']) # tag the survey number!
meta_dict['bathy_surv_stime'] = bathy_data['time'] # same for integrated bathy
meta_dict['bathy_surv_etime'] = bathy_data['time'] # same for integrated bathy
meta_dict['bathy_prof_num'] = profile_num # tag the profile number!
meta_dict['bathy_y_max_diff'] = 0 # always zero for intergrated bathy
meta_dict['bathy_y_sdev'] = 0 # always zero for intergrated bathy
# actually convert the bathy_data to the coordinates of the model!
bc_coords = FRFcoord(wave_data['lon'], wave_data['lat'])
meta_dict['BC_FRF_X'] = int(round(bc_coords['xFRF'])) # this is because I force the gage to be at a grid node
meta_dict['BC_FRF_Y'] = bc_coords['yFRF']
# check that the gage is inside my "master_bathy" x bounds
assert bc_coords['xFRF'] <= max(master_bathy['xFRF']) and bc_coords['xFRF'] >= min(master_bathy['xFRF']), 'The wave gage selected as the boundary condition is outside the known bathymetry.'
# make the shift from "master bathy" to model bathy convention (zero at forcing instrument and positive increasing towards shore)
BC_dict['x'] = np.flipud(int(round(bc_coords['xFRF'])) - master_bathy['xFRF'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['zb'] = np.flipud(master_bathy['elev'][np.argwhere(master_bathy['xFRF'] <= int(round(bc_coords['xFRF'])))].flatten())
BC_dict['fw'] = np.flipud(fric_fac * np.ones(BC_dict['x'].size)) # cross-shore values of the bottom friction (just sets it to be fric_fac at every point in the array)
else:
# we are not resetting and have to pull the previously written file.
try:
Time_O = (DT.datetime.strptime(startTime, '%Y-%m-%dT%H:%M:%SZ') - DT.timedelta(days=1)).strftime('%Y-%m-%dT%H%M%SZ')
# initialize the class
cshore_io_O = inputOutput.cshoreIO()
# get into the directory I need
start_dir_O = workingDir
path_prefix_O = path_prefix
params0, bc0, veg0, hydro0, sed0, morpho0, meta0 = cshore_io_O.load_CSHORE_results(path_prefix_O + Time_O)
# calculate some stuff about the along-shore variation of your transect!
meta_dict['bathy_surv_num'] = meta0['bathy_surv_num']
meta_dict['bathy_surv_stime'] = meta0['bathy_surv_stime']
meta_dict['bathy_surv_etime'] = meta0['bathy_surv_etime']
meta_dict['bathy_prof_num'] = meta0['bathy_prof_num'] # tag the profile number!
meta_dict['bathy_y_max_diff'] = meta0['bathy_y_max_diff'] # what is the difference between the largest y-position of this transect and the smallest y-position of this transect in FRF coordinates (FRF units)
meta_dict['bathy_y_sdev'] = meta0['bathy_y_sdev'] # standard deviation of the y-positions of this transect in FRF coordinate (FRF units)
meta_dict['BC_FRF_X'] = meta0["BC_FRF_X"]
meta_dict['BC_FRF_Y'] = meta0["BC_FRF_Y"]
BC_dict['x'] = morpho0['x'][-1]
BC_dict['zb'] = morpho0['zb'][-1]
BC_dict['fw'] = np.flipud(fric_fac * np.ones(BC_dict['x'].size))
prev_wg = meta0['BC_gage']
# which gage was it?
frf_Data = getObs(start_time, end_time + DT.timedelta(days=0, hours=0, minutes=1), THREDDS=server)
wave_data8m = frf_Data.getWaveSpec(gaugenumber=12)
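# default the gage names if getWaveSpec did not return one, so the comparison
# against the previous run's boundary-condition gage below still works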
if 'name' not in wave_data8m.keys():
wave_data8m['name'] = 'FRF 8m Array'
else:
pass
wave_data6m = frf_Data.getWaveSpec(gaugenumber=4)
if 'name' not in wave_data6m.keys():
wave_data6m['name'] = 'FRF 6m AWAC'
else:
pass
if prev_wg == wave_data6m['name']:
# go straight to 6m awac
try:
wave_data = wave_data6m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), np.divide(1,wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],
helper(wave_data['time'] - wave_data['time'][0]),
geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
# TODO @David use a constant exposed parameter for pier angle (at the top - a global variable) this way if it needs to be changed it's changed everywhere
except:
# If that doesn't work, you are done....
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Previous simulation ran from 6m AWAC and wave data missing 6m AWAC.!'
else:
# go through my normal decision tree
# Attempt to get 8m array first!!!
try:
wave_data = wave_data8m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]), np.divide(1,wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that craps out, try to get the 6m AWAC!!!
try:
wave_data = wave_data6m
meta_dict['BC_gage'] = wave_data['name']
print "_________________\nGathering Wave Data from %s" % (wave_data['name'])
# check to see if I am missing my first and last points - if so then I can't interpolate
assert start_time in wave_data['time'] and end_time in wave_data['time'], 'Wave data are missing for simulation start time or end time!'
# if I am missing more than 1/4 of the data I should have, abort the run
assert len(wave_data['Hs']) > 0.75 * len(BC_dict['timebc_wave']), 'Missing more than 25% of wave data'
# get missing wave data if there is any!
date_list = np.array([start_time + DT.timedelta(hours=x) for x in range(0, timerun + 1)])
dum_var = [x not in wave_data['time'] for x in date_list]
if sum(dum_var) == 0:
meta_dict['blank_wave_data'] = np.nan
else:
meta_dict['blank_wave_data'] = date_list[np.argwhere(dum_var).flatten()] # this will list all the times that wave data should exist, but doesn't
print "%d wave records with %d interpolated points" % (np.shape(wave_data['dWED'])[0], timerun + 1 - len(wave_data['dWED']))
helper = np.vectorize(lambda x: x.total_seconds())
BC_dict['Hs'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),wave_data['Hs']) # WE ARE USING Hmo (Hs in wave_data dictionary) INSTEAD OF Hs!!!!! -> convert during infile write!!!
BC_dict['Tp'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),np.divide(1, wave_data['peakf'])) # we are inverting the peak frequency to get peak period
BC_dict['angle'] = np.interp(BC_dict['timebc_wave'],helper(wave_data['time'] - wave_data['time'][0]),geo2STWangle(wave_data['waveDp'], zeroAngle=71.8,fixanglesout=1)) # i am assuming that the angle is the peak of the directional spectra, not the MEAN ANGLE!!! also I'm assuming 0-360 degrees!
except:
# If that doesn't work, you are done....
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Wave data are missing for both 8m array and 6m AWAC.!'
# check to see if we stepped down and I have to adjust my x, zb
if prev_wg == wave_data8m['name'] and meta_dict['BC_gage'] == wave_data6m['name']:
# re-assign this to my new WG
bc_coords = FRFcoord(wave_data['lon'], wave_data['lat'])
meta_dict['BC_FRF_X'] = int(round(bc_coords['xFRF'])) # this is because I force the gage to be at a grid node
meta_dict['BC_FRF_Y'] = bc_coords['yFRF']
# if I was at the 8m but now I'm at the 6, I need to shave off some of my domain!
nBC_x = BC_dict['x']
nBC_zb = BC_dict['zb']
nBC_fw = BC_dict['fw']
# change the bc stuff so that 0 is at the 6m awac now, then drop all points with less than zero!
nBC_x = nBC_x - (max(nBC_x) - meta_dict['BC_FRF_X'])
keep_ind = np.where(nBC_x >= 0)
nBC_zb = nBC_zb[keep_ind]
nBC_fw = nBC_fw[keep_ind]
nBC_x = nBC_x[keep_ind]
# delete the old ones and re-assign
del BC_dict['x']
del BC_dict['zb']
del BC_dict['fw']
BC_dict['x'] = nBC_x
BC_dict['zb'] = nBC_zb
BC_dict['fw'] = nBC_fw
except:
raise EnvironmentError('No Bathymetry available')
# check to see if I actually have wave data after all this...
assert 'Hs' in BC_dict.keys(), 'Simulation broken. Wave data are missing for both 8m array and 6m AWAC!'
## ___________WATER LEVEL__________________
print '_________________\nGetting Water Level Data'
try:
# Pull water level data
dum_class = prepDataLib.PrepDataTools()
wl_data = dum_class.prep_WL(frf_Data.getWL(), date_list)
BC_dict['swlbc'] = wl_data['avgWL'] #gives me the avg water level at "date_list"
BC_dict['Wsetup'] = np.zeros(len(BC_dict['timebc_wave'])) # we are HARD CODING the wave setup to always be zero!!!
meta_dict['blank_wl_data'] = wl_data['time'][np.argwhere(wl_data['flag']==1)]
print 'number of WL records %d, with %d interpolated points' % (np.size(wl_data['time']), sum(wl_data['flag']))
except (RuntimeError, TypeError):
wl_data = None
# ___________TEMP AND SALINITY ______________________
ctd_data = frf_Data.getCTD()
if ctd_data == None:
BC_dict['salin'] = 30 # salin in ppt
BC_dict['temp'] = 15 # water temp in degrees C
else:
BC_dict['salin'] = ctd_data['salin'] # salin in ppt
BC_dict['temp'] = ctd_data['temp'] # water temp in degrees C
# Last thing to do ... write files
print 'WRITING simulation Files'
cshore_io = inputOutput.cshoreIO()
# since we are changing ilab to be 1, I need to pare down my input stuff
BC_dict['angle'] = BC_dict['angle'][0:-1]
BC_dict['timebc_wave'] = BC_dict['timebc_wave'][0:-1]
BC_dict['Hs'] = BC_dict['Hs'][0:-1]
BC_dict['Wsetup'] = BC_dict['Wsetup'][0:-1]
BC_dict['Tp'] = BC_dict['Tp'][0:-1]
BC_dict['swlbc'] = BC_dict['swlbc'][0:-1]
# write infile
cshore_io.make_CSHORE_infile(path_prefix + date_str + '/infile', BC_dict, meta_dict)
# write metadata file
# cshore_io.write_flags(date_str, path_prefix, wavepacket, windpacket, WLpacket, curpacket, gridFlag)
| 61.122632 | 327 | 0.56877 | 15,840 | 116,133 | 3.938889 | 0.050505 | 0.060649 | 0.029427 | 0.018464 | 0.85293 | 0.838858 | 0.833104 | 0.825603 | 0.815072 | 0.807652 | 0 | 0.016318 | 0.312435 | 116,133 | 1,899 | 328 | 61.154818 | 0.765057 | 0.21242 | 0 | 0.743568 | 0 | 0 | 0.164272 | 0.008588 | 0 | 0 | 0 | 0.000527 | 0.047303 | 0 | null | null | 0.012448 | 0.012448 | null | null | 0.034855 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
f9534f385bd3fed837e0fa18d8de424faa2b6294 | 22,905 | py | Python | sdk/python/pulumi_azure/dataprotection/backup_policy_disk.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 109 | 2018-06-18T00:19:44.000Z | 2022-02-20T05:32:57.000Z | sdk/python/pulumi_azure/dataprotection/backup_policy_disk.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 663 | 2018-06-18T21:08:46.000Z | 2022-03-31T20:10:11.000Z | sdk/python/pulumi_azure/dataprotection/backup_policy_disk.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 41 | 2018-07-19T22:37:38.000Z | 2022-03-14T10:56:26.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['BackupPolicyDiskArgs', 'BackupPolicyDisk']
@pulumi.input_type
class BackupPolicyDiskArgs:
def __init__(__self__, *,
backup_repeating_time_intervals: pulumi.Input[Sequence[pulumi.Input[str]]],
default_retention_duration: pulumi.Input[str],
vault_id: pulumi.Input[str],
name: Optional[pulumi.Input[str]] = None,
retention_rules: Optional[pulumi.Input[Sequence[pulumi.Input['BackupPolicyDiskRetentionRuleArgs']]]] = None):
"""
The set of arguments for constructing a BackupPolicyDisk resource.
:param pulumi.Input[Sequence[pulumi.Input[str]]] backup_repeating_time_intervals: Specifies a list of repeating time interval. It should follow `ISO 8601` repeating time interval . Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] default_retention_duration: The duration of default retention rule. It should follow `ISO 8601` duration format. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] vault_id: The ID of the Backup Vault within which the Backup Policy Disk should exist. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] name: The name which should be used for this Backup Policy Disk. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[Sequence[pulumi.Input['BackupPolicyDiskRetentionRuleArgs']]] retention_rules: One or more `retention_rule` blocks as defined below. Changing this forces a new Backup Policy Disk to be created.
"""
pulumi.set(__self__, "backup_repeating_time_intervals", backup_repeating_time_intervals)
pulumi.set(__self__, "default_retention_duration", default_retention_duration)
pulumi.set(__self__, "vault_id", vault_id)
if name is not None:
pulumi.set(__self__, "name", name)
if retention_rules is not None:
pulumi.set(__self__, "retention_rules", retention_rules)
@property
@pulumi.getter(name="backupRepeatingTimeIntervals")
def backup_repeating_time_intervals(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Specifies a list of repeating time interval. It should follow `ISO 8601` repeating time interval . Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "backup_repeating_time_intervals")
@backup_repeating_time_intervals.setter
def backup_repeating_time_intervals(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "backup_repeating_time_intervals", value)
@property
@pulumi.getter(name="defaultRetentionDuration")
def default_retention_duration(self) -> pulumi.Input[str]:
"""
The duration of default retention rule. It should follow `ISO 8601` duration format. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "default_retention_duration")
@default_retention_duration.setter
def default_retention_duration(self, value: pulumi.Input[str]):
pulumi.set(self, "default_retention_duration", value)
@property
@pulumi.getter(name="vaultId")
def vault_id(self) -> pulumi.Input[str]:
"""
The ID of the Backup Vault within which the Backup Policy Disk should exist. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "vault_id")
@vault_id.setter
def vault_id(self, value: pulumi.Input[str]):
pulumi.set(self, "vault_id", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name which should be used for this Backup Policy Disk. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="retentionRules")
def retention_rules(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['BackupPolicyDiskRetentionRuleArgs']]]]:
"""
One or more `retention_rule` blocks as defined below. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "retention_rules")
@retention_rules.setter
def retention_rules(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['BackupPolicyDiskRetentionRuleArgs']]]]):
pulumi.set(self, "retention_rules", value)
@pulumi.input_type
class _BackupPolicyDiskState:
def __init__(__self__, *,
backup_repeating_time_intervals: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
default_retention_duration: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
retention_rules: Optional[pulumi.Input[Sequence[pulumi.Input['BackupPolicyDiskRetentionRuleArgs']]]] = None,
vault_id: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering BackupPolicyDisk resources.
:param pulumi.Input[Sequence[pulumi.Input[str]]] backup_repeating_time_intervals: Specifies a list of repeating time interval. It should follow `ISO 8601` repeating time interval . Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] default_retention_duration: The duration of default retention rule. It should follow `ISO 8601` duration format. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] name: The name which should be used for this Backup Policy Disk. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[Sequence[pulumi.Input['BackupPolicyDiskRetentionRuleArgs']]] retention_rules: One or more `retention_rule` blocks as defined below. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] vault_id: The ID of the Backup Vault within which the Backup Policy Disk should exist. Changing this forces a new Backup Policy Disk to be created.
"""
if backup_repeating_time_intervals is not None:
pulumi.set(__self__, "backup_repeating_time_intervals", backup_repeating_time_intervals)
if default_retention_duration is not None:
pulumi.set(__self__, "default_retention_duration", default_retention_duration)
if name is not None:
pulumi.set(__self__, "name", name)
if retention_rules is not None:
pulumi.set(__self__, "retention_rules", retention_rules)
if vault_id is not None:
pulumi.set(__self__, "vault_id", vault_id)
@property
@pulumi.getter(name="backupRepeatingTimeIntervals")
def backup_repeating_time_intervals(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Specifies a list of repeating time interval. It should follow `ISO 8601` repeating time interval . Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "backup_repeating_time_intervals")
@backup_repeating_time_intervals.setter
def backup_repeating_time_intervals(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "backup_repeating_time_intervals", value)
@property
@pulumi.getter(name="defaultRetentionDuration")
def default_retention_duration(self) -> Optional[pulumi.Input[str]]:
"""
The duration of default retention rule. It should follow `ISO 8601` duration format. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "default_retention_duration")
@default_retention_duration.setter
def default_retention_duration(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "default_retention_duration", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name which should be used for this Backup Policy Disk. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="retentionRules")
def retention_rules(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['BackupPolicyDiskRetentionRuleArgs']]]]:
"""
One or more `retention_rule` blocks as defined below. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "retention_rules")
@retention_rules.setter
def retention_rules(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['BackupPolicyDiskRetentionRuleArgs']]]]):
pulumi.set(self, "retention_rules", value)
@property
@pulumi.getter(name="vaultId")
def vault_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the Backup Vault within which the Backup Policy Disk should exist. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "vault_id")
@vault_id.setter
def vault_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "vault_id", value)
class BackupPolicyDisk(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
backup_repeating_time_intervals: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
default_retention_duration: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
retention_rules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['BackupPolicyDiskRetentionRuleArgs']]]]] = None,
vault_id: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Manages a Backup Policy Disk.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
rg = azure.core.ResourceGroup("rg", location="West Europe")
example_backup_vault = azure.dataprotection.BackupVault("exampleBackupVault",
resource_group_name=rg.name,
location=rg.location,
datastore_type="VaultStore",
redundancy="LocallyRedundant")
example_backup_policy_disk = azure.dataprotection.BackupPolicyDisk("exampleBackupPolicyDisk",
vault_id=example_backup_vault.id,
backup_repeating_time_intervals=["R/2021-05-19T06:33:16+00:00/PT4H"],
default_retention_duration="P7D",
retention_rules=[
azure.dataprotection.BackupPolicyDiskRetentionRuleArgs(
name="Daily",
duration="P7D",
priority=25,
criteria=azure.dataprotection.BackupPolicyDiskRetentionRuleCriteriaArgs(
absolute_criteria="FirstOfDay",
),
),
azure.dataprotection.BackupPolicyDiskRetentionRuleArgs(
name="Weekly",
duration="P7D",
priority=20,
criteria=azure.dataprotection.BackupPolicyDiskRetentionRuleCriteriaArgs(
absolute_criteria="FirstOfWeek",
),
),
])
```
## Import
Backup Policy Disks can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:dataprotection/backupPolicyDisk:BackupPolicyDisk example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/group1/providers/Microsoft.DataProtection/backupVaults/vault1/backupPolicies/backupPolicy1
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Sequence[pulumi.Input[str]]] backup_repeating_time_intervals: Specifies a list of repeating time interval. It should follow `ISO 8601` repeating time interval . Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] default_retention_duration: The duration of default retention rule. It should follow `ISO 8601` duration format. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] name: The name which should be used for this Backup Policy Disk. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['BackupPolicyDiskRetentionRuleArgs']]]] retention_rules: One or more `retention_rule` blocks as defined below. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] vault_id: The ID of the Backup Vault within which the Backup Policy Disk should exist. Changing this forces a new Backup Policy Disk to be created.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: BackupPolicyDiskArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages a Backup Policy Disk.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
rg = azure.core.ResourceGroup("rg", location="West Europe")
example_backup_vault = azure.dataprotection.BackupVault("exampleBackupVault",
resource_group_name=rg.name,
location=rg.location,
datastore_type="VaultStore",
redundancy="LocallyRedundant")
example_backup_policy_disk = azure.dataprotection.BackupPolicyDisk("exampleBackupPolicyDisk",
vault_id=example_backup_vault.id,
backup_repeating_time_intervals=["R/2021-05-19T06:33:16+00:00/PT4H"],
default_retention_duration="P7D",
retention_rules=[
azure.dataprotection.BackupPolicyDiskRetentionRuleArgs(
name="Daily",
duration="P7D",
priority=25,
criteria=azure.dataprotection.BackupPolicyDiskRetentionRuleCriteriaArgs(
absolute_criteria="FirstOfDay",
),
),
azure.dataprotection.BackupPolicyDiskRetentionRuleArgs(
name="Weekly",
duration="P7D",
priority=20,
criteria=azure.dataprotection.BackupPolicyDiskRetentionRuleCriteriaArgs(
absolute_criteria="FirstOfWeek",
),
),
])
```
## Import
Backup Policy Disks can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:dataprotection/backupPolicyDisk:BackupPolicyDisk example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/group1/providers/Microsoft.DataProtection/backupVaults/vault1/backupPolicies/backupPolicy1
```
:param str resource_name: The name of the resource.
:param BackupPolicyDiskArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(BackupPolicyDiskArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
backup_repeating_time_intervals: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
default_retention_duration: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
retention_rules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['BackupPolicyDiskRetentionRuleArgs']]]]] = None,
vault_id: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = BackupPolicyDiskArgs.__new__(BackupPolicyDiskArgs)
if backup_repeating_time_intervals is None and not opts.urn:
raise TypeError("Missing required property 'backup_repeating_time_intervals'")
__props__.__dict__["backup_repeating_time_intervals"] = backup_repeating_time_intervals
if default_retention_duration is None and not opts.urn:
raise TypeError("Missing required property 'default_retention_duration'")
__props__.__dict__["default_retention_duration"] = default_retention_duration
__props__.__dict__["name"] = name
__props__.__dict__["retention_rules"] = retention_rules
if vault_id is None and not opts.urn:
raise TypeError("Missing required property 'vault_id'")
__props__.__dict__["vault_id"] = vault_id
super(BackupPolicyDisk, __self__).__init__(
'azure:dataprotection/backupPolicyDisk:BackupPolicyDisk',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
backup_repeating_time_intervals: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
default_retention_duration: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
retention_rules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['BackupPolicyDiskRetentionRuleArgs']]]]] = None,
vault_id: Optional[pulumi.Input[str]] = None) -> 'BackupPolicyDisk':
"""
Get an existing BackupPolicyDisk resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Sequence[pulumi.Input[str]]] backup_repeating_time_intervals: Specifies a list of repeating time interval. It should follow `ISO 8601` repeating time interval . Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] default_retention_duration: The duration of default retention rule. It should follow `ISO 8601` duration format. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] name: The name which should be used for this Backup Policy Disk. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['BackupPolicyDiskRetentionRuleArgs']]]] retention_rules: One or more `retention_rule` blocks as defined below. Changing this forces a new Backup Policy Disk to be created.
:param pulumi.Input[str] vault_id: The ID of the Backup Vault within which the Backup Policy Disk should exist. Changing this forces a new Backup Policy Disk to be created.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _BackupPolicyDiskState.__new__(_BackupPolicyDiskState)
__props__.__dict__["backup_repeating_time_intervals"] = backup_repeating_time_intervals
__props__.__dict__["default_retention_duration"] = default_retention_duration
__props__.__dict__["name"] = name
__props__.__dict__["retention_rules"] = retention_rules
__props__.__dict__["vault_id"] = vault_id
return BackupPolicyDisk(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="backupRepeatingTimeIntervals")
def backup_repeating_time_intervals(self) -> pulumi.Output[Sequence[str]]:
"""
Specifies a list of repeating time interval. It should follow `ISO 8601` repeating time interval . Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "backup_repeating_time_intervals")
@property
@pulumi.getter(name="defaultRetentionDuration")
def default_retention_duration(self) -> pulumi.Output[str]:
"""
The duration of default retention rule. It should follow `ISO 8601` duration format. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "default_retention_duration")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
The name which should be used for this Backup Policy Disk. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="retentionRules")
def retention_rules(self) -> pulumi.Output[Optional[Sequence['outputs.BackupPolicyDiskRetentionRule']]]:
"""
One or more `retention_rule` blocks as defined below. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "retention_rules")
@property
@pulumi.getter(name="vaultId")
def vault_id(self) -> pulumi.Output[str]:
"""
The ID of the Backup Vault within which the Backup Policy Disk should exist. Changing this forces a new Backup Policy Disk to be created.
"""
return pulumi.get(self, "vault_id")
| 53.391608 | 249 | 0.681161 | 2,621 | 22,905 | 5.748188 | 0.085082 | 0.069361 | 0.050179 | 0.044139 | 0.880127 | 0.868114 | 0.85079 | 0.838112 | 0.832869 | 0.812757 | 0 | 0.010184 | 0.232657 | 22,905 | 428 | 250 | 53.516355 | 0.847007 | 0.442436 | 0 | 0.642512 | 1 | 0 | 0.150083 | 0.097497 | 0 | 0 | 0 | 0 | 0 | 1 | 0.154589 | false | 0.004831 | 0.033816 | 0 | 0.280193 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f9c68335546180892aabd8167939a87a94e83696 | 4,760 | py | Python | DES.py | xyhlk520/GPG-Decrypt | 43efec3890b5ff9c905180eef6778a94f10bb6b0 | [
"Apache-2.0"
] | null | null | null | DES.py | xyhlk520/GPG-Decrypt | 43efec3890b5ff9c905180eef6778a94f10bb6b0 | [
"Apache-2.0"
] | null | null | null | DES.py | xyhlk520/GPG-Decrypt | 43efec3890b5ff9c905180eef6778a94f10bb6b0 | [
"Apache-2.0"
] | null | null | null | import DES_core as _core
def encrypt(data, key, mode="ECB", iv=None):
    if mode == "CBC" or mode == "OFB":
        iv = _core._bytes_to_bit_array(iv)
    if mode == "ECB" or mode == "CBC":
        data = _core._add_padding(data)
    subkeys = _core._generate_subkeys(key)
    data = _core._bytes_to_bit_array(data)
    result = []
    for pt_block in _core._nsplit(data, 64):
        if mode == "ECB":
            ct_block = _core._encrypt_block(pt_block, subkeys)
            result += ct_block
        elif mode == "CBC":
            # chain: XOR the plaintext block with the previous ciphertext block (or the IV)
            ct_block = _core._xor(pt_block, iv)
            ct_block = _core._encrypt_block(ct_block, subkeys)
            iv = ct_block
            result += ct_block
        elif mode == "OFB":
            # keystream: repeatedly encrypt the IV, then XOR it with the plaintext block
            iv = _core._encrypt_block(iv, subkeys)
            ct_block = _core._xor(pt_block, iv)
            result += ct_block
        else:
            raise ValueError("The mode '{}' is invalid".format(mode))
    result = _core._bit_array_to_bytes(result)
    return result
def decrypt(data,key,mode="ECB",iv=None):
subkeys = _core._generate_subkeys(key)
if mode == "ECB" or mode == "CBC":
subkeys = list(reversed(subkeys))
if mode == "CBC" or mode == "OFB":
iv = _core._bytes_to_bit_array(iv)
data = _core._bytes_to_bit_array(data)
result = []
for ct_block in _core._nsplit(data,64):
if mode == "ECB":
pt_block = _core._encrypt_block(ct_block, subkeys)
result += pt_block
elif mode == "CBC":
pt_block = _core._encrypt_block(ct_block, subkeys)
pt_block = _core._xor(pt_block, iv)
iv = ct_block
result += pt_block
elif mode == "OFB":
iv = _core._encrypt_block(iv,subkeys)
pt_block = _core._xor(ct_block, iv)
result += pt_block
else:
raise ValueError("The mode '{}' is invalid".format(mode))
result = _core._bit_array_to_bytes(result)
if mode == "ECB" or mode == "CBC":
result = _core._remove_padding(result)
return result
def tdencrypt(data, key, mode="ECB", iv=None):
    # triple DES (EDE): split the key into three DES keys; the middle pass runs its
    # key schedule in reverse (a DES decryption), the inverse of tddecrypt below
    key1, key2, key3 = _core._split_encryption_keys(key)
    subkeys1 = _core._generate_subkeys(key1)
    subkeys2 = list(reversed(_core._generate_subkeys(key2)))
    subkeys3 = _core._generate_subkeys(key3)
    if mode == "CBC" or mode == "OFB":
        iv = _core._bytes_to_bit_array(iv)
    if mode == "ECB" or mode == "CBC":
        data = _core._add_padding(data)
    data = _core._bytes_to_bit_array(data)
    result = []
    for pt_block in _core._nsplit(data, 64):
        if mode == "ECB":
            ct_block = _core._encrypt_block(pt_block, subkeys1)
            ct_block = _core._encrypt_block(ct_block, subkeys2)
            ct_block = _core._encrypt_block(ct_block, subkeys3)
            result += ct_block
        elif mode == "CBC":
            ct_block = _core._xor(pt_block, iv)
            ct_block = _core._encrypt_block(ct_block, subkeys1)
            ct_block = _core._encrypt_block(ct_block, subkeys2)
            ct_block = _core._encrypt_block(ct_block, subkeys3)
            iv = ct_block
            result += ct_block
        elif mode == "OFB":
            iv = _core._encrypt_block(iv, subkeys1)
            iv = _core._encrypt_block(iv, subkeys2)
            iv = _core._encrypt_block(iv, subkeys3)
            ct_block = _core._xor(pt_block, iv)
            result += ct_block
        else:
            raise ValueError("The mode '{}' is invalid".format(mode))
    result = _core._bit_array_to_bytes(result)
    return result
def tddecrypt(data,key,mode="ECB",iv=None):
key1,key2,key3 = _core._split_encryption_keys(key)
subkeys1 = _core._generate_subkeys(key1)
subkeys2 = _core._generate_subkeys(key2)
subkeys3 = _core._generate_subkeys(key3)
if mode == "OFB":
subkeys2 = list(reversed(subkeys2))
else:
subkeys1 = list(reversed(subkeys1))
subkeys3 = list(reversed(subkeys3))
if mode == "CBC" or mode == "OFB":
iv = _core._bytes_to_bit_array(iv)
data = _core._bytes_to_bit_array(data)
result = []
for ct_block in _core._nsplit(data,64):
if mode == "ECB":
pt_block = _core._encrypt_block(ct_block, subkeys3)
pt_block = _core._encrypt_block(pt_block, subkeys2)
pt_block = _core._encrypt_block(pt_block, subkeys1)
result += pt_block
elif mode == "CBC":
pt_block = _core._encrypt_block(ct_block, subkeys3)
pt_block = _core._encrypt_block(pt_block, subkeys2)
pt_block = _core._encrypt_block(pt_block, subkeys1)
pt_block = _core._xor(pt_block, iv)
iv = ct_block
result += pt_block
elif mode == "OFB":
iv = _core._encrypt_block(iv,subkeys1)
iv = _core._encrypt_block(iv,subkeys2)
iv = _core._encrypt_block(iv,subkeys3)
pt_block = _core._xor(ct_block, iv)
result += pt_block
else:
raise ValueError("The mode '{}' is invalid".format(mode))
result = _core._bit_array_to_bytes(result)
if mode == "ECB" or mode == "CBC":
result = _core._remove_padding(result)
return result | 35.522388 | 69 | 0.584244 | 605 | 4,760 | 4.209917 | 0.095868 | 0.087947 | 0.10051 | 0.09894 | 0.868473 | 0.838634 | 0.815862 | 0.815862 | 0.810365 | 0.810365 | 0 | 0.009639 | 0.302521 | 4,760 | 134 | 70 | 35.522388 | 0.75753 | 0 | 0 | 0.846154 | 0 | 0 | 0.042218 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034188 | false | 0 | 0.008547 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f9d6fe3a30f998fe9dc57a8f10189d4e874bae0d | 16,827 | py | Python | code/GenGraph/bwmorph.py | DeVriesMatt/VGN | 31afd45f1c7d339ec6dacdfd21f4258bd77e603c | [
"MIT"
] | 70 | 2019-10-02T09:12:22.000Z | 2022-03-21T07:50:30.000Z | code/GenGraph/bwmorph.py | DeVriesMatt/VGN | 31afd45f1c7d339ec6dacdfd21f4258bd77e603c | [
"MIT"
] | 14 | 2020-02-05T13:38:51.000Z | 2021-08-09T12:28:28.000Z | code/GenGraph/bwmorph.py | DeVriesMatt/VGN | 31afd45f1c7d339ec6dacdfd21f4258bd77e603c | [
"MIT"
] | 28 | 2019-10-04T23:27:36.000Z | 2021-12-28T07:56:32.000Z | # Duplication of 'bwmorph' in matlab
# referred to
# https://gist.github.com/joefutrelle/562f25bbcf20691217b8
import numpy as np
from scipy import ndimage as ndi
OPS = ['dilate', 'fill', 'thin', 'branchpoints', 'endpoints']
# lookup tables
LUT_THIN_1 = ~np.array([0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,0,1,1,0,0,1,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,0,0,1,1,0,0,1,1,0,0,1,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,0,0,0,1,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,0,0,1,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,0,0,0,1,0,0,1,1,0,0,1,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,0,0,0,1,0,0,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1], dtype=np.bool)
LUT_THIN_2 = ~np.array([0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,0,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,0,1,0,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,0,0,1,0,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,0,1,1,1,0,0,1,1,0,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,0,0,1,0,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,0,1,1,1,0,0,1,1,0,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,0,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,0,0,1,0,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,0,1,1,1,0,0,1,1,0,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,1,1,1,1,0,0,1,0,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,0,1,1,1,0,0,1,1,0,1,1,1], dtype=np.bool)
LUT_ENDPOINTS = ~np.array([0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,0,1,1,1,1,0,1,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,1,0,1,1,0,0,0,0,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,0,0,0,0,1,1,0,1,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,0,0,0,0,1,1,0,1,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,1,0,1,1,0,0,0,0,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,1,0,1,1,0,0,0,0,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,0,0,0,0,1,1,0,1,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,1,0,1,1,1,1,0,1,1,1,1,0], dtype=np.bool)
LUT_BRANCHPOINTS = ~np.array([0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,1,0,0,0,1,0,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,1,0,1,1,1,0,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,1,0,1,1,1,0,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,1,0,1,1,1,0,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,1,0,1,1,1,0,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1], dtype=np.bool)
LUT_BACKCOUNT4 = np.array([0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,2,1,1,1,1,2,1,2,2,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,1,2,1,1,2,2,3,2,2,2,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,2,3,2,2,1,1,2,1,2,2,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,2,3,2,2,2,2,3,2,2,2,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,2,3,2,2,2,2,3,2,3,3,3,2,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,2,3,2,2,3,3,4,3,3,3,3,2,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,2,3,2,2,1,1,2,1,2,2,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,2,3,2,2,2,2,3,2,2,2,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,2,3,2,2,2,2,3,2,3,3,3,2,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,1,2,1,1,2,2,3,2,2,2,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,3,4,3,3,2,2,3,2,3,3,3,2,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,2,3,2,2,2,2,3,2,2,2,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,2,3,2,2,2,2,3,2,3,3,3,2,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,1,2,1,1,2,2,3,2,2,2,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,2,3,2,2,1,1,2,1,2,2,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,1,2,1,1,1,1,2,1,1,1,1,0], dtype=np.uint8)
LUT_DILATE = np.array([0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1], dtype=np.bool)
LUT_FILL = np.array([0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1], dtype=np.bool)
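# Note on the lookup tables above: each entry is indexed by a 3x3 binary
# neighbourhood encoded column-wise with the weights used in bwmorph() below
# ([1,8,64] / [2,16,128] / [4,32,256]), giving indices 0..511. For example, an
# isolated foreground pixel (all eight neighbours zero) maps to index 16, the
# centre weight.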
def bwmorph(image, op, n_iter=None):
# check parameters
if op not in OPS:
raise ValueError('Undefined OP is used')
if n_iter is None:
n = -1
elif n_iter <= 0:
raise ValueError('n_iter must be > 0')
else:
n = n_iter
# check that we have a 2d binary image, and convert it
    # to uint16
bw = np.array(image).astype(np.uint16)
if bw.ndim != 2:
raise ValueError('2D array required')
if not np.all(np.in1d(image.flat,(0,1))):
raise ValueError('Image contains values other than 0 and 1')
# neighborhood mask
mask = np.array([[ 1, 8, 64],
[ 2, 16,128],
[ 4, 32,256]],dtype=np.uint16)
# iterate either 1) indefinitely or 2) up to iteration limit
while n != 0:
before = np.sum(bw) # count points before thinning
if op == 'dilate':
bw[np.take(LUT_DILATE, ndi.correlate(bw, mask, mode='constant'))] = 1
elif op == 'fill':
bw[np.take(LUT_FILL, ndi.correlate(bw, mask, mode='constant'))] = 1
elif op == 'thin':
# for each subiteration
for lut in [LUT_THIN_1, LUT_THIN_2]:
# correlate image with neighborhood mask
N = ndi.correlate(bw, mask, mode='constant')
# take deletion decision from this subiteration's LUT
D = np.take(lut, N)
# perform deletion
bw[D] = 0
elif op == 'branchpoints':
# Initial branch point candidates
C = np.copy(bw)
C[np.take(LUT_BRANCHPOINTS, ndi.correlate(bw, mask, mode='constant'))] = 0
C = C.astype(np.bool)
# Background 4-Connected Object Count (Vp)
B = np.take(LUT_BACKCOUNT4, ndi.correlate(bw, mask, mode='constant'))
# End Points (Vp = 1)
E = (B == 1)
# Final branch point candidates
F = (~E)*C
# Generate mask that defines pixels for which Vp = 2 and no
# foreground neighbor q for which Vq > 2
# Vp = 2 Mask
Vp = ((B == 2) & (~E))
# Vq > 2 Mask
Vq = ((B > 2) & (~E))
# Dilate Vq
D = np.copy(Vq)
D[np.take(LUT_DILATE, ndi.correlate(Vq, mask, mode='constant'))] = 1
# Intersection between dilated Vq and final candidates w/ Vp = 2
M = (F & Vp) & D
# Final Branch Points
bw = F & (~M)
break
elif op == 'endpoints':
# correlate image with neighborhood mask
N = ndi.correlate(bw, mask, mode='constant')
# take deletion decision from the LUT
D = np.take(LUT_ENDPOINTS, N)
# perform deletion
bw[D] = 0
else:
pass
after = np.sum(bw) # count points after thinning
if before == after:
# iteration had no effect: finish
break
# count down to iteration limit (or endlessly negative)
n -= 1
return bw.astype(np.bool) | 47.533898 | 86 | 0.331194 | 4,066 | 16,827 | 1.364732 | 0.038121 | 0.598666 | 0.826635 | 1.021445 | 0.750225 | 0.73761 | 0.71202 | 0.706794 | 0.706794 | 0.691476 | 0 | 0.381613 | 0.430499 | 16,827 | 354 | 87 | 47.533898 | 0.197433 | 0.058656 | 0 | 0.739286 | 0 | 0 | 0.013979 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.003571 | false | 0.003571 | 0.007143 | 0 | 0.014286 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
fb187958c03307f1177760015795da235775755f | 32,239 | py | Python | Python_Code/Atom/pp.py | TomerEven/Pocket_Dictionary | 29f906855f5e52eec61d00cb6e8fa0fb5e90a8e4 | [
"Apache-2.0"
] | 2 | 2019-12-10T03:41:02.000Z | 2021-11-14T12:30:36.000Z | Python_Code/Atom/pp.py | TomerEven/Pocket_Dictionary | 29f906855f5e52eec61d00cb6e8fa0fb5e90a8e4 | [
"Apache-2.0"
] | null | null | null | Python_Code/Atom/pp.py | TomerEven/Pocket_Dictionary | 29f906855f5e52eec61d00cb6e8fa0fb5e90a8e4 | [
"Apache-2.0"
] | 2 | 2019-12-10T03:41:07.000Z | 2020-06-02T16:57:31.000Z | from math import *
from random import *
LENGTH = 8
# def number_to_32
def to_bin_with_leading_zeros(n: int, length: int) -> str:
if n >= 0:
s = bin(n)[2:]
else:
s = bin(n)[3:]
diff = length - len(s)
if diff > 0:
return "0" * diff + s
elif diff == 0:
return s
else:
assert False
def to_bin_with_leading_zeros2(n: int, length: int) -> str:
if n >= 0:
s = bin(n)[2:]
else:
s = bin(n)[3:]
diff = length - len(s)
if diff > 0:
return "0" * diff + s
return s
# elif diff == 0:
# return s
# else:
# assert False
def my_bin(n: int, length: int = LENGTH) -> str:
return to_bin_with_leading_zeros(n, length)
def my_bin2(n: int, length: int = LENGTH) -> str:
return to_bin_with_leading_zeros(n, length)
def pp_s_bin(s: str):
temp = [s[i*LENGTH:(i+1)*LENGTH] for i in range(len(s) // LENGTH)]
temp = [i[:LENGTH//2] + "." + i[LENGTH//2:] for i in temp]
return "|".join(temp)
def pp_bin(n: int, length: int = 32):
return pp_s_bin(my_bin(n, length))
# temp = [i[:LENGTH//2] + "." + i[LENGTH//2:] for i in temp]
# return "|".join(temp)
def pp_bin_inv(s: str) -> int:
temp = [i for i in s if i == '0' or i == '1']
s = "".join(temp)
return int(s, 2)
def split_by_fp(cm: str, fp_size: int) -> list:
# todo plus 1 problem?/home/tomer
return [int(cm[i*fp_size:(i+1)*fp_size], 2) for i in range(len(cm) // fp_size)]
def split_by_fp_inv(l: list, fp_size: int) -> str:
# todo plus 1 problem?
temp = [my_bin(i, fp_size) for i in l]
temp_s = "".join(temp)
temp_s += "0"*(32-len(temp_s))
return temp_s
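# Illustration of the two helpers above (values chosen only for the example):
#     split_by_fp("00001000", 4)   -> [0, 8]
#     split_by_fp_inv([0, 8], 4)   -> "00001000" followed by zero-padding to 32 bits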
def list_to_consecutive_memory(l: list, length: int = LENGTH) -> str:
return "".join([(my_bin(i)) for i in l])
    # (unreachable after the return above)
    # temp = [i[:LENGTH//2] + "." + i[LENGTH//2:] for i in temp]
    # return "|".join(temp)
# return "|".join([(my_bin(i)) for i in l])
def cons_mem_print(s: str) -> str:
assert len(s) % LENGTH == 0
temp = [s[i*LENGTH:(i+1)*LENGTH] for i in range(len(s) // LENGTH)]
temp = [i[:LENGTH//2] + "." + i[LENGTH//2:] for i in temp]
res = "|".join(temp)
print("", res)
return res
def number_to_split_run_list(n: int, fp_size: int) -> list:
s = bin(n)[2:]
if len(s) < 32:
s = my_bin(n, 32)
return split_by_fp(s, fp_size)
def my_bit_count(n: int) -> int:
s = bin(n)[2:]
return sum([1 for i in range(len(s)) if s[i] == '1'])
def b32_to_b64(w1: int, w2: int) -> int:
assert ((0 <= w1 < (1 << 32)) and (0 <= w2 < (1 << 32)))
return (w1 << 32) | w2
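# Quick sanity examples for the formatting helpers above:
#     my_bin(5, 8)       -> "00000101"
#     pp_bin(5, 16)      -> "0000.0000|0000.0101"
#     b32_to_b64(1, 2)   -> 0x100000002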
#
# def vector_get_inteval(vec:str, quotient:int)->tuple:
# zero_counter = 0
# continue_from_index = 0
# start_index = -1
# end_index = -1
#
# for i in range(len(vec)):
# if zero_counter >= quotient - 1:
# start_index = i
# continue_from_index = i
# break
# if vec[i] == '0':
# zero_counter += 1
#
# # print("h continue_from_index {:}".format( continue_from_index))
# for i in range(continue_from_index, len(vec)):
# if vec[i] == '0':
# return start_index, i
# # end_index = i
#
# # zero_counter += 1
# # if zero_counter == quotient + 1:
# # end_index = i
# # return start_index, end_index
# assert False
64 + 7*64
def naive_get_interval(vec: str, quotient: int) -> tuple:
if quotient == 0:
if vec[0] == '0':
return 0, 0
else:
return 0, vec.find('0')
indexes = [i for i in range(len(vec)) if vec[i] == '0']
return indexes[quotient - 1] + 1, indexes[quotient]
def naive_get_interval_from_int(n: int, quotient: int) -> tuple:
return naive_get_interval(my_bin(n, 32), quotient)
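# Illustration of the interval helpers above (indices refer to positions in the
# bit string):
#     naive_get_interval("10100", 0) -> (0, 1)
#     naive_get_interval("10100", 1) -> (2, 3)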
# def b32_get_interval(n:int, quotient:int)->tuple:
# return vector_get_inteval(my_bin(n,32),quotient)
3221225472 >> 22
3221225472 & ((1<<32) - 1)
bin(768)
# conv_32b_to_64b
bin(850)[2:]
bin(3911)[2:]
f = 240
l = 12
192*2 / 32
f2 = 192
l2 = 11
q2 = 62
s2 ="000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000011000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"
s2[12*32:12*32 + 32]
int(s2[12*32:12*32 + 32],2)
"000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000011000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"
s2[f2*2:f2*2 +l2]
int(s2[f2*2:f2*2 +l2],2)
s2.find('11')
# s2[63:].index('1') + 63
184*2
s2[120]
s = "000000000000000000000000000000100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000010010011101000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000(q2 + 1)*000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"
s[30]
s[f * 2:f * 2 + l]
int(s[f * 2:f * 2 + l], 2)
s[38 * 2:38 * 2 + l]
int(s[38 * 2:38 * 2 + l],2)
s = "000000000000000000000000000000001"
len(s)
s.count('0')
assert False
x = 4827189762586116096
pp_bin(x,64)
pp_bin(y,64)
y = (x<<15) & ((1<<65) - 1)
# def __pdep32(a, mask):
# tmp := a
# dst := 0
# m := 0
# k := 0
# DO WHILE m < 32
# IF mask[m] == 1
# dst[m] := tmp[k]
# k := k + 1
# FI
# m := m + 1
# OD
def get_bit(x, i):
return x & (1 << i)
# def set_bit(x, i, b):
# return x &
def pdep(a, mask, size: int = 32):
    # Deposit the low bits of `a` into the bit positions where `mask` is set,
    # following the _pdep_u32 pseudocode in the comment above.
    dst = 0
    k = 0
    b = 1
    for i in range(size):
        if mask & b:
            if a & (1 << k):
                dst |= b
            k += 1
        b <<= 1
    return dst
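# Sanity check for pdep above (illustrative values):
#     pdep(0b101, 0b11010) -> 0b10010  (== 18)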
def count_ones(x):
r = 0
temp = bin(x)[2:].count('1')
while x:
r += x & 1
x >>= 1
assert (r == temp)
return r
count_ones(1)
x = 1
bin(x)[2:].count('1')
def get_lowerest_bit(x):
return x & -x
for i in range(65):
print(i, f(i), count_ones(i))
pp_bin(113983158044280366, 64)
pp_bin(113980000000000000, 64)
pp_bin(72685035695, 64)
pp_bin(322613908935, 64)
pp_bin(13497929568569990414, 64)
pp_bin(2147483648)
pp_bin(1107591176)
naive_get_interval_from_int(1107591176, 11)
pp_bin(1107361809)
pp_bin(1107361808)
b32_get_interval(1107591176, 11)
pp_bin(67108864)
w1 = 1107361792
pp_bin(1107361792)
pp_bin(1107591168)
b32_get_interval(1107361792, 11)
w1, w2 = 1107591176, 2147483648
# w1, w2 ==
pp_bin(w1)
s1 = my_bin(w1, 32)
s2 = my_bin(w2, 32)
vector_get_inteval(s1 + s2, 24)
b32_to_b64(w1, w2)
w1, w2 = 67108864, 0
my_bin(w1, 32)
s = my_bin(w1, 32)
zeros = [i for i in range(len(s)) if s[i] == '0']
zeros[27]
my_bin(1107296256, 32)
pp_bin(1107296256)
# my_bin(1299600389,32)[10]
# end = 42
# pp_bin(945117276)
# pp_bin(942321848)
# pp_bin(950710456)
#
# my_bin(2147483647,32)
# my_bin(1967513926,32)
# my_bin(1787544204,32)
# end = 35
# pp_bin(1102520059)
# pp_bin(1356566397)
# pp_bin(1222348669)
#
#
#
# my_bin(847090147,32)
# pp_bin(846930886)
# pp_bin(847090147)
# pp_bin(844992995)
# s
# s = pp_bin(not_mask,64)[32:]
# pp_bin(mask)
# mask = 4194303
# not_mask = 18446744073705357312
# bin(18446744069414584320)
# mask32 = (1<<32) - 1
# s = "00110010011110110010001111000110011001000011110010011000011010010110011000110011010010000111001101110100101100001101110001010001"
# a_i = 1714636915
# slot = 7364309478734299135
# (a_i << 32) + mask32 == slot
# s[70:80]
# s[:32].count('0')
# s[32:32*2].count('0')
# s[:72].count('0')
# s[:73].count('0')
# cons_mem_print(s)
# 1<<32
# 4294967296 - 1
#
# my_bit_count(32)
# my_bit_count(32 + 16)
# my_bit_count(32 + 16 + 8 + 4 + 1 + 2358)
#
# 88 % 32
# 1<<(32 - (87 % 32))
# assert False
#
# v1 = 262144
# v2 = 524288
#
# 1<<18
# 1<<19
# my_bin(v1,32)
# my_bin(v2,32)
# v1 *2 == v2
# fp73 = 28
# before = "00101110100010010100010010100000100111001111100100101110000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"
#
# after = "00101110100010010100010010101001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"
#
# mid = "00101110100010010100010010101001100111001111100100101110000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"
#
# vec = "0010111010001001010001001010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"
#
# for i in range(len(vec)):
# if vec[i] != mid[i]:
# print(i)
#
# pp_bin(780747936, 32)
#
# int(before[28:28 + 28],2)
# mid == vec
# pp_bin(b_list[1],fp73)
# pp_bin(m_list[1],fp73)
# pp_bin(a_list[1],fp73)
# pp_bin(v_list[1],fp73)
# len(b_list)
# b_list[:5]
# m_list[:5]
# a_list[:5]
# v_list[:5]
# b_list = split_by_fp(before, fp73)
# m_list = split_by_fp(mid, fp73)
# a_list = split_by_fp(after, fp73)
# v_list = split_by_fp(vec, fp73)
# before[:32]
# mid[:32]
#
# log2(103265643)
# l64 = [before[:64], mid[:64], after[:64], vec[:64]]
# for i in l64:
# print(i)
# for i in l64:
# print(pp_s_bin(i[28:]))
#
# before[:64]
# mid[:64]
# mid == before
# after == mid
# 56 // 32
# 56 % 32
# before[:32]
# mid[:32]
# res = vec[:64]
# res == before[:64]
# res == mid[:64]
# res == after[:64]
# vec == after
# vec[32:].count("1")
# vec[:32]
# after[:32]
#
#
# t = "10111110111111000100111110111011111"
# t[-fp73:]
#
# mid == after
# len(mid)
# len(after)
# len(before)
# mid_before_diff = [i for i in range(len(after)) if mid[i] != before[i]]
# mid_before_diff
# s1 = before[:64]
# s2 = after[:64]
#
# pp_s_bin(s1)
# pp_s_bin(s2)
#
# len(before)
# len(after)
# diff = [i for i in range(len(after)) if after[i] != before[i]]
# len(diff)
# diff
# before == after
# for i in range(len(after)):
# if i
#
#
# my_bin(780747936,32)
# pp_bin(780747936)
# pp_bin(780747945)
# pp_bin(10287406fo)
#
# s = "0000100010000110010000101001100011101000"
# len(s)
# s1 = s[:32]
# s2 = s[32:]
# s3 = s2 + "0"*24
#
#
# s2 = my_bin(int(s[32:],2),32)
# s2
# int(s2,2)
# vl = [143016600, 3892314112]
# int(s[:32],2)
# int(s1[32:],2)
# # log2(3758096384)
# # 3758096384 < ((1<<32) - 1)
# # l = [143016600, 3758096384]
# # s1 = my_bin(l[0],32) + my_bin(l[1],32)
# # s1
# # s3 = "0000100010000110010000101001100011100000"
# # s3[32 + 29: + 5]
# # s3[32 + 3: 32 + 3 + 5]
# # len(s3)
# # s3[6*5:7*5]
# # 30
# # pp_s_bin(s1)
# # s1[4]
# # s1[5]
# # s1[6]
# # s1[7]
# # split_s1 = s1.split("0")
# # runs = [i + '0' for i in split_s1]
# # runs
#
# s2 = split_by_fp_inv([1,2,3,4,5,6,7,0],5)
# s1
# s2
# indexes = [i for i in range(len(s1)) if s1[i] == '0']
# sx = "0000000001000100001100100001010011000000"
# sy = '0000000001000100001100100001010011011000'
# log2(4194304)
# 1<<22
#
#
# temp_l = sx.split("0")
# temp_l = [i + "0" for i in temp_l]
# temp_l
# int(sx[32:],2)
# int(sy[32:],2)
#
# split_by_fp(s,5)
# split_by_fp_inv([0, 1, 2, 3, 4, 5, 6, 24],5)
# # l = [1 << i for i in range(8)]
# # l
# # list_to_consecutive_memory(l)
# # list_to_consecutive_memory(l)
# # k = list_to_consecutive_memory(l)
# # k
# # cons_mem_print(k)
# #
# #
# # def conse_p(l: str):
# # pass
# #
# #
# # # example:
# # x = 406847488
# # s = pp_bin(x)
# # s
# # pp_bin_inv(s)
# # x == pp_bin_inv(s)
# #
# # res = split_by_fp(my_bin(x, 32), 5)
# # res
# # split_by_fp_inv(res, 5) == my_bin(x, 32)
# # res_inv = split_by_fp_inv(res, 5)
# # res_inv
# # pp_s_bin(res_inv)
# #
# # # end
# #
# # y = 406847488
# # pp_bin(x)
# # pp_bin(y)
# #
# # # step 1
# # res1 = [1] + [0]*7
# # s_res1 = split_by_fp_inv(res1, 5)
# # s_res1
# # s_res1_1 = s_res1[:32]
# # int(s_res1_1, 2)
# # pp_bin(134217728)
# # # temp42 = pp_bin(34359738368)
# # # pp_s_bin(s_res1)
# # pp_bin_inv('0000.1000|0000.0000|0000.0000|0000.0000')
# # # pp_bin(34359738368)
# # s3 = pp_bin(x)
# # s3
# # s2 = '00011000010000000000000000000000'
# #
# # x = 134217728
# # y = 134217728
# # pp_bin(x)
# #
# #
# # # step 2
# #
# # res2 = [1, 2] + [0]*6
# # s_res2 = split_by_fp_inv(res2, 5)
# # pp_s_bin(s_res2)
# # y = int(s_res2[:32],2)
# # y
# #
# # x = 142606336
# # x == y
# #
# # pp_bin(x)
# # pp_bin(y)
# #
# #
# # # step 3
# #
# # res3 = [1, 2, 3] + [0]*5
# # s_res3 = split_by_fp_inv(res3, 5)
# # pp_s_bin(s_res3)
# # y = int(s_res3[:32],2)
# # y
# #
# # x = 415498240
# # x == y
# #
# # pp_bin(x)
# # pp_bin(y)
# #
# # number_to_split_run_list(x,5)
# # number_to_split_run_list(y,5)
# #
# # number_to_split_run_list(701216768,3)
# #
# #
# #
# #
# #
# # r7 = list(range(8))
# # r7
# # s_r7 = split_by_fp_inv(r7, 3)
# # pp_s_bin(s_r7)
# # y = int(s_r7[:32],2)
# # y
# #
# # x = 701216768
# # x == y
# #
# # pp_bin(x)
# # pp_bin(y)
# #
# # number_to_split_run_list(x,5)
# # number_to_split_run_list(y,5)
# #
# #
# #
# #
# #
# #
# #
# #
# #
# #
# #
# # res3 = [1,2,3] + [0]*5
# # s_res = split_by_fp_inv(res3, 5)
# # s_res
# # s_res3_1 = s_res[:32]
# # s_res3_2 = s_res[32:]
# # int(s_res1, 2)
# # int(s_res3_1, 2)
# # int(s_res3_2, 2)
# #
# # pp_s_bin(s_res3_1)
# #
# # s3 = pp_bin(x)
# # s3
# # s2 = '00011000010000000000000000000000'
# #
# # x = 134217728
# # y = 134217728
# # pp_bin(x)
# #
# #
# #
# # int(s2, 2) == x
# # # for i in range(1000,2000):
# # # assert(pp_bin_inv(pp_bin(i)) == i)
# # l3 = [i for i in s3 if i == '0' or i == 1]
# # l3
# #
# #
# # a = list(range(16))
# # b = [str(i) for i in a]
# # c = "".join(b)
# # c
# #
# #
# # res = split_by_fp(my_bin(x, 32), 5)
# # split_by_fp_inv(res, 5) == my_bin(x, 32)
# #
# # [1, 2] + [0]*6
# # des_res = split_by_fp_inv([1, 2] + [0]*6, 5)
# # pp_s_bin(des_res)
# # pp_bin(x)
# #
# #
# # y= 1<<27
# # y >> 32
# # 0<<32
# # up = 134217728
# # pp_bin(up)
#
# 204 % 28
| 49.905573 | 3,375 | 0.833028 | 2,100 | 32,239 | 12.606667 | 0.121905 | 0.011332 | 0.007026 | 0.007064 | 0.081854 | 0.062099 | 0.047934 | 0.042797 | 0.042268 | 0.033807 | 0 | 0.747701 | 0.089116 | 32,239 | 645 | 3,376 | 49.982946 | 0.153817 | 0.56748 | 0 | 0.185629 | 0 | 0 | 0.631425 | 0.629624 | 0 | 1 | 0 | 0.00155 | 0.02994 | 0 | null | null | 0 | 0.011976 | null | null | 0.017964 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
34acd69dcc0154a0c8b6e89da5da793312a96aaf | 160 | py | Python | userbot/plugins/__init__.py | RiderFA/Dark_Userbot | 480df539bfeae994d59649a54d2478ed24b445bb | [
"MIT"
] | null | null | null | userbot/plugins/__init__.py | RiderFA/Dark_Userbot | 480df539bfeae994d59649a54d2478ed24b445bb | [
"MIT"
] | null | null | null | userbot/plugins/__init__.py | RiderFA/Dark_Userbot | 480df539bfeae994d59649a54d2478ed24b445bb | [
"MIT"
] | null | null | null | from userbot import *
from userbot.utils import *
from userbot.Config import Config
from userbot.helpers.functions import *
from userbot.cmdhelp import CmdHelp
| 26.666667 | 39 | 0.825 | 22 | 160 | 6 | 0.363636 | 0.416667 | 0.386364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 160 | 5 | 40 | 32 | 0.942857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
55148a858982ea22e37919f3a2866db2d9bcd68a | 210,906 | py | Python | projects/src/main/python/CodeJam/Y14R5P1/kusano/generated_py_8361a6aa1aa9434295df0068046a0efe.py | DynamicCodeSearch/CodeSeer | ee985ece7691691585952eb88565f0e08bdc9113 | [
"MIT"
] | 5 | 2020-04-05T18:04:13.000Z | 2021-04-13T20:34:19.000Z | projects/src/main/python/CodeJam/Y14R5P1/kusano/generated_py_8361a6aa1aa9434295df0068046a0efe.py | DynamicCodeSearch/CodeSeer | ee985ece7691691585952eb88565f0e08bdc9113 | [
"MIT"
] | 1 | 2020-04-29T21:42:26.000Z | 2020-05-01T23:45:45.000Z | projects/src/main/python/CodeJam/Y14R5P1/kusano/generated_py_8361a6aa1aa9434295df0068046a0efe.py | DynamicCodeSearch/CodeSeer | ee985ece7691691585952eb88565f0e08bdc9113 | [
"MIT"
] | 3 | 2020-01-27T16:02:14.000Z | 2021-02-08T13:25:15.000Z | import sys
sys.path.append('/home/george2/Raise/ProgramRepair/CodeSeer/projects/src/main/python')
from CodeJam.Y14R5P1.kusano.A import *
def func_423b915d9ce740ba94f31ad435e988c2(D):
a -= 1
B += D[a]
return B
def func_b94e6667fda54c809e409e83d306253b(D):
a -= 1
B += D[a]
return a
def func_01f4c9980c1e402db83c6b2b44957ea7(D, a):
B += D[a]
A -= D[a]
return A
def func_a6da5619e76d4b5b899ae8c98b2a8895(D, a):
B += D[a]
A -= D[a]
return B
def func_48c6027e0d8f49bcb228a602fd904ca2(D):
a -= 1
B += D[a]
A -= D[a]
return B
def func_72f1e170c35643878cee7af17f621942(D):
a -= 1
B += D[a]
A -= D[a]
return A
def func_00e15a6009724540b1e8c65086c6d98d(D):
a -= 1
B += D[a]
A -= D[a]
return a
def func_ca0ad3a583b84900ae17904df72d19ff(D, a):
B -= D[a]
A += D[a]
return B
def func_a73f62a8aaa541148bd6b4a43ca354b3(D, a):
B -= D[a]
A += D[a]
return A
def func_91fea46230c44582b41482c58c57a419(D):
A += D[a]
a += 1
return A
def func_a3989d71f18d4e5c9e40284adbe9f35b(D):
A += D[a]
a += 1
return a
def func_1f157e3e7b0f47e1975c0532cb0e6261(C, B, A):
a += 1
t = max(A, B, C)
return a
def func_226808347d7d4d479eb30ebab4af1533(C, B, A):
a += 1
t = max(A, B, C)
return t
def func_e959cf105f9d4ba49a4e520762b40dc2(D):
B -= D[a]
A += D[a]
a += 1
return B
def func_51cc323a577b4f66b43a43f525231061(D):
B -= D[a]
A += D[a]
a += 1
return A
def func_24493df319fa46949023b48a7293ab0c(D):
B -= D[a]
A += D[a]
a += 1
return a
def func_84115b907e8b45eca7fcbcf566584a30(C, B, D):
A += D[a]
a += 1
t = max(A, B, C)
return t
def func_a871538e9489492a912b0e85bb526a17(C, B, D):
A += D[a]
a += 1
t = max(A, B, C)
return A
def func_d9f777c452154c96aef0c14e1fc564dc(C, B, D):
A += D[a]
a += 1
t = max(A, B, C)
return a
def func_bab03939161f42d098797010f9d79207(C, D):
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
return t
def func_7a1a0ca356414b05ba5071f9a62daaea(C, D):
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
return B
def func_571a239af28d4d5694c5d45bce687b04(C, D):
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
return a
def func_6efed0d2d6cf491aaacbca184552e776(C, D):
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
return A
def func_c2352deb110a442188b2c76a643abc18(D):
b += 1
C -= D[b]
return C
def func_02faa767a67646f6ab5e465012c29197(D):
b += 1
C -= D[b]
return b
def func_8620ab6842af46f4bf11c3744eaf3d25(b, D):
C -= D[b]
B += D[b]
return C
def func_e756efaa56434464a2046204173de711(b, D):
C -= D[b]
B += D[b]
return B
def func_57cbfd0df9a44afca326c0ca7c064031(C, b, D, A):
B += D[b]
p = max(A, B, C)
return B
def func_717ac3ae6ef04ca78327f3d4411a95a0(C, b, D, A):
B += D[b]
p = max(A, B, C)
return p
def func_87d62cd5a7d9423da8e178433d11d0bd(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return t
def func_729f4b0abdee45db92697ad60f22de43(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return B
def func_224443b73bf94833951150b9c0ac39d6(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return A
def func_046e3efa7fa4462f844b092d96dac21a(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return a
def func_bb0e4f90a1d24f84a234dc3da71019cf(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return p
def func_d57a3651beb14ede861a92d8e43841bd(C, b, D):
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_cd1127704f9f4e1e9955f038cf61d483(C, b, D):
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_0b04e22065404608b3ff4537fcebd37b(C, b, D):
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_e15be5cd479f4d02854035982ffe0a59(C, b, D):
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_b0ffc645b8fb4cd38b1f208ee5ba1bd9(C, b, D):
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_d68e7071a5634c2e9b44bcdd8af173b0(C, b, D):
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_94b93592fa214040b0ee2e0724c2c401(D):
b += 1
C -= D[b]
B += D[b]
return B
def func_add4a515daa14ce798537566a9dc9cb2(D):
b += 1
C -= D[b]
B += D[b]
return C
def func_4522067fe54a4a62aa1e715ea4165da7(D):
b += 1
C -= D[b]
B += D[b]
return b
def func_093e85d8dfb84c5195be05ad7e63fa96(b, D, A):
C -= D[b]
B += D[b]
p = max(A, B, C)
return p
def func_e5ac5214bc254084b78cd58ab3edc207(b, D, A):
C -= D[b]
B += D[b]
p = max(A, B, C)
return C
def func_91342846479b47fcaa0fccc8e5941249(b, D, A):
C -= D[b]
B += D[b]
p = max(A, B, C)
return B
def func_06aee82aae7a435e833abc86781dd193(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return A
def func_b0e69cf895814692be2e6194658be459(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return B
def func_4693be6469624743a699ce6802c68bec(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return a
def func_a16820b104c24240a6775804f8ea3fb1(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return t
def func_b4bb4e9f3b4946b6a152524ced22a653(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return p
def func_5045ca50c3b54f33a6342da74145fbff(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_d565540fc63340caa481d7287cdb79e6(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_ead070062a6c4ec2b22cffb9686c5e1a(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_1b1a94425dad491f88b0e9d2dce424e4(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_dc11df51f13c4fa2a71e1e08531d58f5(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_09cdf7823cf347edb7fd107fe0523a92(C, b, D):
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_b7f26f3bd92a4191bb753eaf109ee17b(D, A):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
return C
def func_1237eaa84afb42a6baf3c9bcde60cb80(D, A):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
return B
def func_d352534af53743a8896443e167957ef7(D, A):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
return b
def func_205834f4921f43718794db4aad46e681(D, A):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
return p
def func_4636f88e7e8149509666c4a0f77a6a1c(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return a
def func_60d699f801044335a7cf6cc099672ac3(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return B
def func_4b770b1da3a74355b5bc5f99fc0a5b9c(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return A
def func_7c7dbbb02bfe464b8d77b89aaaa71f35(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return t
def func_ef1bb43a9a0845b8bf9a3f5f451f2781(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return C
def func_60587891540f4b6b8afaec272e0dba02(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return p
def func_eea62e88afa14356a8636655d66ac3ec(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_b0d62419532947a2a882e2ac9396b0fd(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_513cbae0935c4ef78074473a1f9cf460(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_8551c01cd99a4aa08dac538e638cce40(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_d4e7160c5b8c4a78b969f0d4167ee648(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_2bfc4c0e003e4b6899f704ad19903af1(C, b, D):
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_f4d9daf8a11749ff8f0c707bc53825df(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return A
def func_f40259ae0d0e4fe3b6e95d7a123993ab(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return a
def func_0e4e2d99a20a4b868695c927020acbfc(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return p
def func_849e78c2b375411f9e1b999d17e5049a(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return C
def func_48f49f45a01f46e0bb2a8bd1f6b954d6(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return b
def func_e0738f948c1840198b33d6e121c65b35(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return t
def func_8ffda86e7ae44bbaa3de46db46849a5c(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
return B
def func_6f9cec6475e6472bae15bb2231c01194(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_a37aa14bd819438dac3063dd9654ba56(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return C
def func_b00fc1ca89974c498133f3d1695d6780(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_208a626afe914577998941d9c00e1f94(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_601d4fc7c389475786e0d28136cc2f97(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_f25990e5948343ff95324577215b414a(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_d4cc96206bab4a7ba193dd3c4e284415(b, D):
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_7b9db45f94594df7868bfadb4755b35e(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_03f5fe90345c4b18a9bd9083f7e0bf8a(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_204dcbbdd84c4eef90b346df2e11ea0f(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_39e853c236f5459bac8a0ebd3e0ff18a(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return C
def func_161a53e252e443ea9f9af8814d2a1d99(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_ed111e966af1489bbee835645ece951b(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_6eaa604331d447b5a28886ce4bb21e8f(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return b
def func_adba13b2f7634f71b2a19c37eb5a40e5(D):
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_b29c7c0c51ae4ccc8b40ec50c95874cd(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
return D
def func_53990c1ff8fb4e67b40620cf59832eb7(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
return s
def func_ca2c811eaafe4278a5f91eec779dec01(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
return p
def func_e90f23ba841a407eaa962995d020f1d6(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
return q
def func_03ee32faf8fd4cc68b8c3cd88a502774(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
return r
def func_7f04ebfc611541279b5050d64758c0ea(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
return N
def func_0e7b93d693064da0a6fa7083e079557b(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
return i
def func_aa1f8b1e4c6744cd87f013d94f092ed5(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return S
def func_f2d3668677a44defb7001061d31f75ed(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return i
def func_59e368f6ae674b07a58de0d6c80c2266(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return D
def func_0d7ad9bc79a14c84b31f38d7b49af6ea(D):
S = sum(D)
ans = S
return ans
def func_1c4494b20d8b4bf1adff5177a5a78dd8(D):
S = sum(D)
ans = S
return S
def func_380a548337b64bb4953d41ca29e65156(S):
ans = S
A, B, C = 0, 0, S
return B
def func_751dda3a0a28454886eb7e85956f7cd1(S):
ans = S
A, B, C = 0, 0, S
return A
def func_5a6b0a82f228401ba61deca4f3f15ab6(S):
ans = S
A, B, C = 0, 0, S
return C
def func_2eb6863211384db6badad2ae757fa715(S):
ans = S
A, B, C = 0, 0, S
return ans
def func_57223073dc1d4346b65dce4284846f29(S):
A, B, C = 0, 0, S
a = 0
return C
def func_1918663383014a3c8abfcc7b756ae15f(S):
A, B, C = 0, 0, S
a = 0
return B
def func_70abf792ab004a3a8bf85612cfba78e5(S):
A, B, C = 0, 0, S
a = 0
return a
def func_c8b8e6ca02d942208b2e25ab0840405f(S):
A, B, C = 0, 0, S
a = 0
return A
def func_f96ab5b3a26a4cb08127bb2bf665a5e3():
a = 0
b = -1
return b
def func_53508b4c6a304d6583989fe5fe7fb329():
a = 0
b = -1
return a
def func_b695eb1318304200b8f485c498725842(N, D):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return C
def func_397a5d49a7024201b7a9decad1ac2fde(N, D):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_bae34d5f504e4432b44203cd7f212004(N, D):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return b
def func_46b7996263b047d7b40b6dc2c3eee795(N, D):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_1f5e63441c4c4860a1234c4e6e9be9ed(N, D):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_2d14014ffd0846dc99df24e54f62719e(N, D):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_230f966429c04acd811d201ac8f669d5(N, D):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_af2b058b6d114ccb8aae33a07b135a2b(N, D):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_717fd323131744e9b2b6878b3eab0aa4(N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return A
def func_257b34dbfd6d476db9dc00d2406353dc(N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return ans
def func_bfe071e90c25467fbac5195713b79277(N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return b
def func_1d57c7773cca4c6187b3611673df57b4(N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return p
def func_fcd4de94d5a44636ac8141a681ee6529(N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return t
def func_b54b2e75a9fa48dd8a8c5050f200e13b(N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return C
def func_9ffda53a6f4c4516807856f8879f55d3(N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return B
def func_6cf18d7a325a49f6a0079caaef676757(N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return a
def func_43e8d741bce54757a8527db442ec6871(test, S):
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return ans
def func_948ba0681f184124bc73aecdf05d0092(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return p
def func_7ab88615cedf4fa89b02aedc5e3ebb13(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return D
def func_05e01430ad654618999f5817d9d7f9bf(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return N
def func_47f49f214e484c40884800cdd7df9658(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return i
def func_7008cbf76fb4433abd29b4c88d746c3e(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return s
def func_5f6094facfcc4ffb8a4025bd2fa3c79a(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return r
def func_8408ffb7f11d4a4fad1d9b5186beb14e(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return q
def func_0e24c0871df6483f95958b2b7ef393aa(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
return S
def func_52017bba51aa4828a9b21266f3df75f0(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return i
def func_891a64f840ba4ec1b327ee0a946ade85(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return S
def func_5a0faebc7cc24999b8ef857c32f5b053(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return D
def func_fd016bfdf1b949e98586e07fb34cc552(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return ans
def func_540bc82d469f4d72a3ecf7827de5b8f2(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
return S
def func_7bb92fabcad0491293180ad9edff446a(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
return A
def func_aca7977c5ae34ed7a55f54cb27f1e884(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
return C
def func_0130a984633f40349fd74bdcec7887b3(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
return B
def func_73bb2fc31cb14d7b8bddb35332da3947(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
return ans
def func_1e87eeeab94944bf878dc9117b091172(S):
ans = S
A, B, C = 0, 0, S
a = 0
return ans
def func_a9b09644267d49e5aa8161ca59e839c5(S):
ans = S
A, B, C = 0, 0, S
a = 0
return A
def func_8ec5675e55ed44588fa254f0acc351b3(S):
ans = S
A, B, C = 0, 0, S
a = 0
return C
def func_f3fd8108bcaa41ffb94fec3be8c953ec(S):
ans = S
A, B, C = 0, 0, S
a = 0
return B
def func_b0517029f4da413fa31163fb7f537290(S):
ans = S
A, B, C = 0, 0, S
a = 0
return a
def func_183c191e70174e1f81cd67af51859d51(S):
A, B, C = 0, 0, S
a = 0
b = -1
return B
def func_628bf600829e42ab8c75e2b5205aafa5(S):
A, B, C = 0, 0, S
a = 0
b = -1
return A
def func_fb258ef32e1a4b389961fbb8512e81a5(S):
A, B, C = 0, 0, S
a = 0
b = -1
return b
def func_6862c5f0173d414a8e88b3ab0fd0bed4(S):
A, B, C = 0, 0, S
a = 0
b = -1
return C
def func_4402fafd705642168e8b2ec94d690bc1(S):
A, B, C = 0, 0, S
a = 0
b = -1
return a
def func_57d334016a49432ba487d80702120862(N, D):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_4493e7fc793c4315b41bf6cecc20dc17(N, D):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_5920afff56454a0585815cde78877dd3(N, D):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_e45a1ab94ece472497319f4f76d7af4f(N, D):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_6b8065b723224ebd91c5726e70b87aa7(N, D):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return C
def func_f02c6b26409849ad84cf2abb077cba6c(N, D):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return b
def func_4e491f09b6f34a2b88a06a6e1850c1dd(N, D):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_5b292946fb694907bc25f35bda198fd8(N, D):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_258b5273d582480393cba1f5eb64e6ca(N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return b
def func_8bd02b9bd07c44a0bb34bbe0568beb9f(N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return C
def func_b7a2792dbbc64804b12e70c6e089f62f(N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return t
def func_51dff6a48fa14c288b806ad71f80dc3b(N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return B
def func_c224efb5f9fe492da52011ef09108533(N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return p
def func_ad66e90e1261493ead85a07a1e4c7cc0(N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return A
def func_d4497a8cb6a048dbb2d1172e5f395992(N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return a
def func_83ef9e345a154b66965b760bb4ec5aa2(N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return ans
def func_204f2a1d08cd43db8de5da23dc06373a(test, N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return t
def func_2b8115d3162e40b0ab2bee17696fba8a(test, N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return b
def func_0611b3524e184010acf6a96ded350abf(test, N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return B
def func_2c80ba83aa4e48f0947c67e9f41ae37e(test, N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return a
def func_14f4f53474f3432b945676fcd6eed701(test, N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return A
def func_64358314b5134d01875f25dd56bbc46e(test, N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return ans
def func_83824331347148688ec3e3b061f2be2a(test, N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return p
def func_cabad9292f174156993290d4a674bee1(test, N, D, S):
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return C
def func_443a14950525476db36329b0d1edf3fe(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return i
def func_74ba61accbbf48a7b63351658e95e75a(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return s
def func_e610d06af7084e5e8e4a305b4b40e1d8(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return p
def func_3617a13b9e444a95867627c849f65089(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return ans
def func_775a7a1f89194a08b17df16a5adf3e5f(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return q
def func_aa0572463ce94df3a6627d428c1e5e7d(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return r
def func_deb7db4ce536450b885843ea2ad35b2c(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return N
def func_5a40ed02cf41483680c93c94f84b43b0(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return D
def func_6ceaf5f0135b4666bda909bf5191dfcd(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
return S
def func_68091c835c2f4b618cc6029e490a64e9(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return ans
def func_79ab8bddbcb34b36b95d9f837b261851(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return B
def func_483d2a7799c84fd6b3350e18d10ebd23(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return S
def func_5070fce62dba424fb19dd270f045bbd7(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return A
def func_402e686a05114d3f86c927c648241afe(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return i
def func_f44f33e103f543419a1dcb8c9bab2103(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return C
def func_51b005fbf8e74403aa64e1e560f4a121(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return D
def func_d80b5445bcb8423ea115253bdafb1a49(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return S
def func_c5c1cd9252a44a0fb2eb6bbc6cd0f635(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return ans
def func_8b0b1f540a134849aaf84f4e85719f85(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return A
def func_4e279cb869f14253b9cbe076a362d83a(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return C
def func_f40ec97aa3644f42a9c2594f5f938438(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return B
def func_0f932a0c32b9485c925e60dbac99e6f5(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return a
def func_eae41f32b71340c3b66c7a6a5d285881(S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return a
def func_ca7a426aa9444f3abd5c4abf82b5acf1(S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return B
def func_8de30364d1d748998bad9c5638546b12(S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return C
def func_904c3271d17c4de9b2e444b2bbf608b3(S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return b
def func_6b8a5929a0a4401f949aefc5f6257b5e(S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return A
def func_2396a2baecd84eea8232d5140016d6bf(S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return ans
def func_fb33cb9fed75486f8dec617b2a1f691e(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_09d5055fdeb94fa8bad4f7d371486581(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_455be6a458934a9ebbac6910cf5ce21f(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return b
def func_7dd36c38fa874062b025eaaaf8a79a59(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return C
def func_f698eab025f14ab48a0ee6d93383b33e(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_961ae086e67b4957bf9fb8ce7e679ae7(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_76c99b831d36499ca897111baa5be2a9(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_314645dcd1514998a4b6ec54aa9d6e56(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_c484d27f1e4a4aa9a72bfb03a0c515b9(N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return p
def func_3319764df52e4a2981112f4e91efb612(N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return t
def func_2b4f3fd635024a038365a3f10b460a1a(N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return a
def func_d5d3a78783db4755ade1e3476cd397a2(N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return C
def func_f2bf762e911444538048ea0e2fb32d39(N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return B
def func_98be0c08e8154c04a7b5931de028af0b(N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return A
def func_a75deda4154542b19592cf828e1b2e67(N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return ans
def func_bd8f76c1a8e24a16a650774aea6ec77e(N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return b
def func_08ffe01beade426096338bc313c372d3(test, N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return C
def func_26fed2582bc745c6890827ca2b870191(test, N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return ans
def func_5f2f1362d14f4f77aa48e4bc1621c805(test, N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return a
def func_9bb72e15203e4b28803c7cc2de028345(test, N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return A
def func_cd4353fedacf43f9885b4b819bacf057(test, N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return B
def func_ebc6b7bc403447b5b39a07e277c1b68e(test, N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return b
def func_3e35a66e82f646aeb9ce4d1d7f5cee57(test, N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return t
def func_017579d6f05c44bdb34e72c341b84ee1(test, N, D, S):
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return p
def func_73fb18d061d6427db1d00f516dba1fcf(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return C
def func_c0937cc2530640638dbdbc7f54044aa3(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return r
def func_b0523d7e505e437bb0896946e80e7303(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return p
def func_c0fcb93707b441d991212fe104f6e348(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return s
def func_4f32e626d4fd4dd9ac41253d7f125400(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return ans
def func_5dcd061366074b65b053b95018c1ef7e(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return B
def func_1584aee8a94b415ba7eff415908e4881(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return A
def func_3459130adbb643b083cd79db17b6e122(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return N
def func_5a1412a25c4a49b4bd570c295090615d(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return D
def func_a14d16ec22084d64bc34d76d585e2a00(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return S
def func_2950405b74f8438589cbc18255eea9f0(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return i
def func_5982fcfa758d46088321c4f7b49cfdfd(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
return q
def func_b016333e6bc84a69b16c1c6f97be72b5(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return i
def func_e6c2503c5a7c4a88880b325faed50e97(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return A
def func_13f512c18572474eb65c4e2f3331086d(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return C
def func_509af0d1b89043af9edb7fceb7102046(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return B
def func_3d62a59161524034939a156d654b6ff6(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return D
def func_fb8d06d29df9400ba2515361655a7aa2(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return a
def func_db5ce9a299d34d9fbc488da52739bb9a(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return S
def func_89cdb493506a4ec5bf3768f7ada202c1(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return ans
def func_7015030deb014967be9f5506c8daf32a(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return A
def func_0963020a05bc426183d8a91ba75a6b5a(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return b
def func_52acc34ea8b0492f95803240b0820aa8(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return a
def func_cd2cc3ec4e1e49bf987ba6b021edd4dd(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return ans
def func_7f447c5c1e634764a46ebbe8a82bd676(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return C
def func_c9fc8e641877498aac97aeadd677b1ea(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return S
def func_1f922088ad834325a0559d7a74c156e5(D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return B
def func_c4475b735e064052b789831142a23b9f(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_b30c24e8a99a496a9a0fea5e6524d39c(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_36d8baf3471f494c9ce526a9fac9bfd4(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return b
def func_5d80915e721946a797ad1b36d84398a6(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_3759cd1166764c2ead072e067dc11625(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_4b09477054e7473280e82c17ebb2b8c2(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_dd305eb429ac4e1cb4e0e835c244c3d1(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_7de2e01234b242b58913d87e03356a74(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return C
def func_7f71c78daeb34c63858c4ecf84049f56(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return C
def func_729a4a07e89842fdb19edc7e6f44973d(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return B
def func_a772240ad8ab45cf9227696edbfed95e(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return t
def func_085773e7b2db485eada79dfb9202c158(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return b
def func_d15bc591418347f994b7b36f294d4b1c(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return ans
def func_8f9f239a3fd2407993435a7a7e1f646c(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return a
def func_28238e5ff01e4a6e98558710ae4ff1e8(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return p
def func_4cb5baeb8267447b93086eb2015cb68b(N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return A
def func_bacf4995f9474f2a950bb31ee68bf170(test, N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return p
def func_2842535b07d04837bd4205d2184b7b13(test, N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return ans
def func_33ede790103140c5a0e9a47d5c503ec6(test, N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return a
def func_60b63f0d364d45658a4f3224f2a467f7(test, N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return t
def func_9de3e9bece7147cf863f23115e6390cd(test, N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return C
def func_40773cc421594a51b6754632c7764098(test, N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return b
def func_3027784b449841eebf32218f658297f4(test, N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return B
def func_ec45e213612245be9364f70731d368bc(test, N, D, S):
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return A
def func_c20bab9c6e12402b97fb53576edce663(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return ans
def func_010b28a09184443aaae97ab3790e4cc6(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return D
def func_8617f358a098459fa92032a8a7dc9550(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return N
def func_be50d8a759af47d9b89027417a3b745a(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return B
def func_4a496ee6402949f3a8e38b612474f38f(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return a
def func_29ea20dd34144da59f46063b95946b07(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return i
def func_c503def9130748fe95137c124b882aca(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return s
def func_aa843791ac1241b3a5b56971f3ba9b2b(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return p
def func_4734c8fd372a4b978e77d14ac5d02090(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return C
def func_91c1fca34bcd4802a486ab25a155244d(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return r
def func_692eae3cff3c49f8990979f06aec7eb5(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return A
def func_b375b1930de0434b89b036d4f0566625(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return S
def func_7ba51259bf5a46a5b36ff57878b40987(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
return q
def func_162bc3fb60a545909da577eac6bcec4c(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return a
def func_bff3d15956c64112ab9ab56e0b7133e8(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return S
def func_49b25fd17235415696034b60ecf7effa(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return b
def func_a6a50c339ad84d6b8c7f15eff824876e(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return C
def func_b061b79e8ca94e44941abfc134747f71(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return i
def func_163fcbe781964d58b702f095f4d6a85a(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return A
def func_64bc378fa31742589cc65125b8a36e54(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return ans
def func_69970127228947fda91cefe5b14de164(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return B
def func_935086d1c7934bde95f80a8d29b1619c(r, p, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return D
def func_873aa780e0234547a3d97c6677b466ec(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return C
def func_d077cd1976f74c5a96896cd4837ac29c(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_a4fc32284ac341fe8b4af3375eec63b0(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_59612589f617455a9b948b5f08cf1d14(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_2fa450e5301e40f59db6b6a5e4655da8(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_185eefb746f742679e04ba34753568f0(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return S
def func_f9b24485fce54ac0b3504d3d90b51f21(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_fa6ae7351c00497f9c59d65516a0cfa6(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_8434807f302341d4821e7fe7ef8aaa9c(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return b
def func_9937d07bd06e4daead3ac43d97f96f34(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return p
def func_68d29b3b887841e3afad74f5b725f2fc(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return ans
def func_960986d49672403cbee2ccacc3b04f71(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return B
def func_fc60f2eeaf0d46edb7c44c24f213e1d5(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return t
def func_182df9a32a7e41b687aaeb2b19f3296c(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return b
def func_fc21a49efab3477f8e4e95b803166a2e(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return A
def func_8d91ef7bda574d67a7ee1de48e4e317b(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return a
def func_f7794097883c48a2a832357c547e0a74(N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return C
def func_512c7b12097e471c9a86b44a01df14f7(test, N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return p
def func_5b230b19425d4631a08ee8852d2ad07f(test, N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return B
def func_56daceaef4c741239e4967470a991694(test, N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return ans
def func_7d8a91ade46e47d09ca269e3b7a3b353(test, N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return C
def func_dda33e5f53744966b1cd8cd971537dd5(test, N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return A
def func_16e324bda34345d7abc2236a063be04b(test, N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return b
def func_70470d66d47d4369b3dd88d09b4212c5(test, N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return t
def func_b1e690d18b6e486aa457baaadbd1f2ab(test, N, D, S):
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S  # the 'Case #%s: %.16f' % (test + 1, ans) output call was fused onto this line and is lost
return a
def func_f2217ff76c4b4f438b27194af2cbf0a8(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return r
def func_fb0ea2aa707e46afb6bc718169b0ad2f(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return q
def func_76390d49f9df4253b6f77dbbd3f9f180(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return N
def func_6920f0f278534951bf4e077cec76a599(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return C
def func_4a15d76b452046899402553f95306c07(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return S
def func_70b894c9b22542fb961d0d3558b55810(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return ans
def func_efb9c2b022884c268c7db6cca168ae59(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return D
def func_df0537b762034f97a519d7b29aaf54d7(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return b
def func_f16c71150045463297b3e2e8c28718be(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return i
def func_2078346f33e148d8bb3bbc36977a89ba(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return B
def func_d6bb917fd32e475ba4b491b5261704c2(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return A
def func_8b27cd3ebf59480aa2cf4839f196377a(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return s
def func_090e9e20d9eb4f7c833143bfc107afbd(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return a
def func_7bc9c86ea4e549ed99297150fe5b45d8(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
return p
def func_ffccfba1f25e4a608e7a0979f9cd0ae7(r, s, q, N):
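# NOTE: `p` is not a parameter of this variant; the comprehension below looks up
# `p` in the enclosing (module) scope at call time, so a global `p` must exist.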
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_3debb131f81b403fa31b0e2f212c135e(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_008ca5c4795c42a299ca9d0c3cc36696(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_812ec0f799f44a27b0af0d4b361cecde(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_bb8ba4aa2bbf400eac1ab18803156a2d(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return i
def func_e49853ffa868469caa21fed1b0974cd1(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return b
def func_34488b73c126447fb27cfbd52cbf344a(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_ca0c52126d094843a7fd2197a007d744(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return S
def func_8e5dd125631d473f88ce8ee82d06913b(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return D
def func_a9848526c6cd4ba183ec66fc0ccd9b90(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_0a5aa722f1e3419fb4eef21077a10381(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return C
def func_c40c5f2aa22646159ee1e151264fdc6d(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return b
def func_0685c590e52c4c19bb531e512e870f3a(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return B
def func_dcbcde6cb4e94757af869c3d53a52f39(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return p
def func_2826a1642ae3445188411ae60524e689(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return t
def func_891713351c9841cb9268514a5296b1c2(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return A
def func_78ded00109c14957aa3d3ec9c29786d9(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return S
def func_50d651b599f94fb9ab8fddaae5928de5(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return ans
def func_8aeea14840a6415ab51f28c509c95a64(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return a
def func_1ef869d3b1944dd985faa504aa34695d(N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return C
def func_10ce8592aad748f59dc974c93ae5fac8(test, N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return p
def func_fd421d4abf4547c4a718479f3c9a3eef(test, N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return t
def func_dfce3f9b3d79451786574d15c3d0fea8(test, N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return C
def func_3ebe0d54479249e08e7171e9a9f7f4f3(test, N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return b
def func_4ce6f82c02e94d1598a2ba1f86f1d06c(test, N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return ans
def func_2ef82ccb538a48659576eeff63cca899(test, N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return A
def func_747f475b4bff402f9453b4185b726016(test, N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return a
def func_dbb0b112a7474c09a481f489f763e9db(test, N, D, S):
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return B
def func_563f732d2574488bbcf6fa4ee9fffe08(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return q
def func_f5855ea5ac874e3f8c316f84d883a70b(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return i
def func_6573b2f8e11e47c79aaf33c043ab2c4c(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return s
def func_b473b3a985d14d2bbaa0997f40f48468(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return S
def func_a9331f8e900a4184802d2774aee3e022(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return D
def func_58718dc4ea1b434082928ed0e52519e7(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return A
def func_ce73cda9a599486880c44bf57909aeaa(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return a
def func_4ed679f83a8d4b7195721d9d77a6e156(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return t
def func_fbc939b236514079b54526929c930246(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return r
def func_69e1a19ad187486fb83c3521d88e70cd(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return b
def func_d6e982002a094b138337f71729f1f910(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return p
def func_f974f8700bff4fa991826b753c59332b(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return ans
def func_d94d3d9ab70944efa02cb8c2d4e1d3cf(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return N
def func_7034d50ae53245e68625a23b9291daaa(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return B
def func_db34e22e8a64484484be32ac179afe80(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
return C
def func_9904e94971a14b1a8b9eb7dd54dd80ab(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return ans
def func_c5b90ae2231b4184bd9d8c5a0b908d2d(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return b
def func_c917279678694ace9ee7df628f006e7f(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return t
def func_258524f459364d7fa64acb78b083a0bc(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return i
def func_58e6c56b2505462caf9bbc9c83317748(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return S
def func_89e275f4037845a6ba76246e9a9b0120(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return B
def func_6e961f1e396d495db1d05d9fbdd34b40(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return C
def func_d15a48030f564715853fc835576ae819(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return p
def func_c82fe9e601a7488e9e6bedcdef19c5b8(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return a
def func_861a2a36e09c4211ad6c1f93882fa3e2(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return A
def func_8ac373f0b58c4f80a2a6c5b1152ffbf0(r, s, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return D
def func_b9594001be2543359b732fd54ee0d6dd(test, N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return b
def func_051af4430bd240da8d703fcbf5be3a2e(test, N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return A
def func_25e91e5495b04b9db58fe4ed602581c8(test, N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return B
def func_40f2054886854ef885fcdfee5b47590c(test, N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return t
def func_3ec9de9a396847b78009e7d8b4eae973(test, N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return S
def func_65f1e60aec364668adb36a07121ad701(test, N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return ans
def func_bafb526c7e3144d1b0a97a4273525c9a(test, N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return C
def func_7547066c8f5341299536f025addd18b2(test, N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return a
def func_ad79f8895fde4bb1b56285375bab7557(test, N, D):
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return p
def func_44dba9c45f6f47e4abdd2bc980c0e419(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return p
def func_185103f211d540efbdc061d9042dd27c(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return ans
def func_683ebaaec42a48c5b44e091d1d55584e(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return b
def func_0880b53fe22944b59b111da015b24f7e(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return a
def func_ba6c961b799d4a0397a54780664091f4(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return D
def func_9ac8938ed15e4f508270e8f7f54e884b(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return i
def func_68f038f87ada4adaa7e85139ac3ce17f(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return A
def func_ce3927a5f7b845d5b18b043efcecc191(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return r
def func_993ca548b5b240abbbfa7539be026c78(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return S
def func_4a521010ce4c4f689fdba5438fa7c618(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return q
def func_66b91bdaac75425aab5ff150e474c849(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return t
def func_4aa1beea5cb14c84bdd81325c6f5f043(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return s
def func_bb973f45b4004331b1003ecfa2a8a94f(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return B
def func_f97582c4b1d846fb98a22b6c38cec026(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return N
def func_6005deb0b4484242840c05ede46286c4(infile):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
return C
def func_ab8e2f3128284aed96511110610ed3bd(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return A
def func_95c283e7513241568ff8a5440fd4bb34(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return B
def func_776a586fff62439687456e6f38e5e23b(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return i
def func_a494b16d845a4754be303f6df380eee9(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return ans
def func_669a31b8af774ae2b18ea6057794c002(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return D
def func_b8bb919eba6b48379335e61decdb43c7(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return a
def func_5292093bae3b42b493cd422f35201198(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return b
def func_743f059ae2f1470f859bf2af52c17ad2(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return C
def func_37b7e73a2f834f59af2981518fec2b21(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return p
def func_f9bce26221344320a5d8d61666e59f6b(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return S
def func_c320620e98664719a48562088131bb1b(r, s, test, q, N):
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return t
def func_e139d18a7d874f0aa09903d0b5fdb4e3(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return s
def func_4aefd44d5602426294cfdbfc3906983f(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return ans
def func_78b63dccafef48558d04490ba1e730e4(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return N
def func_b3a716466bd547918061666279563cae(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return C
def func_ab29a11338a74ba58496860acc6ff411(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return A
def func_f3c6300eea5149fbb7f7c8be4855083e(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return r
def func_b2823f83839542fe81030c6ecc6ae48f(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return D
def func_c6b5116a870c479fa3087975b4904b45(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return a
def func_efa61d385cff49b89bd059be54ac0b49(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return S
def func_448180d03a8049adbaa769b1cbe3dc31(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return i
def func_b9f02433c9a94e0983caca1d4f3cbe9e(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return b
def func_2ca8c25a92af4c91a514b3fc985a64b1(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return p
def func_fdce78fc85cf43429d4bcdd7cf092e0f(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return q
def func_2a5920d17524466787b46bad7b1a0590(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return t
def func_ab05bf03bdc7404bb667dfa45788e368(infile, test):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return B
def func_c3d7636150cc4cd4a09f9dae2edc9ce9():
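# Reads codejam/test_files/Y14R5P1/A.in: the first line gives the number of test
# cases, then each case is one line "N p q r s" and its data is D[i] = (i*p + q) % r + s.
# A two-pointer sweep splits D into three contiguous runs A | B | C and keeps the
# smallest achievable value of max(A, B, C); the printed answer (S - best) / S is
# the fraction of the total sum S that falls outside the largest run.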
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return S
def func_4fdd4effe37d4939a307a5d3de85de54():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return q
def func_6abc34df56df4ba6b11b6e657a236de3():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return r
def func_96dcc41cdfc341408370f55b8c509168():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return t
def func_24c1bb648d9e4e7da718203195b0a5dc():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return test
def func_74c1b76a32954d479f9aeafa901e9eff():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return N
def func_1e420ec17af642c1b41d4c7756356b0c():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return D
def func_df169c322a5a4d6c97b09473c6fdb993():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return infile
def func_ab06422dd5bb4a46847a38c7661447c9():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return C
def func_75c637d9ad244bb0b8fb2fbf62ac8714():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return b
def func_3fa6c62e2b1943f2bdddb1411f81a60c():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return p
def func_43b573f14a3d4d6fa07bef5977a55140():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return ans
def func_55a50206605c4e38a183ae6e997f068a():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return B
def func_171551ff488441148f69f4b6514df082():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return i
def func_5950f6f57ed548ec804a775b28822609():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return A
def func_d688e7f2d6f44b1bacd2a20a5f2c8fa0():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return a
def func_0b1c94ef5ed74644860d862b57fafbea():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
return s
def func_410b049f503f4e38881fa2141c176947(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return p
def func_85943584a33d4b959eb4c249a46b8c11(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return ans
def func_6369321d3ac44dfd936ff4703d801909(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return C
def func_d141750250824621a2c648877e90c9c0(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return test
def func_e929bd088b6740cbb9bcf0d7f6217198(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return S
def func_5e35218e133e4519be4da9cc82f8a487(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return B
def func_507aba8b9ce74a31aa78ab302635fc5c(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return A
def func_300b76e068084dca928c421ffb9cd1a9(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return a
def func_244efe9e21774b0db7255ed3b9bed349(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return s
def func_aafe1dd3eefe46588c6a04ec53f17d6d(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return N
def func_2b80f44fc0fe4599b0f29f82b7d58dfe(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return r
def func_8136c2c855054178a0e74d642193371d(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return D
def func_be066bc3e2f24db4a8cff98e27d9205e(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return q
def func_419aa477399c4133959e23c15bc75550(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return t
def func_e6e276ba5e774b4aaf0e4c447b4fb7dc(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return i
def func_f9c03cc188834c0bac4b556e10d984f3(infile):
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return b
def func_d32823bcf0f54f5aa5fc66c6480f7e40():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return t
def func_fd450b8e8bbe4256a2fee67704a8d691():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return ans
def func_4ccf53b13e6b4652a8672d867b0d2e76():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return A
def func_8ffc3bde3c2a4590b3568d35be214df7():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return i
def func_686af94694fb4b72b9374caeef616e6a():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return N
def func_b13d1b9ba262421aadb768b1453ab60f():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return b
def func_c36cace41d1a4e5eb160ac9c13c72b79():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return s
def func_e3c55139f0ad4e7baad096e444a345b6():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return test
def func_694c79aecbf445e38d9d7a8dfc17a51d():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return a
def func_cb2d1f71eb684cbc8c9466fb81c398a7():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return q
def func_e870d90994b7441f8f1d5b4d9dc001ca():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return p
def func_ec928d9b735b4217af7d4abbeb603590():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return D
def func_cab8ca8c7ba94f7dae8111c7ebedb000():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return B
def func_9e55fe3c9d8941029188dd6121214667():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return S
def func_3e268f4913c4470193495b76dfe0ed8b():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return r
def func_c984367e591f4e819e8766fe34e6a63b():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return infile
def func_3218aaa5971841c4aeac3bd2dcc2b66a():
infile = open('codejam/test_files/Y14R5P1/A.in')
for test in range(int(infile.readline())):
N, p, q, r, s = map(int, infile.readline().split())
D = [((i * p + q) % r + s) for i in range(N)]
S = sum(D)
ans = S
A, B, C = 0, 0, S
a = 0
b = -1
while b < N - 1:
b += 1
C -= D[b]
B += D[b]
p = max(A, B, C)
while a < b:
B -= D[a]
A += D[a]
a += 1
t = max(A, B, C)
if t >= p:
a -= 1
B += D[a]
A -= D[a]
break
p = t
ans = min(ans, p)
ans = float(S - ans) / S
print 'Case #%s: %.16f' % (test + 1, ans)
infile.close()
return C
| 21.851015 | 86 | 0.338819 | 30,821 | 210,906 | 2.301093 | 0.018137 | 0.038296 | 0.042088 | 0.037224 | 0.76821 | 0.767646 | 0.766405 | 0.7657 | 0.76508 | 0.764347 | 0 | 0.128593 | 0.513181 | 210,906 | 9,651 | 87 | 21.853279 | 0.562163 | 0 | 0 | 0.941476 | 0 | 0 | 0.014277 | 0.005315 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.000231 | null | null | 0.005783 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
9b78fd484d185c312ecf581b4a2e323e3084355c | 171 | py | Python | music_tree/base/__init__.py | deitry/music-tree | 04983d77c1fc8899a42982ce8cedb8500fb731e5 | [
"MIT"
] | null | null | null | music_tree/base/__init__.py | deitry/music-tree | 04983d77c1fc8899a42982ce8cedb8500fb731e5 | [
"MIT"
] | null | null | null | music_tree/base/__init__.py | deitry/music-tree | 04983d77c1fc8899a42982ce8cedb8500fb731e5 | [
"MIT"
] | null | null | null | from music_tree.base.node import *
from music_tree.base.timecode import Timecode
from music_tree.base.text import Text
from music_tree.base.composition import Composition
| 34.2 | 51 | 0.853801 | 27 | 171 | 5.259259 | 0.333333 | 0.253521 | 0.366197 | 0.478873 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093567 | 171 | 4 | 52 | 42.75 | 0.916129 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
9b8e410374d21a68672966bc8d6eb79033f8ba5e | 1,948 | py | Python | 0703-kth-largest-element-in-stream/naive.py | NotGoodAtCoding/leetcode-samples | f248e8f6b03ac0aa8e32fdc87798a489ddd458ed | [
"BSD-2-Clause"
] | null | null | null | 0703-kth-largest-element-in-stream/naive.py | NotGoodAtCoding/leetcode-samples | f248e8f6b03ac0aa8e32fdc87798a489ddd458ed | [
"BSD-2-Clause"
] | null | null | null | 0703-kth-largest-element-in-stream/naive.py | NotGoodAtCoding/leetcode-samples | f248e8f6b03ac0aa8e32fdc87798a489ddd458ed | [
"BSD-2-Clause"
] | null | null | null | # naive solution
# sort incoming list O(nlogn)
# when adding, use bin search insert O(logn)
# return sorted[k]
class KthLargest:
def __init__(self, k: int, nums: List[int]):
self.nums = sorted(nums)[k:]
self.k = k
def _insert(self, val: int):
# binary insert
low, high = 0, len(self.nums)
while low <= high:
mid = low + (high - low) //2
if mid == len(self.nums):
break
elif val == self.nums[mid]:
low = mid +1
break
elif (val > self.nums[mid]):
low = mid + 1
else:
high = mid -1
self.nums = self.nums[:low] + [val] + self.nums[low:]
def add(self, val: int) -> int:
if len(self.nums) == k:
if val > self.nums[0]:
del self.nums[0]
self._insert(val)
else:
self._insert(val)
return self.nums[0]
# bin insert solution with limited heap to k
class KthLargest:
def __init__(self, k: int, nums: List[int]):
self.nums = sorted(nums)
if k < len(nums):
self.nums = self.nums[len(nums)-k:]
self.k = k
def _insert(self, val: int):
# binary insert
low, high = 0, len(self.nums)
while low <= high:
mid = low + (high - low) //2
if mid == len(self.nums):
break
elif val == self.nums[mid]:
low = mid +1
break
elif (val > self.nums[mid]):
low = mid + 1
else:
high = mid -1
self.nums = self.nums[:low] + [val] + self.nums[low:]
def add(self, val: int) -> int:
if len(self.nums) == self.k:
if val > self.nums[0]:
del self.nums[0]
self._insert(val)
else:
self._insert(val)
return self.nums[0]
| 26.324324 | 61 | 0.462526 | 249 | 1,948 | 3.562249 | 0.176707 | 0.234498 | 0.099211 | 0.072153 | 0.834273 | 0.834273 | 0.834273 | 0.834273 | 0.834273 | 0.834273 | 0 | 0.013937 | 0.410678 | 1,948 | 73 | 62 | 26.684932 | 0.758711 | 0.088809 | 0 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0 | 0 | 0.185185 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
bd4edf426201ca2e2e93d10d9b1e033eb7cb1489 | 234 | py | Python | lybzcy/mc.py | kq18G8/mcmk | 07a263298f14798cd4410dcb273448fefaa93d6f | [
"MIT"
] | null | null | null | lybzcy/mc.py | kq18G8/mcmk | 07a263298f14798cd4410dcb273448fefaa93d6f | [
"MIT"
] | null | null | null | lybzcy/mc.py | kq18G8/mcmk | 07a263298f14798cd4410dcb273448fefaa93d6f | [
"MIT"
] | 2 | 2018-06-14T11:36:13.000Z | 2019-09-09T02:16:04.000Z | 0,0,0,0,0,0,0,0,0,0,1
0,0,0,0,0,0,0,0,0,1,1
0,0,0,0,0,0,0,0,1,0,1
0,0,0,0,0,0,0,2,0,0,1
0,0,0,0,0,0,3,0,0,0,1
0,0,0,0,0,13,0,0,0,0,1
0,0,0,0,86,0,0,0,0,0,1
0,0,0,9,0,0,0,0,0,0,1
0,0,16,0,0,0,0,0,0,0,1
0,63,0,0,0,0,0,0,0,0,1
| 21.272727 | 23 | 0.487179 | 110 | 234 | 1.036364 | 0.081818 | 1.280702 | 1.5 | 1.508772 | 0.903509 | 0.903509 | 0.903509 | 0.903509 | 0.903509 | 0.263158 | 0 | 0.53271 | 0.08547 | 234 | 10 | 24 | 23.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
950848aef90ebe0168d77713e39e8725385d4006 | 2,655 | py | Python | exercism/51.twelve-days/twelve_days.py | fenilgandhi/100_Days_of_Python | 1b28de7cffc2ca72c83a57791940341bd7a2e1ac | [
"MIT"
] | null | null | null | exercism/51.twelve-days/twelve_days.py | fenilgandhi/100_Days_of_Python | 1b28de7cffc2ca72c83a57791940341bd7a2e1ac | [
"MIT"
] | null | null | null | exercism/51.twelve-days/twelve_days.py | fenilgandhi/100_Days_of_Python | 1b28de7cffc2ca72c83a57791940341bd7a2e1ac | [
"MIT"
] | null | null | null | song = '''
On the first day of Christmas my true love gave to me, a Partridge in a Pear Tree.
On the second day of Christmas my true love gave to me, two Turtle Doves, and a Partridge in a Pear Tree.
On the third day of Christmas my true love gave to me, three French Hens, two Turtle Doves, and a Partridge in a Pear Tree.
On the fourth day of Christmas my true love gave to me, four Calling Birds, three French Hens, two Turtle Doves, and a Partridge in a Pear Tree.
On the fifth day of Christmas my true love gave to me, five Gold Rings, four Calling Birds, three French Hens, two Turtle Doves, and a Partridge in a Pear Tree.
On the sixth day of Christmas my true love gave to me, six Geese-a-Laying, five Gold Rings, four Calling Birds, three French Hens, two Turtle Doves, and a Partridge in a Pear Tree.
On the seventh day of Christmas my true love gave to me, seven Swans-a-Swimming, six Geese-a-Laying, five Gold Rings, four Calling Birds, three French Hens, two Turtle Doves, and a Partridge in a Pear Tree.
On the eighth day of Christmas my true love gave to me, eight Maids-a-Milking, seven Swans-a-Swimming, six Geese-a-Laying, five Gold Rings, four Calling Birds, three French Hens, two Turtle Doves, and a Partridge in a Pear Tree.
On the ninth day of Christmas my true love gave to me, nine Ladies Dancing, eight Maids-a-Milking, seven Swans-a-Swimming, six Geese-a-Laying, five Gold Rings, four Calling Birds, three French Hens, two Turtle Doves, and a Partridge in a Pear Tree.
On the tenth day of Christmas my true love gave to me, ten Lords-a-Leaping, nine Ladies Dancing, eight Maids-a-Milking, seven Swans-a-Swimming, six Geese-a-Laying, five Gold Rings, four Calling Birds, three French Hens, two Turtle Doves, and a Partridge in a Pear Tree.
On the eleventh day of Christmas my true love gave to me, eleven Pipers Piping, ten Lords-a-Leaping, nine Ladies Dancing, eight Maids-a-Milking, seven Swans-a-Swimming, six Geese-a-Laying, five Gold Rings, four Calling Birds, three French Hens, two Turtle Doves, and a Partridge in a Pear Tree.
On the twelfth day of Christmas my true love gave to me, twelve Drummers Drumming, eleven Pipers Piping, ten Lords-a-Leaping, nine Ladies Dancing, eight Maids-a-Milking, seven Swans-a-Swimming, six Geese-a-Laying, five Gold Rings, four Calling Birds, three French Hens, two Turtle Doves, and a Partridge in a Pear Tree.'''.split("\n\n")
# Not challenging enough.
def recite(start_verse, end_verse):
if (start_verse > end_verse) or (start_verse < 1) or (end_verse > 12):
raise IndexError("Thanos got all six stones and removed that verse")
return song[start_verse:end_verse+1]
| 80.454545 | 336 | 0.764218 | 490 | 2,655 | 4.12449 | 0.17551 | 0.029688 | 0.083127 | 0.095002 | 0.859476 | 0.859476 | 0.859476 | 0.859476 | 0.846611 | 0.656606 | 0 | 0.001819 | 0.171751 | 2,655 | 32 | 337 | 82.96875 | 0.917235 | 0.008663 | 0 | 0 | 0 | 0.647059 | 0.925475 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0 | 0 | 0.117647 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
1f04a97a03a4ce5a9046199423fb298336d0332c | 30,309 | py | Python | source/deepsecurity/api/administrator_roles_api.py | felipecosta09/cloudone-workload-controltower-lifecycle | 7927c84d164058b034fc872701b5ee117641f4d1 | [
"Apache-2.0"
] | 1 | 2021-10-30T16:40:09.000Z | 2021-10-30T16:40:09.000Z | source/deepsecurity/api/administrator_roles_api.py | felipecosta09/cloudone-workload-controltower-lifecycle | 7927c84d164058b034fc872701b5ee117641f4d1 | [
"Apache-2.0"
] | 1 | 2021-07-28T20:19:03.000Z | 2021-07-28T20:19:03.000Z | source/deepsecurity/api/administrator_roles_api.py | felipecosta09/cloudone-workload-controltower-lifecycle | 7927c84d164058b034fc872701b5ee117641f4d1 | [
"Apache-2.0"
] | 1 | 2021-10-30T16:40:02.000Z | 2021-10-30T16:40:02.000Z | # coding: utf-8
"""
Trend Micro Deep Security API
Copyright 2018 - 2020 Trend Micro Incorporated.<br/>Get protected, stay secured, and keep informed with Trend Micro Deep Security's new RESTful API. Access system data and manage security configurations to automate your security workflows and integrate Deep Security into your CI/CD pipeline. # noqa: E501
OpenAPI spec version: 12.5.841
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from deepsecurity.api_client import ApiClient
class AdministratorRolesApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_administrator_role(self, administratorrole, api_version, **kwargs): # noqa: E501
"""Create an Administrator Role # noqa: E501
Create a new administrator role. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_administrator_role(administratorrole, api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Role administratorrole: The settings of the new administrator role. (required)
:param str api_version: The version of the api being called. (required)
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_administrator_role_with_http_info(administratorrole, api_version, **kwargs) # noqa: E501
else:
(data) = self.create_administrator_role_with_http_info(administratorrole, api_version, **kwargs) # noqa: E501
return data
def create_administrator_role_with_http_info(self, administratorrole, api_version, **kwargs): # noqa: E501
"""Create an Administrator Role # noqa: E501
Create a new administrator role. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_administrator_role_with_http_info(administratorrole, api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Role administratorrole: The settings of the new administrator role. (required)
:param str api_version: The version of the api being called. (required)
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['administratorrole', 'api_version'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_administrator_role" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'administratorrole' is set
if ('administratorrole' not in params or
params['administratorrole'] is None):
raise ValueError("Missing the required parameter `administratorrole` when calling `create_administrator_role`") # noqa: E501
# verify the required parameter 'api_version' is set
if ('api_version' not in params or
params['api_version'] is None):
raise ValueError("Missing the required parameter `api_version` when calling `create_administrator_role`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
if 'api_version' in params:
header_params['api-version'] = params['api_version'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
if 'administratorrole' in params:
body_params = params['administratorrole']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['DefaultAuthentication'] # noqa: E501
return self.api_client.call_api(
'/roles', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Role', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_administrator_role(self, role_id, api_version, **kwargs): # noqa: E501
"""Delete an Administrator Role # noqa: E501
Delete an administrator role by ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_administrator_role(role_id, api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int role_id: The ID number of the administrator role to delete. (required)
:param str api_version: The version of the api being called. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_administrator_role_with_http_info(role_id, api_version, **kwargs) # noqa: E501
else:
(data) = self.delete_administrator_role_with_http_info(role_id, api_version, **kwargs) # noqa: E501
return data
def delete_administrator_role_with_http_info(self, role_id, api_version, **kwargs): # noqa: E501
"""Delete an Administrator Role # noqa: E501
Delete an administrator role by ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_administrator_role_with_http_info(role_id, api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int role_id: The ID number of the administrator role to delete. (required)
:param str api_version: The version of the api being called. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['role_id', 'api_version'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_administrator_role" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'role_id' is set
if ('role_id' not in params or
params['role_id'] is None):
raise ValueError("Missing the required parameter `role_id` when calling `delete_administrator_role`") # noqa: E501
# verify the required parameter 'api_version' is set
if ('api_version' not in params or
params['api_version'] is None):
raise ValueError("Missing the required parameter `api_version` when calling `delete_administrator_role`") # noqa: E501
if 'role_id' in params and not re.search('\\d+', str(params['role_id'])): # noqa: E501
raise ValueError("Invalid value for parameter `role_id` when calling `delete_administrator_role`, must conform to the pattern `/\\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'role_id' in params:
path_params['roleID'] = params['role_id'] # noqa: E501
query_params = []
header_params = {}
if 'api_version' in params:
header_params['api-version'] = params['api_version'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['DefaultAuthentication'] # noqa: E501
return self.api_client.call_api(
'/roles/{roleID}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def describe_administrator_role(self, role_id, api_version, **kwargs): # noqa: E501
"""Describe an Administrator Role # noqa: E501
Describe an administrator role by ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.describe_administrator_role(role_id, api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int role_id: The ID number of the administrator role to describe. (required)
:param str api_version: The version of the api being called. (required)
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.describe_administrator_role_with_http_info(role_id, api_version, **kwargs) # noqa: E501
else:
(data) = self.describe_administrator_role_with_http_info(role_id, api_version, **kwargs) # noqa: E501
return data
def describe_administrator_role_with_http_info(self, role_id, api_version, **kwargs): # noqa: E501
"""Describe an Administrator Role # noqa: E501
Describe an administrator role by ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.describe_administrator_role_with_http_info(role_id, api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int role_id: The ID number of the administrator role to describe. (required)
:param str api_version: The version of the api being called. (required)
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['role_id', 'api_version'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method describe_administrator_role" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'role_id' is set
if ('role_id' not in params or
params['role_id'] is None):
raise ValueError("Missing the required parameter `role_id` when calling `describe_administrator_role`") # noqa: E501
# verify the required parameter 'api_version' is set
if ('api_version' not in params or
params['api_version'] is None):
raise ValueError("Missing the required parameter `api_version` when calling `describe_administrator_role`") # noqa: E501
if 'role_id' in params and not re.search('\\d+', str(params['role_id'])): # noqa: E501
raise ValueError("Invalid value for parameter `role_id` when calling `describe_administrator_role`, must conform to the pattern `/\\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'role_id' in params:
path_params['roleID'] = params['role_id'] # noqa: E501
query_params = []
header_params = {}
if 'api_version' in params:
header_params['api-version'] = params['api_version'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['DefaultAuthentication'] # noqa: E501
return self.api_client.call_api(
'/roles/{roleID}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Role', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def list_administrator_roles(self, api_version, **kwargs): # noqa: E501
"""List Administrator Roles # noqa: E501
Lists all administrator roles. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_administrator_roles(api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str api_version: The version of the api being called. (required)
:return: AdministratorRoles
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.list_administrator_roles_with_http_info(api_version, **kwargs) # noqa: E501
else:
(data) = self.list_administrator_roles_with_http_info(api_version, **kwargs) # noqa: E501
return data
def list_administrator_roles_with_http_info(self, api_version, **kwargs): # noqa: E501
"""List Administrator Roles # noqa: E501
Lists all administrator roles. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_administrator_roles_with_http_info(api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str api_version: The version of the api being called. (required)
:return: AdministratorRoles
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['api_version'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method list_administrator_roles" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'api_version' is set
if ('api_version' not in params or
params['api_version'] is None):
raise ValueError("Missing the required parameter `api_version` when calling `list_administrator_roles`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
if 'api_version' in params:
header_params['api-version'] = params['api_version'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['DefaultAuthentication'] # noqa: E501
return self.api_client.call_api(
'/roles', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AdministratorRoles', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def modify_administrator_role(self, role_id, administrator, api_version, **kwargs): # noqa: E501
"""Modify an Administrator Role # noqa: E501
Modify an administrator role by ID. Any properties that have no value are not modified. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.modify_administrator_role(role_id, administrator, api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int role_id: The ID number of the administrator role to modify. (required)
:param Role administrator: The settings of the administrator role to modify. (required)
:param str api_version: The version of the api being called. (required)
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.modify_administrator_role_with_http_info(role_id, administrator, api_version, **kwargs) # noqa: E501
else:
(data) = self.modify_administrator_role_with_http_info(role_id, administrator, api_version, **kwargs) # noqa: E501
return data
def modify_administrator_role_with_http_info(self, role_id, administrator, api_version, **kwargs): # noqa: E501
"""Modify an Administrator Role # noqa: E501
Modify an administrator role by ID. Any properties that have no value are not modified. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.modify_administrator_role_with_http_info(role_id, administrator, api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int role_id: The ID number of the administrator role to modify. (required)
:param Role administrator: The settings of the administrator role to modify. (required)
:param str api_version: The version of the api being called. (required)
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['role_id', 'administrator', 'api_version'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method modify_administrator_role" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'role_id' is set
if ('role_id' not in params or
params['role_id'] is None):
raise ValueError("Missing the required parameter `role_id` when calling `modify_administrator_role`") # noqa: E501
# verify the required parameter 'administrator' is set
if ('administrator' not in params or
params['administrator'] is None):
raise ValueError("Missing the required parameter `administrator` when calling `modify_administrator_role`") # noqa: E501
# verify the required parameter 'api_version' is set
if ('api_version' not in params or
params['api_version'] is None):
raise ValueError("Missing the required parameter `api_version` when calling `modify_administrator_role`") # noqa: E501
if 'role_id' in params and not re.search('\\d+', str(params['role_id'])): # noqa: E501
raise ValueError("Invalid value for parameter `role_id` when calling `modify_administrator_role`, must conform to the pattern `/\\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'role_id' in params:
path_params['roleID'] = params['role_id'] # noqa: E501
query_params = []
header_params = {}
if 'api_version' in params:
header_params['api-version'] = params['api_version'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
if 'administrator' in params:
body_params = params['administrator']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['DefaultAuthentication'] # noqa: E501
return self.api_client.call_api(
'/roles/{roleID}', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Role', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def search_administrator_roles(self, api_version, **kwargs): # noqa: E501
"""Search Administrator Roles # noqa: E501
Search for administrator roles using optional filters. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.search_administrator_roles(api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str api_version: The version of the api being called. (required)
:param SearchFilter search_filter: A collection of options used to filter the search results.
:return: AdministratorRoles
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.search_administrator_roles_with_http_info(api_version, **kwargs) # noqa: E501
else:
(data) = self.search_administrator_roles_with_http_info(api_version, **kwargs) # noqa: E501
return data
def search_administrator_roles_with_http_info(self, api_version, **kwargs): # noqa: E501
"""Search Administrator Roles # noqa: E501
Search for administrator roles using optional filters. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.search_administrator_roles_with_http_info(api_version, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str api_version: The version of the api being called. (required)
:param SearchFilter search_filter: A collection of options used to filter the search results.
:return: AdministratorRoles
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['api_version', 'search_filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method search_administrator_roles" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'api_version' is set
if ('api_version' not in params or
params['api_version'] is None):
raise ValueError("Missing the required parameter `api_version` when calling `search_administrator_roles`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
if 'api_version' in params:
header_params['api-version'] = params['api_version'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
if 'search_filter' in params:
body_params = params['search_filter']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['DefaultAuthentication'] # noqa: E501
return self.api_client.call_api(
'/roles/search', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AdministratorRoles', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
| 44.637703 | 311 | 0.61777 | 3,451 | 30,309 | 5.187482 | 0.060272 | 0.049603 | 0.02145 | 0.026813 | 0.943582 | 0.935203 | 0.931572 | 0.923975 | 0.910736 | 0.910066 | 0 | 0.016574 | 0.297271 | 30,309 | 678 | 312 | 44.70354 | 0.823935 | 0.334125 | 0 | 0.781421 | 0 | 0 | 0.225258 | 0.058564 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035519 | false | 0 | 0.010929 | 0 | 0.098361 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1f37e3436c1f4ea1bf57ba13a26931dfe6b86f16 | 22,378 | py | Python | src/GUI/byte_stream.py | panos23kar/photonomist | 3324878a03244b8d9a8fa059caa8e6cb443faf9b | [
"MIT"
] | null | null | null | src/GUI/byte_stream.py | panos23kar/photonomist | 3324878a03244b8d9a8fa059caa8e6cb443faf9b | [
"MIT"
] | 15 | 2022-01-28T15:12:29.000Z | 2022-02-13T17:04:38.000Z | src/GUI/byte_stream.py | panos23kar/photonomist | 3324878a03244b8d9a8fa059caa8e6cb443faf9b | [
"MIT"
] | null | null | null | BYTESTREAM="iVBORw0KGgoAAAANSUhEUgAAASkAAAEpCAYAAADPmdSCAAAgAElEQVR4Xu19CVxO2f//uamUiIoZPdVgZtoYI3uTIWqKzFhKRKFkjSlJtBoZRNZoLGmkTCpJWX4KqcRI1mkYSn2NrZ6YmVKUFnT+r2s0f0vqec7d73N6vby+830957O9P5/zvueeexYC4D+MwBsI5OfnG+3duxekpaWBbdu23Tpz5gzIzc0F169fBw8fPqQNq65du4LevXuDIUOGgKFDh4IFCxYYjx49GkyfPh2YmZkV0WYIKxI8AoTgI8AByI3A9evXFwcFBYG+fftuIEno7Nmz4MWLF3LrYVpASUkJDBo0CAwbNgwUFhb6rlmzBvTq1Wsj03axfn4hgEmKX/mg3Zv09PTsZcuWSR49emT04MED2vVzpdDAwAB06NChKDg4WOrs7DyCKz+wXeYRwCTFPMasWCgsLOygqakZ4ePj45qYmMiKTT4amTJlCli/fn1sdXW1p4mJyVM++oh9kg8BTFLy4cWb1lKptJ27u/s4CGH8iRMneOMX3xwZNWoUqK6udk5KSjoskUie8c0/7E/rCGCSah0j3rS4ffv2viVLljgfOXKEl3NIvAHqA46oqKiA7777DqxcuTL+iy++cOG7v9i/fxHAJMXzSrh48eITKyurDtXV1Tz3VHjutWvXDqSnpz+1tLTUFJ73iuMxJime5bq4uLjt9evX6yZPngwaGhp45p143VFVVQXkXJ6dnd1wdXX1HPFGKrzIMEnxJGfOzs7rs7OzfcvKynjikeK68dFHHwFbW9sNcXFxSxQXBf5EjkmKw1zcv39/9dChQ/3u3bvXhkM3sOkWEOjWrRvIzMwM/fzzz4MwUNwggEmKZdxv3bqll5aWVrJo0SKWLWNzVBFYsWIFGDdunL6ZmVkpVV1YXnYEMEnJjhWllseOHfNZvHjxxsLCQkp6sDD3CJiamoLAwMDF06ZN28S9N+L3AJMUwzlesWJF1tq1a0fU1tYybAmrZxsBNTU1kqyyf/jhByu2bSuSPUxSDGSb/EIXHx9ft3z5cga0Y5V8RMDHxwd4eHioGRoa1vPRPyH7hEmKxuyFhIQo3b9//2V0dDSNWrlX1blzZ0CeWqCtrQ20tLTApUuXEo2MjAA5qUzuoSP/kX/k701/FRUVr/6T3C9I/rt37x65SRgMHjx48uPHjwH5O3mqwj///MN9gDR6MHPmTBKPNiEhIY00qlVoVZikaEg/hNBm7NixJ48ePUqDNvZV9OvXD5DHpKSlpU2dPXs2cHNzg+rq6vFse1JbW+scExNDREVFkeuV4tLT08HVq1fZdoMWe2PHjgWHDx+2JQgigxaFCqwEkxTF5H/11VdPz58/356iGlbENTU1QVhYGDhy5EhH8lXU3Nz8CSuGaTCSl5enSX5dmz59etX3338PysvLadDKvIp+/fpVX716tQPzlsRrAZMUQm7Ly8s116xZU7VhwwYEaeZFCIIAhoaG5DlMZ3bt2vW3kpKSI/NWubHQ2NiYPGfOnC65ubnDCgoKAISQG0daserr6wsCAgI66ujoCObBwBcgMUnJmYlt27Y9W7BggbqcYow3t7CwIOeElu7atWubIu/2J0+HmDVr1oL6+vp1mZmZjOMur4GtW7fWenl5tZNXTpHbY5KSMfvFxcUFvXr1MuHLfjpytLRr1y5yQtt0zJgxePHVB/KYnJxM5qzAxcWFN6Ostm3bgmvXrhUaGxubylh+Ct0Mk1Qr6T969KjexIkTS+rq6jgvFD09PRAfH19saWlpxLkzAnUgLy+vaNKkSYb379/nPAKSrI4ePWpka2tbzLkzPHYAk1QLyfHz84PkRDOXf+RlBbNmzdq8cOFCHy79EKPtmJiYTdu2bVt06dIlTsMjt0ht3rwZ98UPZAED0wwwV65c2T1o0CD3ly9fclK85ErmBQsWBGzcuHEtJw4ooNGgoCD/zZs3r3n2jJvDO8lLJ3799ddoCwuLmQoIf4shY5J6Ax4IoVKfPn1eXrt2jfU6IeeYJk2aVLt//348qco6+m8bnDt37rOoqCj1xkb212OSi2Rv3brVhiAI9o1zjPuHzGOSeo1Mdnb20xEjRrC+3umTTz4BkZGRiXZ2dlN4WiMK61Z6enqCh4fH5Lt377KOgb29/ZTU1FTFvVHjDcQxSQEABg4cCNmel5gwYQJYvXr1NBMTkzjWewA2KBcC9+7dmxoUFPRLXBy7qSJ3Aly9elXh+6hCA1BVVVXZqVOnjmwuAAwMDAS+vr6dtLW1q+TqKbgx5whUVFR0jIyMrAwICGDNF3IaoKKiokpLS6sTa0Z5ZkhhScre3h6mpqaykg6y0BYsWHDmp59+smTFIDbCOALBwcE5oaGhw9iatxozZgy5XEEh+6vCBV1ZWRmmr6+/lI3bV0hyCg4OPrFy5cpRjPcabIATBH788cfjy5cvH8nGaFxDQwOUlpau69Spkx8nwXJkVKFI6pdffrk5bdo0Vlb5Hjx4EEyYMEGh8OWohnlhNiUlBTo4OLDiy08//VTw/fff92TFGA+MKEQnghCqjRgxovb06dOMQz5jxoyaPXv2sP6VkPHAsAGZEJg5c2b17t27NWRqTKHR0KFDwZkzZ9QJguB+KwSFOGQRFT1JNTQ0eLRr1277ixcvZMEDuc23334Ljh07Jno8kQFSMMHRo0fDtLQ0RqMmb2SuqamZr6qquoNRQxwrF3WnSkhI8HN2dl7L5HwBuf8qLi7um4kTJ/Jvyz3HxaXo5tPT060dHBxOMXm+PTnvGRMT4+/q6srt/i0Gky1akgoPD4fe3t6MQUcWR35+/rM+ffowPrRnLAismBUE8vPza/r27duOyYfl1q1bgZeXlyj7syiDsra2hkyeJWRubl6el5fXmZUKx0ZEg4ClpeU/OTk5OkwFZGlpCXJyckTXp0UX0McffwwfPXrESB2Qm0BPnDgx38bGRtRzAIyAh5W+QuD06dMeVlZW25laX/Xxxx+DR48eiapfiyaYx48fd+/cufMdpk4umDVrFoiKimpLEEQD7m8YASoIQAhVZ82aVb97924qaj4o26ZNG1BSUtJDV1eX/U2HDEQkCpKKiYnp7+bmdpkBfAA5MV5VVZWtpqaGL4BkAmAF1vn8+fPM9u3bW9XXM3NV3/bt2wfMnz//itAhFjxJZWRk5NnY2AxmIhHkkbP79u0TPEZMYIN10ofAlClTYEJCAn0K39CUmZl5wdra2pwR5SwpFXQHzMzMvGNtbd2dCayKiopKjYyM9JnQjXViBN5F4H//+1/J559/rscEMidOnLg7cuTIHkzoZkOnYElq//79lU5OTh3pBqlLly7gr7/+MiYIoohu3VgfRqAlBCCERl26dLnFxK3OBw4cqJo4caIgT1IQJEktXLjw4pYtWwbSXfLTp08v27t3r4RuvVgfRkAeBFxdXaWxsbG68sjI0nbBggWXtm3bNkiWtnxqIziSmjt37rXIyMjedIMYEBAwYc2aNSl068X6MAIoCISGhjoEBg
YeRJFtSWbu3LnXIyMjv6RbL5P6BEVS/v7+cO1aeu8mUFZWBi9evBAUDkwWBNbNHwQghC7KyspxdC+r8ff3B2vXrhVMzQvGUSZGUGZmZuTWFsFgwJ/ugz1hE4E+ffrA33//nVaTs2bNuv7zzz8LYkQliA7q7e19MTw8nNY5KD8/PxAWFiaI+GmtTqxMkAgEBQXB1atX0+q7p6fnpYiICN7PUfG+kx48eLBywoQJtH7F++mnn3Z9//33c2nNOFaGEWAYgS1btkQuXLhwDp1m9u/fX+Xk5MTrr368JqmTJ0/esbW1pXUdVElJyQF9ff1JdCYa68IIsIWAVCpNkkgkE+m0l56eftfOzo6366h4S1JZWVl5VlZWtK0kJ49WKSkp8dfT0xPtuTt0Fi7WxV8EysrK/CQSCa3npJ08efKCra0tL1em85KkoqKi+s+ePZu2vXjkF7yMjIwuI0aM+Ie/pYc9wwjIjkB2dnZnGxubv+k8cTYqKmrA7NmzebfXj3ckVVZW1l1XV/eO7OlquaWamhqoq6vjXZx0xYf1KDYCampqsK6OvmPOy8rKeHd6Au86b5s2bSBd60IkEgl5BZAGQRDPFLuUcfRiRQBC2E4ikdSUlZXREiJ5zMvLly95xQu8cqZr167w4cOHtICtq6sLpFKphCAIerJHi1dYCUaAfgQghLr6+vrS0tJSWpTz7eA83pDUiBEjYHZ2Ni0gk2dA1dfX8yY2WoLCSjACrSBA56vfsGHDyCuzeNGHeOHEli1b4MKFC2kpQvIVTyqV8iIuWgLCSjACciAgkUigVCqVQ+LDTcPDw4G3tzfnfYlzB/bt2+fn4uJCy4a813NQ+BWPlhLFSoSIAPnqp6enJ6WLqBISEvynTJnC6bIdTkmqoaFhXtu2bXfQcdUP+RWvtrYWT5ILsWdhn2lFgJxMV1dXr6Hjqx+5vrC+vp7TC0g5Iyny6nMVFZVaOtZ54JMMaK1xrEwkCCgrK0O6+tfz5885u9KdM5IaOnQoPHv2LOVyIJk+KysLL9SkjCRWIDYELl++3HngwIF/0/GmMnz4cPI6Lk74ghOjUVFRN2fPnm1KR1GUlZX56+rqcvrOTEccWAdGgAkESktL/fT09GiZ842Oji5wd3fvyYSfLelknaQqKyvDOnXqtJSOQMvKyg7o6urizcJ0gIl1iBYBOjclV1ZWruvUqZMfm2CxTlIaGhqwpqaGcoybN2/etWjRInzcCmUksQJFQCAiIiLS09OT8jEv7dq1A8+ePWOVN1g1Nn78eHjo0CHKNSG0408pB4wVYARoQCAwMBCGhoZS1jR+/Hhw6NAh1riDNUOPHz+u1NLSonx4HT7yl3KNYQUKjABdRxFXVlZWderUiZXD8lgjKYIgINWvDHzc/KjA9Y5DFygCdCxNIL+qQwhZ4Q9WjAwYMABevkzL8VCs+CvQ2sNuYwRkQoC8hYYgiDiZGrfQqF+/fuDq1auM90nGDTg4ODilpKQkUgVk5cqVE5YtW4bvxaMKJJbHCAAAQkJCHEJCQijf65eVlVVtZWXVgUlQGSUpCKESQRAvqQbg5uZWFhMTg28WpgoklscIvIGAm5ubNCYmhvJNyRDCNgRBNDIFLqMkZWpqCgsKCij53qVLF/D3338z6iclB7EwRkDACHTu3Bn+8w+1U7V79uwJbt68yVgfZUzxhQsXdg8ePNidav4ghMYEQRRR1YPlMQIYgfcRgBAaEQRxiyo2ubm50RYWFjOp6mlOnjGSouMY4KKiolIjIyN9JgLHOjECGIF/ESgqKioxMjLSo4IHk1/eGSEpX19fuGHDBioxAxcXF7Bv3z5G/KPkGBbGCIgQgSlTpsCEhARKkS1ZsgSsX7+e9j5Lu8KcnBxDS0tLSq9n+PhfSrWChTECSAi0bdsW1tfXI8k2CSUlJelPmjSJnsPWXyulnaToOGe5oaEhS1VV1ZoSWlgYI4ARkAuB58+fZ6qoqFjJJfROYyYGGLSS1K1btwqMjY1NqAQ5c+ZMsHv3blr9ouIPlsUIKBICc+bMgbt27aIUclFRUaGRkREtRzGRjtBKBqqqqrChoQE5QCUlJfLOr7YEQaArQbaOBTECGAEIoWqbNm3qGxvRlz3RPZqijaS2bt36zMvLS51KmjMzM+dbW1vvoKIDy7aOwPPnz8ODg4On7tmzR6e2tpZ8MLQuxGEL8suRhoZGo5eX1+OlS5dOU1ZWTufQHdGbzsjI8LCxsdlOJdDw8PBab2/vdlR0NMnSQlLl5eWaOjo6VVQcGjp0aPnZs2c7U9GBZT+MwHffffdVTk5O7tOnT0UBU48ePcC4ceMswsPDz4siIJ4FYWFh8U9ubq4OFbfKy8s76ujoPKGig7bXPapLDtjcUU0VMKHJr1+/PiIgIOB7Og7k52PsKioqIDExMXTChAlBfPRPyD5RPblk8eLFYOPGjZQHQpQVvE4CpJKMa9euPfvyyy81qOjAsu8jQOdFkXzH18jIiFyUSFc98z1cVvz77bffavr27Uv1lY1yTigrMDc3f5qXl9ceFTXyvry6ujrKfqDaF6Pcrl27xs6ZM+ewGGNrLaawsDADPz+/ktba4d9lQ4Dq2qkBAwZUX758mdIpCZTIAUJoQxDESdnCbb7V0aNHvxkzZkwmFR1Y9v8jkJ6e/sTOzo5SUQgdT3LuzdLScojQ4+CD/6mpqdb29vanqPgCIbQlCCIDVQclkhozZgw8evQoqm3w7bffgmPHjlHyAdm4CAWjoqKiZs+ePUuEockdUkREhL2npyf1A/Xltiw+gdGjR8O0tDTkwMaMGQOOHj2K3M+RBUNCQpRCQkKofrtGto+MmEgFCwsLHU1MTA6INDyksB48eNDOwMCgFkkYC72LAKV5ZypnTiGThLu7O4yOjkZOpbu7e010dDTyXBayYREKkmdNM3nomMAhQ65xgcdNq/tubm7VMTExyB+3Zs2aBX7++WekXCAJFRcXtzU0NKyjiAKSbYo2RSlua2sLT56kNDUoSlzIoJYuXVqybt06A9EGyG5glEZTxcXFaoaGhnLvYEYiiqCgILh69WpkeA4ePAgmTJiAZBvZqLgFKRWPuKF5FR2uNRqSnJKSAh0cHJA1LVu2DKxcuVLuXMgt8NpD5E6BF24i57hZwSFDhsBz587Rq1Rk2nx9fcGGDRtQa11kaFALh+oCT5QHhtyJW7FiRdby5ctHoIYaEhJyIiQkZBSqPJZ7DwHkB4aCYSl3rSsYPjKFGxQUdHz16tUjZWrcTKNVq1ZlBwcHy3UcjNyJo3JeFB5Foaa2eTkLC4v5ubm52+jVKk5tw4YN++TMmTMPxBkdu1FRGU2pq6uD2tpauXhHrsZJSUk+kyZN2ogKyeLFi89s3LjRElUey72NgImJCSwsLMSwyIDA8OHDwenTp+WqdxnUKmQTLy+vnK1btw5DDT45OXmxo6PjJlnl5UoaDZ1CLnuyBqHA7fCrnnzJx/UnH14ttUauPRMTE1BYWChzLmRuWFhYKDExMUE+u/jHH38EP/zwg8z26MNSnJpKS0sN9PT07oszOsaiwvVHE7TLly+HK
1asQNaWn5+vb2ZmJhOfyJy0VatWweDgYGSnKioqOmlra1M6cwrZuAgFV61aZRAcHIxJSr7cylzv8qlVvNYVFRUdtbW1K1EjJ5cwBQUFyZQPmRq9dgR5eOfk5AT2798vjy3U2BVGztbW1uDkyZOYpOTLOK5B+fBqsbWjoyNMTk6molGmfMjU6O7du6u7d+8eiOrNn3/+Oe3TTz+NQ5UXqlxlZeWnGzduDAoPDwfDhw93r6urA3QdPkeeJY/XR8lXGSNGIK+cec8Qedge+aUqLy8vmrxvztfXl5Hbe+WLkN3WxcXFUw0NDX9Btfrnn3+Gfvrpp60eVigTSfXo0QPeuXMHyZdu3bqBe/fuyWQHyQDPhK5fv/6tnZ3d/5WU4CONeJYaxt0hl9gYGhqCW7duqRIE8Zxxgzww0K1bN3jv3j0kT8gjoO/cudMqN7TagOqr3vHjx/ePGjVqMlIUAhLKyMjIs7GxGSwgl7GrDCJAXh5x6NChjWPGjPFl0AznqtPS0hJGjx5NpX+3ykGtNpg8efL6xMREJKAVYfHmnj17Ovn7+z9+9OgR5wWDHeAfAsbGxuTIqtV+xj/PZfeIyuLOqVOnboiLi1vSkrVWwevatSt8+PCh7B6/0dLd3b02Ojqa6hnJSLbZEIqJiUl2c3ObwIYtbEO4CJAP6/DwcK+FCxdGCDeKD3vu6ur6LDY2Fuk6O11dXVBWVtYiD7X4Y21traW6uvppCsC2SoIUdHMqGhYWBv38/Dj1ARsXFgLHjh0jT6MVa59A/vpfW1s7XF1dPedD2WwRsNTUVGhvb49UCSh7dJAMcSC0bt06uHTpUg4sY5NCR2D//v01Tk5Oojvskcqe3pSUFODg4PBBLmqRpFRUVODz52gfKfz8/ALCwsLWCr2o3vU/Pj5+v7Oz8ySxxYXjYQ+B2NjYxa6urjLvXWPPM3RLS5Ys8V+/fv0aFA2qqqqgoaEBjaQAAMhDOJRzY1ACZFMGQqhEEATVc93ZdBnb4ikCr2uJSv/iY2RU4pGfpHJzc59YWFggXY3Uv39/cOXKFdG9e3/00Ufwr7/+4mNxYJ8EhoAY+0jfvn3hb7/9hpSJvLy8p+bm5prNCX+QSDQ0NGBNTQ2SwR07dmz28PDwQRLmqVBiYuKpyZMnW/PUPeyWABEoLy+P0dHRmSFA15t1ecuWLZsWLly4CCUeDQ0NUFNT0ywftTTaYWTohhIAH2SorAXhg//YB/4hoKSkBBobG8X2xkE7bzQLUEFBwT5TU1NnlLQaGBiABw8eiAr4P/74Y8gXX3zxKwoeWAYj0BICd+7cMe3Ro4doTi7U19eHqFvCioqK4o2MjFzexatZMnFwcIDkZ0GUv/Pnzxd/9dVXRiiyfJXp0qUL/Pvvv/nqHvZLwAgMGjToz4sXL34m4BDecv3cuXNFQ4YMMUSJx9HRESQnJ7/HSc2SlLKyMqSwW19Uo6jXYFMZwqLkC8soCALKysrkyRhi6zNI/YU8WeL58+etk5RUKm0nkUiQZszFuFcPQtiFIAj8SU9BSIOLMCGEHQiCqObCNhM2qczfSqVSDYlE8uxNv95jLRsbmykZGRnxKM6T1667u7uL6qmwffv2E/Pnz7dFwQPLYARkQWDnzp1e8+bNE82+vqioKDh79mxZQn+vzahRo5yPHz+e0CJJjRw5Ep44cQLJwNGjR03HjBkjmklAEgQTE5MThYWFmKSQKgILyYJAz549vW7evCkakkpOTjZxdHQskCX2d9vY2dmB9PT0twY6zY16kN4nXxsT1SiKjGnYsGHwzJkzKHhjGYyATAisWrUKBAcHi63v0MYjbwFz48aN9r169XoqE7LvNPr666/Br7/+Kjaggbm5OczLy0OBBMtgBGRCgNysvm7dOlH1HQsLC5ibmytT/O82Kiws1DQxMfmPh94CRiqVxkgkElcUzd99993S//u//1uPIstnGUxSfM6OOHwjz0hfv369qEhq1KhRS44fP74OJUNSqTRWIpG4Ncm+BYyTkxPcv38/il7Q3Kw8kiKeCWGS4llCROiOGEmKyioBZ2dnEB8f/x83vcveSO+RYlx60NQXMEmJkBV4FpIYSYqEmMpShDdPUaGFpExNTUFBQYGohquYpHjWk0XsjlhJytTUFBYUIH3ke8Vxzb7uoZ4fNWfOnDO7du2yFGMd4ZGUGLPKr5jESlLu7u6no6OjUXnhfZJKTk7OdnR0HI6SvsbGxoNKSkqOKLJ8l8EkxfcMCd8/sZIUhPAAQRBIvJCSknLawcHh1W2u/7GVmZnZrfz8fNSNwaJ81SMBwiQlfBLgewRiJanXuCPNc/ft27fot99+M36LpAwMDOCDBw9Q84lJChU5gcuR98pNnDgR5OTkTOvSpcvjTz75BLRt2xY8fvwYXLx4ESxatOj/0tLSwIEDB8izkwQeLTPuY5J6H9c3j3x6k1yQGK9jx46gqqoKkxQz9cs7rSQJ5efn79TW1vZAdW7nzp0vPTw8lCBEKjlUs7yVEzNJaWpqwidPnqBi/4pXKJPU3r17wfTp0zFJoaZBAHIkMUVFRR0eOXLkeLrdnTdvHty5cyfdagWlT8wktWfPHjhjBvIJyf+fpK5fv764d+/eG1Aya2dn1zE9PR2ZKlFssimjyHNSn3/+OThy5Mjsnj17/sw05mvWrMkPCAjow7QdPuoXM0nZ2dlppqenV6HgfuPGDd9evXptfMVUY8eOXXzkyBEkksrLy+tobm6OSQolCzyWKS0tPamnpzeSbRfd3NxgTEwM22Y5tSdmksrLy9M0NzdHIqnx48f7Hjp06F+S8vf3h2vXIt/jKdpXPRIbRRtJWVpaFufk5KB+5aWls//111/++vr6axoaGmjRx3clYiap19gjTT4GBgaC0NBQ4hXBfPXVV/D8+fOoucQkhYocz+R27dq1fM6cOT/yxS1LS0uYk5PDF3cY8wOTVPPQDhkyBJw7d+5fklJSUoIon4fFeMHhu3Apykjq9u3bX3722WfXGeuJiIpXrVoFg4ODEaWFISZ2kkK9NLTpyq+mURDScIwsnlWrVuGRlDD6QrNekmua6uvreZ3D8PDwEm9vbz0Bw9yi62InqcDAQBgaGoqavn9HUqh79vr16zf16tWr+1CtC0FOzCMpNTU1UFdXx2uCaqqRmJgY6Ob23xFDQigdmX0UO0mZmZm55Ofnx8kMyNsNCYLcCkNuiUFRsGPHjqkeHh6YpFDA44HMs2fPvm7Xrt05HrgikwsUn8gy2eCikdhJatu2bS4LFixAIqn8/HxjwsfHx2jTpk1IJFVdXe3Svn17pJtluCgGFJtiHUmFhITohoSEPETBhEsZY2NjeOsWUrly6bZCv+5VV1c7t2/fHmkwExAQYEyYmJgYFRYWomZdEK8KVKpTjCS1dOnS39etW2dGBReOZZHmUDn2+YPmxT6SojKlZGJiYkxkZmZCa2tr1PxhkkJFjiM5iURCHvUs6LzdvHlzVs+ePaM4gpB2s5ikPgxpVlYWIJYvXw5XrFiBCrygi12WoMU2kjp79qz20KFDH8sSO5/bjBgxAmZnZ/PZRZl9
wyT1Yah+/PFHQNjY2MCMjAyZAW1q2LlzZ/DPP/9gkpIbOe4EvvnmG3Dq1ClR5Ky2tvZrdXX1s9yhSZ9lRSApHR0dWF5eLjdotra2gOjatSt8+FD++dMvvvgC/PHHH6Io+JaQE9lISlT5srKyupiVlTVQ7srnmYAikFSvXr3gjRs35EaenJ4gixZpEnLYsGHgzJkzoir65hAUC0mRa4xiYmJElS8IoQFBEPflrnyeCSgCSQ0dOhSePYs28EUmqfHjx4NDhw6JqujFTFIlJSXf6OvrZ/Ksf1J2p3v37i/u3r3bhrIiDhUoAkmNGzcOHj58GAllZJLS09NLLC0tnYJkVUBCYhhJiUmonZUAACAASURBVOGL3odK5vnz52tVVFT8BFRS77mqCCQlkUgSpFLpZJQ8IZPU8OHDE0+fPo1JCgV1lmWCgoLSVq9e/S3LZtk0hzRlwaaDLdlSBJIaOnRowtmzZ9klKVdX18TY2FhMUnyp9Jb9EPVr+ejRoyF52YNQ/xSBpKZOnZoQFxfHLkktW7YsceXKlZikeN4zhHDKAVUIhw0b1vfMmTNXqerhSl4RSCo4ODhh1apV7JLU9u3bE+fPn49JiqvKltGunp7e49LSUm0ZmwuyWUlJiY6+vv4/gnQeAKAIJBUREZHg6enJLknt2bMnccaMGZikeN4z4uLiSqZOnWrAczfpcE+w81KKQFK7d+9OmDlzJrsktX///kQnJydMUnR0L2Z1iHo+qgk61BXNzEIvm3ZFIKnExMSEyZMns0tShw8fThw3bhwmKdnqkMtWCkFSdnZ2MD09nUuckW0rAkkdOnQoYfz48eySFB5JIdck24IKQVKenp4wIiKCbWxpsacIJBUfH5/g7OzMLknhOSla6pMNJQpBUkpKSssaGxt5c9ONPIlVBJLiZE4Kf92Tpww5basQJPXRRx8t++uvvzBJcVpqHzbOydc9vE6Kp9XwvlsKQVLKysrLXrx4gUmKp2UZGBiYEBoayu7rHl5xztNqUFCS8vb2huHh4YJJypuOKsLrHicrzvHePcH0B4UYSY0dOxYeOXJEMElRNJLiZO+eRCJJlEqleAkC/7uFQpCURCKBUqmU/9loxkNFGEl9/PHHCY8ePWL3dW/cuHHg8OHDou8AQj+qJS0trWT06NF4xTmP6UsRSIrKSBf5qBZ8MiePq/4N1z777LO/bt++/bEwvEXzUiqVtpNIJDVo0txLKQJJff311/DXX39FApvQ1dWFZWVlcgvjM87lhowTgXbt2oFnz56JesQ7efLkvomJifgUBE4qTDajpqamsKCgQLbGb7TS09NDvy1GR0cHlJeXi7r4SayE/rr3Ot+izpO7uzuMjo6WuwPwRUARRlLa2tqwoqJCbshHjhwJiJCQEBgSEiK3sCIUv1hIKiQkZEdISMh81CQLQE6wJyCQ2CoCSaFe+PLq3r2srCxoZWWFWoeifkKLhaRMTExAYWGhmHOFSQq1B7Mnh5Sj06dPA8LExMSosLDwFqKvYi78V5CI5HUPVFZWDuzUqdNlxDzzVmzQoEG1Fy9eVOOtgzI4hkdSHwbJxMTEmPDx8THatGkTEklVV1e7tG/fPl6GPAi2iVhIavny5WDFihVifKggPaH5VJBiJ6mKigpnbW3tfSiYBwQEGBP5+flGZmZmSCS1Y8eOqR4eHkjGURzmQkYsJCXGOUQHB4djKSkpo7moCzptip2kNm/e7LJo0aI4FMwKCgqMm56sSE+jfv36Tb169SomKRT0OZBZunRp47p16wR9keY7sCHVLQfQt2hS7CTVs2dPl5s3byKRFACAoERSwcHBYNWqVWJ8hfivqEQ2kiLjEkW+5s6dCyMjI/nGN0j+iJ2k/Pz8YFhYGBI2/5GUkpISbGxslFtJv379wNWrV0VR9B8KXmwk1bNnT3Dz5k1B5+zx48d9tbS0BLt4891aEztJffnll/DatWty84uSkhJobGz8dyRlYWEBc3Nz5VYixnmOd0EQG0mR8W3evPnwokWLxqMmnGs5giAghKJ403sFpdhJCnWN1Ndffw1+/fXXf0nK398frl27FrX2BP1Ubi1oMZIUGXNsbGxnV1fX8tbi59vvlpaWj3NycjrxzS8q/mCSah69wMBAEBoa+i9JjR07dvGRI0c2oACdl5fX0dzc/AmKrBBkxEpSJPYQws4EQQiGqBITEyHirUi8LjUxk1ReXp6mubl5FUoCxo8f73vo0KGNr0jq+vXri3v37o1EUnZ2dh3T09MxSaFkgWMZIe2//Omnn3y+//77jRxDxoh5MZPU4MGDNS9cuIBEUjdu3PDt1avXvyT1+g/pJT8mJga4ubmJ9pVPzCMpMu8GBgbgwYMHvM5famrqBnt7+8WMMAQPlIqZpHbu3AnnzZuHivKruqRMUpqamuDJkye8LnJUhEg5sZPUqyIgCPLVj5c5PH369Mvhw4crUckh32XFTFLq6uqwtrYWNQVvk5SBgQF88OABJWWownyWUwSSIvEnP/eGhYX1XrJkyR98yUdAQABcs2YNX9xhzA8xkxTql71PPvkE3L9//22S6tu3763ffvvNCDETvHwKI8bylpiikFRT0HPnzk2LjIz8lg7sUHVACKd07do1/tGjR6gqBCWHSer9dPXv37/oypUrxm+97qWkpGQ7ODgMR8luY2PjQSUlJUcUWb7LKBpJkfno0KEDePr0KScPnqioqL9nz57dme91Qad/YiWpFy9eHFBWVkbihdTU1NP29vYj3p2TIv8/0uT5rFmzzvz888+WdCaOL7oUkaSasCcv24iJiemnpaX1G9P5uHLlyun+/fuLsoZaw06sJDVlypTTCQkJqDn97yH57tMSiaRMTU1BQUEBJ0/e1gqA6u+KTFJN2NnY2IAdO3b88Pnnn6+kiue78jNmzAjbt2/f0oaGBrpVC0afWEnK2NgY3rqFdMDKWwMoWkiKz1+HqFYqJqn/jyB5qcOwYcMuHj9+fDAVXC9evHhhxowZg27cuEFFjWhkxUpSFLcvNT+SmjJlCkxISEBKvlQq1ZBIJM+QhHkshEmq5eRMmzYNeHp6goCAgK7kMdQ9evR4JVBTUwMuXLgAyFuFQ0NDH5JnVd+/f5/HmebONTGSVF1d3RI1NbV1KKhOnToVxMXFNU9SUqk0RiKRuKIoHj169NK0tLT1KLJ8lsEkxefsiMM3MZJUcHDwgVWrViFNmpeVlcXq6uq6NWX3rde9GzdutO/Vq9dTlNQ37VhGkeWzDCYpPmdHHL6JkaQGDBgAL19GO1K/sLBQ08TE5D8eam6yG2ny/HW5iG7yHJOUOIiAz1GIkaRQVwo0xyPvkcqoUaPg8ePHkXKanJxs6ujoWIgkzFMhTFI8TYyI3BIbSe3Zs8dkxowZ8l9XDACws7MD6enpb/HSeyRlY2MzJSMjA+kGmN27d4OZM2eKajSFSUpEbMDTUMRGUlu3boVeXl5IaI8aNcr5+PHjb329e49QpFJpO4lEUoNiQYxLETBJoVQClpEHAbGRFJWlB82tEmh21KOiogKfP38uD85vtsUjKVTksJxCIiA2kkK
dj1JRUQHPnz9/jz+aJZQJEybAgwcPIhVMbm5usYWFBepGZSSbTArhkRST6GLdJAJiIqkTJ04UjRw50hAls46OjiA5OVk2kioqKtpnZGTkjGJIX18flJSUiGY0hUkKpQqwjDwIiImkJBIJlEql8oT/X9uioqJ4IyMjl3eFWyITvBRBQQ69Q6ooLEQbAmIiKdRXvddgNstHHyQpDQ0NSG5tQPmLiIjY7Onp6YMiyzcZitd98S0c7A8PEQgKCgKrV68W/NvHypUrNy1btmwRCsQaGhrkVir5SOrixYtPBg0a1AHFoJguDbWxsYEZGRkoMGAZjIBMCGzZsgUsXLhQ8CTVq1cviLpp/MKFC0/JSxuaA6w1YBT+lW/MmDEnjh49aitTteFGGAEEBMaPH+916NChCARRvokwwhctkpSqqipEPefHz88vICwsDPnGUb6gf+XKlRP9+/fHJMWXhIjQj/z8fC8zMzNBk9S8efP8d+7ciXQgvaqqKmhoaPggF7VIUqmpqdDe3h6pLNTV1UFtbW1rIzUk3RwIUXlCcOAuNikwBATfT6isrUxNTQX29vZoJFVbW2uprq5+mkLCBQ/+69gxSVEoAiz6YQQkEgmQSqVi6CfIfaS2tna4urp6zodQahUcKuseZsyYUbtnz552Qi/SuXPnwsjISKGHgf3nIQJeXl4nt27dOpKHrsnskoODw7OUlBR1mQXeaCgLSbdKUlOnTl0fFxfni+KAWPbyJSUlqU6aNKkeBQMsgxFoBYFW+yDfEaSyV2/69Okb9u7du6SlGGUFCHkol5aWtn/06NGT+Q50a/4ZGRnBoqKi1prh3zECMiMwZMgQcO7cOVn7oMx62WyYmJiYMHnyZCr9u9X4W21ABtyjRw94584dpNi7desG7t27J5MdJAMsCd27d++7bt26HWXJHDajAAhACNsSBCHoa3KoTAeR5+HfuXOnVW5otQFZK3fv3l3dvXv3QNS6KS4unmZoaBiHKs8XudGjRxelpaUhbZ7kSwzYD34gsGTJkqz169db88MbNC/y8/OnmpmZ/YImDcCff/4Z+umnnwa1Ji8TSb1WgvzKN3HiRHDgwAF5bLXmN2e/t23bFtbX4+kpzhIgAsOfffYZuH37tuD7g52dHUxPT6eSEZkwkKkR6cXq1ashuccI9a+ioqKTtrZ2Fao8X+QuX76sMmDAAEEP0fmCpSL6QX5MamxsVCYI4qWQ46+oqOiora1diRpDaGgoCAwMlIl/ZGpEOlJYWCgxMTEpRXUqJCQEhISEyGwP1Q4bctu2bTNdsGDBTTZsYRuiQ0AUfcDLywtu3boVOTn5+fn6ZmZmMvGJXICZmprCggKk89WbgpHLHjICLAhmZGSYjxw58nxjYyML1rAJoSNAnjp56NAh3W+//fah0GOhOv1jamoKCgoKZOYCmRuSjiUnJ/s4OjpuRAXZ29v7THh4uCWqPN/kIIS66urq0rq6Or65hv3hEQKffPIJ+P3337W0tLSQX494FA6YPn16zt69e4eh+pSSkrLYwcFhk6zycpEUqVRdXR3W1tbKqv+tdmJZ3Plu8Hv27Pnd3d39SwiRvy0g4YmF+I1AmzZtQE5OzrWvv/66D789lds75EJH2dMrN0n9+OOPWT/88MMIucN6LfDDDz+c+PHHH0ehyvNZbtmyZXDlypV8dhH7xhICy5cv/3PFihWfsWSONTPe3t7Hw8PDkbfxrF69OjsoKMhKHoflJimq76NiHU29CbqHh4f5X3/9dR71Mgt5Eojb8geB2bNng27dun0VHBycxx+vaPcEeRT12hO5OUduAdIQ1RFDcnIycHR0RLJNO+RYIUYAIyATArGxsdDV1VWmts01Wr58OVixYoXc/V5uAdJ4cXFxW0NDQ6qzxUi2kRHCghgBjABVBCiNooqLi9UMDQ3lXgmNTBQzZ86E5LXqqH+urq41sbGx7VHlsRxGACPAHgKOjo7VycnJGqgWyVfhqKgoJL5BEiIdDQkJUQoJCaG6ahbZPipYWA4jgBFAQoDSKApC2IYgCKRFhZRIYuzYsfDIkSNIEZNCdnZ2ID09nZIPyMaxIEYAIyATAlZWVjArK0umts01GjduHDh8+DByP0cWJJ2BENoQBHES2XsAQGpq6jf29vaZVHRgWYwARoAZBGJiYqzd3NxOUdEOIbQlCAL5XjhKJEU6/tVXXz09f/488txS27ZtQX19PWU/qICIZTECGIHmEaByYxSpcdCgQdUXL15Eur+zySO6yIHS+2p+fv4zMzMz5Ek5XGAYAYwA/QicO3euZsiQIVTvKKDMMZQVkNAsWbIErl+/HhklRVjgiQwOFsQIcIQAlbPLSZd9fX3Bhg0bKHMMZQWkM+Xl5Zo6OjqUzooaMmRI+blz5zpzlA9sFiOAEXgDAWtra5iZSW2quLy8vKOOjs4TqsDSQlKkE9u2bXu2YMECpGttmoLIzMycb21tvYNqUFgeI4ARQEfg8OHDHuPGjduOrgGAiIiIWk9PT6qviq9coI2kSGVUj9ZVUlICL1++FPzh9FSSi2UxAlwiACFUJQhC7lXhb/pM98cwWkmqqKiowMjIyIQKyO7u7iA6OppWv6j4g2UxAoqEwOTJk2FiYiKlkG/fvl342WefmVJS8oYw7WRA5bypJr/q6uqy1dTU5DrOgS5AsB6MgKIiUF1dndW+fXvkY5hI3NTU1EBdXR2tvEKrMtLJpKQkvUmTJpVQSTTdw0UqvmBZjICiIKCiogKfP39OKdykpCT9SZMmyXR2uayGaCcp0jDVJQmkDvJS1MTEREb8kxUc3A4joCgIODg4wJSUFErh+vn5gbCwMNr7LO0Km6JUVlaGL168oBT0rVu3So2NjfUpKcHCGAGMQIsIFBQUlJiamupRgYk8Kvnly5eM8AkjSslgL126tHvgwIHuVAInZSGExgRBFFHVg+UxAhiB9xGoq6szUlNTu0UVm7y8vGhzc/OZVPU0J88YSZHGevXqBW/cuEHJ786dO4N//vmHUT8pOYiFMQICRkBLSws+fvyYUgRffPEF+OOPPxjro4wpfj0KUqLjptbp06eX7d27V0IJSSyMEcAIvIXAlClTpAkJCbpUYaFyVpQsthklKdIBBwcHp5SUFGoLL/49ZG9CSEgItZk9WRDBbTACCoCAt7e3Q3h4+EGqoWZlZVVbWVlROuWgNR8YJynSgYEDB8JLly615kurv0MIpxIEsa/VhrgBRgAj8EEEIIQuBEHEUYVowIAB4PLly4xzCOMGmoCguqOa1MPkFwSqCcPyGAGhINCmTRv48iW1k7/ZPLmENZKqrKys7NSpU0eqiezTpw95ZTVrflP1F8tjBPiEAB0fs8h4qqqqqjp27NiJjdhY7eyOjo6QvHOP6p+/vz9Yu3Ytq75T9RnLYwS4RoCORdZkDPb29uSx36z1P9YMNSWoffv2sLq6mnK+Nm3atMvHx2cuZUVYAUZAARDYvHlz5KJFi+ZQDVVDQwPU1NSwyhusGiMBqqysDOvUqdNSqmCR8g8ePDhgYGAwiQ5dWAdGQKwIPHjwIMnAwGAiHfE9ef
Jknaamph8dumTVwTpJkY5FR0ffdHd3p+Uoh9LSUn89Pb0wWQPG7TACioRAaWmpn56e3lo6Yo6NjS1wdXXtSYcueXRwQlKkg8OHD4enT5+Wx9dm25JfGbKysrqMGDHiH8rKsAKMgIgQyM7O7mxlZfU3hJTuSXmFiJWVFdnPOOELToySQUMI1VRVVWupHg1B6lJWVgYvXrzgLBYR1TUORUQI0LHJn4RDRUUFNDQ0qBMEUccFPJx27JcvX85TVlbeQQfTk2dQ1dXVaRAE8YwLILFNjABfEIAQtlNXV6+pq6POKeSbSn19/XxVVVXO7h7glKTIpKakpPg5ODjQ8s7ctWtXUFZWJiEIoowvBYP9wAiwiQCEUFcikUjLyujpAvv37/d3cnLidM6Xc5IiExgREQE9PT1pySVJVA8fPuRFXLQEhJVgBORAQCKRQKlUKofEh5tu2bIFLFy4kPO+xLkDTRDRcc9Xky58/DAtNYqVCAwBNTU1SMcrHhm2paUlyMnJ4QU/8MKJplrQ1dWFdA1TdXV1gVQqxa9+Auto2F35ESBf8fT09KR0jaD49jbCK5Ii00PH5sc3SI8kKjyZLn/dYwmBIEBOkuvq6tY8fPiQFo/5uImfdyT1+PHj7lpaWndoQfzfC0vJrxO8i5Ou+LAexUaA6oW876L3+PHjHlpaWnf5hCovO+++ffv6u7i4XKYLKPLpcOrUKbzgky5AsR7OESAXatrY2PxN9bKTNwOJiYkZ4ObmdoXz4N5xgJckRfqYlZWVZ2VlNZhOwB48eOBvYGDA6edUOuPBuhQTgbKyMj+JRLKWjvWFTQhmZGRcsLGxMecjorwlKRKszMzMO9bW1t3pBO7evXsHunXrhjcl0wkq1sUaAiUlJUn6+vq0bBZucvrUqVN3v/nmmx6sBSGnIV6TFBlLSkpKpYODA+XD8t7EZc2aNbsCAgLwMS9yFgtuzi0CmzZtivTx8aF83MqbUSQlJVVNmjSJlcPrUNHjPUmRgfn4+FzctGnTQNQgm5Pz9fUFGzZsEET8dMaNdQkTAV9fX7hhwwZanff09LwUERExiFalDCgTTCedO3futcjIyN50YsD0fWF0+op1KS4CX3zxBfzjjz9oBWDevHnXd+7c+SWtShlSJhiSIuP39/eHa9fSss3vPzjJL38vXrzAt9AwVGBYLToC5K0uysrKcVQvTXjXA6Edvy0okiLBZmJERepdvHjxhI0bN+J7/dD7FJakEYGAgACHNWvWUL4X712X5s6dez0yMlIQI6gm3wVHUqTjCxcuvLhlyxZa56hIvZMmTSpLSkrCNyXT2NmwKvkRcHFxke7bt4/yzcLvWl6wYMGlbdu28X4O6l2/BUlSZBCpqamV9vb2tH71I/Vqa2uTW2mM1dTUiuQvLyyBEUBHAEJopKOjc6uiogJdyQckDxw4UDVx4kRef8X7UNCCJSkyICbWUTUBdf369dLevXvr014tWCFGoBkECgoKSkxNTfWYAOfkyZN3bW1tebsOqrWYBU1Sr4kqz9ramtaV6U2gOTg4kOu0BI9Ra0WAf+cWAXt7e5iamsqIE5mZmResra15uZJc1oBF0QFjYmL6u7m50bbX703wyPOdKyoqsjt06GAlK6i4HUZAFgRqa2szO3bsaNXQ0CBLc7nbREVFDZg9ezbv9uLJG4goSIoMmjw9oXPnznfo/lzbBOjkyZNBQkJCW4IgmKkoeTOH2wsWAQihqpubW31sbCwjMZDLakpKSnro6ury6jQD1GBFQ1JNANB5cF5zoCYlJc2fNGkSZ4fSoyYay/EDgcOHD3uMHz9+O52bg9+MjG8H1tGBuuhIigTlm2++gadOnaIDn2Z1DB8+HJw+fVqU2DEGGlYMbG1t4cmTJxlDQqx1KdqOtmXLFrhw4ULGCoK86ufcuXPPLCwsNBgzghWLAoHz58/XWFhYtGNq9ESCtHXrVuDl5SXK/izKoJoqOyEhwc/Z2ZnWc3fe7TWqqqpg+/bt38yaNStTFD0KB0EbAgkJCdZubm6n6uvradP5riLyYZmYmMj5tVOMBQgAEDVJkcA1NDTMa9eu3Q46TzBsLiFcXkPNZIFg3WgI0Hn70Yc8IL8819TUeKiqqu5E81IYUqInKTIN5JXutra2tRkZGYxnxdHRsSY5Obk944awAV4iMHXq1Oq4uDjGpwCGDh0Kzpw5w9nV52yCrxAk1QToL7/8cnPatGmmbAC8a9cuMGfOHIXClw1c+WojMTERkstU2Pj7+eefC2bNmtWTDVt8sKFwnaiysjJMX19/aXV1NSv4e3t7nwgPDx/FijFshHUE/Pz8joeFhY1kw7CGhgYoKytbp6mp6ceGPb7YUDiSagLewcEBpqSwdzKLi4vLmX379lnyJfHYD2oILFiwIGf79u3DmPxi96aHEyZMAAcPHlTI/qqQQTcl/8mTJ5UdO3bsyFahkXbnz58PVq1a1UlbW7uKWjfB0mwjUFFR0XHdunWVdB+82FIc5Ne7ioqKKi0tLUGeYEBHjhSapJoAHDhwILx06RIdeMqsY+TIkSA0NHRa//7942QWwg05QaCwsHBqYGDgL2yOvMlA+/XrB65evarwfVThAWiq+uzs7KcjRoxg/ascuY1hy5YtiU5OTlM46YHY6AcROHjwYIK3t/fkBw8esI6Svb39lNTU1ETWDfPQICapN5ICIVTq06fPy2vXrrGeKnJYP2rUqNr09PR2rBvHBt9CwMnJ6VlSUpI6m9MATQ4YGRmBW7dutSEIohGn5V8EMEk1UwlXr17dPXDgQHemTlRorfjIRXpHjx4lSQvnpzWwaPp90aJF/tu3b1/D5OrwllwlTy7Izc2NHjx48EyaQhKNGtwJWkiln58fDAvj9lZ28snq4uKyefny5T6iqTqeBLJly5ZN0dHRi37//XdOPcJ3QLYMPyapVsrz6NGjek5OTiXPnj3jtJBJ4+T8VVRUVPGYMWOMOHdGoA6cP3++aOLEiYYlJSWcR6CmpgYiIyP1XV1dSzl3hscOYJKSMTl37twpMDExMeHqdaA5N9evXw86d+5sOmPGjEIZw1C4ZvHx8SZPnz4tmDdvHrk9ihfxt23bFly7dq3Q2NiYld0PvAiaghOYpOQEb/v27c/mz5+vLqcY483NzMzAd999lxwcHHxRTU1tPeMGeWpAKpW2c3FxWVBfX78uNzeXd15GRETUenp64o8jcmQGk5QcYDU1LS8v11y7dm0VOZLh4x/5pdDQ0BAMGDDgTGxs7N8qKiqOfPSTDp9evHiR7Orq2uXy5cvDioqKeDNaeje2JUuWkDdwd9TR0XlCR9yKpAOTFMVsW1hYPM3NzWV9fRWK2+RXw/DwcLB3796OW7ZsAebm5oLpMHl5eZrkIYZeXl5Vc+bMIY8oQYGAdZn+/ftXX7lypQPrhkVkEJMUDcmEENqMGzfu5JEjR2jQxr6KPn36ADs7O3Ds2LGps2fPBq6urrBjx47xbHtSVVXlHBsbS0RFRYFvv/02Lj09HXD95Q0Vg3HjxoFDhw7ZEgTB/PlAqE4KRA6TFI2JCgkJUSopKXn5888/0
6iVe1VaWlpAV1f31e3O5L+LFy8mGhsbAz09PdC9e3egr68PyHU+5G9Nf+QtvOQ6M/Ir2t27d0FpaSm5SBEMGjRoMvkb+e/hw4ev/ldMfyTJ79q1Cy/GpDGpmKRoBLNJVXFxcdv4+Pi65cuXM6Adq+QjAj4+PsDDw0PN0NCQubOC+Rg4Cz5hkmIY5FWrVmWtXr16RG1tLcOWsHq2ESDXOQUHB2cHBwfji2MZBB+TFIPgvqn62LFjPosXL95YWIiXNLEEOWNmTE1NQWBg4OJp06ZtYswIVvwfApikWC6GwsJCyfHjx0u9vb1ZtozNUUVgxYoVwMnJSc/ExERKVReWlx0BTFKyY0V7y5KSktVDhgzxu3fvXhvalWOFtCDQrVs3kJ2dHfrpp58G0aIQK5EbAUxSckPGjMDUqVPXZ2Vl+Uql+CHNDMKya/3oo4/I24Y3xMXFLZFdCrdkCgFMUkwhi6iX/DL4xx9/1Dk5OZF3BiJqwWLyIkBe8pqUlARGjhw5XF1dPUdeedyeOQQwSTGHLS2aL1269GTEiBEd2LrdhhanBaKEvH0lKyvr6eDBgzUF4rJCuolJSkBpv3379r6lS5c6kyvbnz9/LiDP+eEquS2IXAm+mBfrDgAAATNJREFUYsWK+F69ernwwyvsRWsIYJJqDSGe/k7u9p81a9a4xsbG+OPHj/PUS+7dIrf7PH361DkpKemwRCLh/lAw7iERnAeYpASXsuYdLiws7KCpqRnh6+vrGh/P+rY73qDo4uICNmzYEFtVVeVpYmLylDeOYUeQEcAkhQydMATT09Ozly1bJnn06JERF7eeMIWSgYEB6NChQ1FwcLDU2dl5BFN2sF7uEcAkxX0OWPfgxo0bi4OCgkCfPn02nDlzBpw9exa8ePGCdT9aM6ikpERuSAbDhg0D//vf/3xXrlwJevXqtbE1Ofy7uBDAJCWufFKOJj8/32j//v0gNTUVbN++/RZJYOQJl9evXwd0ruGSSCSgd+/ewMLC4hUJeXh4GNvb25MruoGZmVkR5UCwAtEg8P8AzPpLxGQ7o9oAAAAASUVORK5CYII="
| 11,189 | 22,377 | 0.971892 | 548 | 22,378 | 39.687956 | 0.998175 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.149171 | 0.000045 | 22,378 | 1 | 22,378 | 22,378 | 0.822764 | 0 | 0 | 0 | 0 | 1 | 0.999374 | 0.999374 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
1f96c291eac8f6324e3f9478c23799bf0c94fbc8 | 31,201 | py | Python | tests/test_api.py | mathrick/testrail-python | 64e60dac202e4767057b09920f77ec73dfb320ba | [
"MIT"
] | 32 | 2016-03-14T20:30:46.000Z | 2021-12-31T09:14:48.000Z | tests/test_api.py | mathrick/testrail-python | 64e60dac202e4767057b09920f77ec73dfb320ba | [
"MIT"
] | 123 | 2016-01-21T20:36:45.000Z | 2022-02-25T19:37:49.000Z | tests/test_api.py | mathrick/testrail-python | 64e60dac202e4767057b09920f77ec73dfb320ba | [
"MIT"
] | 41 | 2016-03-09T22:40:46.000Z | 2022-02-27T22:44:42.000Z | import ast
import copy
from datetime import datetime, timedelta
import mock
import os
import shutil
import util
try:
import unittest2 as unittest
except ImportError:
import unittest
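# Python 2 compatibility shim: prefer the lazy itertools.ifilter as filter;
# on Python 3 the import fails and the already-lazy built-in filter is used.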
try:
from itertools import ifilter as filter
except ImportError:
pass
from testrail.api import API
from testrail.helper import TestRailError
class TestBase(unittest.TestCase):
def setUp(self):
self.client = API()
def test_set_project_id(self):
self.client.set_project_id(20)
self.assertEqual(self.client._project_id, 20)
class TestConfig(unittest.TestCase):
def setUp(self):
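        # Back up any real ~/.testrail.conf and install the fixture config so
        # every test starts from a known configuration file.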
home = os.path.expanduser('~')
self.config_path = '%s/.testrail.conf' % home
self.config_backup = '%s/.testrail.conf_test_orig' % home
self.test_dir = os.path.dirname(os.path.abspath(__file__))
if os.path.isfile(self.config_path):
shutil.move(self.config_path, self.config_backup)
shutil.copyfile('%s/testrail.conf' % self.test_dir, self.config_path)
def tearDown(self):
if os.path.isfile(self.config_path):
os.remove(self.config_path)
if os.path.isfile(self.config_backup):
shutil.move(self.config_backup, self.config_path)
if os.environ.get('TESTRAIL_USER_EMAIL'):
del os.environ['TESTRAIL_USER_EMAIL']
if os.environ.get('TESTRAIL_USER_KEY'):
del os.environ['TESTRAIL_USER_KEY']
if os.environ.get('TESTRAIL_URL'):
del os.environ['TESTRAIL_URL']
if os.environ.get('TESTRAIL_VERIFY_SSL'):
del os.environ['TESTRAIL_VERIFY_SSL']
def test_no_env(self):
client = API()
config = client._conf()
self.assertEqual(config['email'], 'user@yourdomain.com')
self.assertEqual(config['key'], 'your_api_key')
self.assertEqual(config['url'], 'https://<server>')
self.assertEqual(client.verify_ssl, True)
def test_user_env(self):
email = 'user@example.com'
os.environ['TESTRAIL_USER_EMAIL'] = email
client = API()
config = client._conf()
self.assertEqual(config['email'], email)
self.assertEqual(config['key'], 'your_api_key')
self.assertEqual(config['url'], 'https://<server>')
def test_key_env(self):
key = 'itgiwiht84inf92GWT'
os.environ['TESTRAIL_USER_KEY'] = key
client = API()
config = client._conf()
self.assertEqual(config['email'], 'user@yourdomain.com')
self.assertEqual(config['key'], key)
self.assertEqual(config['url'], 'https://<server>')
def test_url_env(self):
url = 'https://example.com'
os.environ['TESTRAIL_URL'] = url
client = API()
config = client._conf()
self.assertEqual(config['email'], 'user@yourdomain.com')
self.assertEqual(config['key'], 'your_api_key')
self.assertEqual(config['url'], url)
def test_ssl_env(self):
os.environ['TESTRAIL_VERIFY_SSL'] = 'False'
client = API()
self.assertEqual(client.verify_ssl, False)
def test_no_config_file(self):
os.remove(self.config_path)
key = 'itgiwiht84inf92GWT'
email = 'user@example.com'
url = 'https://example.com'
os.environ['TESTRAIL_URL'] = url
os.environ['TESTRAIL_USER_KEY'] = key
os.environ['TESTRAIL_USER_EMAIL'] = email
client = API()
config = client._conf()
self.assertEqual(config['url'], url)
self.assertEqual(config['key'], key)
self.assertEqual(config['email'], email)
self.assertEqual(client.verify_ssl, True)
def test_config_no_email(self):
os.remove(self.config_path)
shutil.copyfile('%s/testrail.conf-noemail' % self.test_dir,
self.config_path)
with self.assertRaises(TestRailError) as e:
API()
self.assertEqual(str(e.exception),
('A user email must be set in environment ' +
'variable TESTRAIL_USER_EMAIL or in ~/.testrail.conf'))
def test_config_no_key(self):
os.remove(self.config_path)
shutil.copyfile('%s/testrail.conf-nokey' % self.test_dir,
self.config_path)
with self.assertRaises(TestRailError) as e:
API()
self.assertEqual(str(e.exception),
('A password or API key must be set in environment ' +
'variable TESTRAIL_USER_KEY or in ~/.testrail.conf'))
def test_config_no_url(self):
os.remove(self.config_path)
shutil.copyfile('%s/testrail.conf-nourl' % self.test_dir,
self.config_path)
with self.assertRaises(TestRailError) as e:
API()
self.assertEqual(str(e.exception),
('A URL must be set in environment ' +
'variable TESTRAIL_URL or in ~/.testrail.conf'))
def test_config_verify_ssl_false(self):
os.remove(self.config_path)
shutil.copyfile('%s/testrail.conf-nosslcert' % self.test_dir, self.config_path)
client = API()
self.assertEqual(client.verify_ssl, False)
class TestHTTPMethod(unittest.TestCase):
def setUp(self):
self.client = API()
@mock.patch('testrail.api.requests.get')
def test_get_ok(self, mock_get):
mock_response = mock.Mock()
return_value = {
"announcement": "..",
"completed_on": None,
"id": 1,
"is_completed": False,
"name": "Datahub",
"show_announcement": True,
"url": "http://<server>/index.php?/projects/overview/1"
}
expected_response = copy.deepcopy(return_value)
mock_response.json.return_value = return_value
mock_response.status_code = 200
mock_get.return_value = mock_response
url = 'https://<server>/index.php?/api/v2/get_project/1'
actual_response = self.client._get('get_project/1')
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_bad_no_params(self, mock_get):
mock_response = mock.Mock()
expected_response = {
'url': 'https://<server>/index.php?/api/v2/get_plan/200',
'status_code': 400,
'payload': None,
'response_headers': "Mock headers",
'error': 'Invalid or unknown test plan'
}
url = 'https://<server>/index.php?/api/v2/get_plan/200'
mock_response.json.return_value = {
'error': 'Invalid or unknown test plan'
}
mock_response.headers = "Mock headers"
mock_response.status_code = 400
mock_response.url = url
mock_get.return_value = mock_response
with self.assertRaises(TestRailError) as e:
self.client._get('get_plan/200')
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
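        # The TestRailError stringifies to a dict literal; parse it back with
        # ast.literal_eval so the error payload can be compared field by field.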
self.assertEqual(expected_response, ast.literal_eval(str(e.exception)))
class TestUser(unittest.TestCase):
def setUp(self):
self.client = API()
self.mock_user_data = [
{
"email": "han@example.com",
"id": 1,
"is_active": 'true',
"name": "Han Solo"
},
{
"email": "jabba@example.com",
"id": 2,
"is_active": 'true',
"name": "Jabba the Hutt"
}
]
self.users = copy.deepcopy(self.mock_user_data)
def tearDown(self):
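        # Clear the API's shared cached state so cached users/projects do not
        # leak between test cases.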
util.reset_shared_state(self.client)
@mock.patch('testrail.api.requests.get')
def test_get_users(self, mock_get):
mock_response = mock.Mock()
expected_response = self.users
url = 'https://<server>/index.php?/api/v2/get_users'
mock_response.json.return_value = self.mock_user_data
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.users()
timeout = self.client._timeout
        # age the cached timestamp so the cache expires 1 second from now (still valid)
delta = timedelta(seconds=timeout-1)
self.client._users['ts'] = datetime.now() - delta
actual_response = self.client.users() # verify cache hit
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_users_cache_timeout(self, mock_get):
self.client = API()
mock_response = mock.Mock()
expected_response = self.users
url = 'https://<server>/index.php?/api/v2/get_users'
mock_response.json.return_value = self.mock_user_data
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.users()
timeout = self.client._timeout
self.client._users['ts'] = datetime.now() - timedelta(seconds=timeout)
        actual_response = self.client.users() # verify cache timed out
c = mock.call(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
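        # Two full GET + json() cycles are expected, because the expired cache
        # forces a second request.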
mock_get.assert_has_calls([c, mock.call().json()] * 2)
self.assertEqual(2, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_user_with_id(self, mock_get):
mock_response = mock.Mock()
expected_response = next(filter(
lambda x: x if x['id'] == 2 else None, self.users))
url = 'https://<server>/index.php?/api/v2/get_users'
mock_response.json.return_value = self.mock_user_data
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.user_with_id(2)
actual_response = self.client.user_with_id(2) # verify cache hit
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_user_invalid_id(self, mock_get):
mock_response = mock.Mock()
mock_response.json.return_value = self.mock_user_data
mock_response.status_code = 200
mock_get.return_value = mock_response
with self.assertRaises(TestRailError) as e:
self.client.user_with_id(300)
err_msg = "User ID '300' was not found"
self.assertEqual(err_msg, str(e.exception))
@mock.patch('testrail.api.requests.get')
def test_get_user_with_email(self, mock_get):
mock_response = mock.Mock()
expected_response = next(filter(
lambda x: x if x['email'] == 'han@example.com' else None,
self.users))
url = 'https://<server>/index.php?/api/v2/get_users'
mock_response.json.return_value = self.mock_user_data
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.user_with_email('han@example.com')
# verify cache hit
actual_response = self.client.user_with_email('han@example.com')
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_user_invalid_email(self, mock_get):
mock_response = mock.Mock()
mock_response.json.return_value = self.mock_user_data
mock_response.status_code = 200
mock_get.return_value = mock_response
with self.assertRaises(TestRailError) as e:
self.client.user_with_email('invalid@example.com')
err_msg = "User email 'invalid@example.com' was not found"
self.assertEqual(err_msg, str(e.exception))
class TestProject(unittest.TestCase):
def setUp(self):
self.client = API()
self.mock_project_data = [
{
"announcement": "..",
"completed_on": None,
"id": 1,
"is_completed": False,
"name": "Project1",
"show_announcement": True,
"url": "http://<server>/index.php?/projects/overview/1"
},
{
"announcement": "..",
"completed_on": False,
"id": 2,
"is_completed": True,
"name": "Project2",
"show_announcement": True,
"url": "http://<server>/index.php?/projects/overview/2"
}
]
self.projects = copy.deepcopy(self.mock_project_data)
def tearDown(self):
util.reset_shared_state(self.client)
@mock.patch('testrail.api.requests.get')
def test_get_projects(self, mock_get):
mock_response = mock.Mock()
expected_response = self.projects
url = 'https://<server>/index.php?/api/v2/get_projects'
mock_response.json.return_value = self.mock_project_data
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.projects()
timeout = self.client._timeout
        # age the cached timestamp so the cache expires 1 second from now (still valid)
        delta = timedelta(seconds=timeout-1)
        self.client._projects['ts'] = datetime.now() - delta
actual_response = self.client.projects() # verify cache hit
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_projects_cache_timeout(self, mock_get):
mock_response = mock.Mock()
expected_response = self.projects
url = 'https://<server>/index.php?/api/v2/get_projects'
mock_response.json.return_value = self.mock_project_data
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.projects()
timeout = self.client._timeout
self.client._projects['ts'] = datetime.now() - timedelta(
seconds=timeout)
        actual_response = self.client.projects() # verify cache timed out
c = mock.call(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
mock_get.assert_has_calls([c, mock.call().json()] * 2)
self.assertEqual(2, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_project_id(self, mock_get):
mock_response = mock.Mock()
expected_response = next(filter(
lambda x: x if x['id'] == 1 else None, self.projects))
url = 'https://<server>/index.php?/api/v2/get_projects'
mock_response.json.return_value = self.mock_project_data
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.project_with_id(1)
actual_response = self.client.project_with_id(1) # verify cache hit
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_project_invalid_id(self, mock_get):
mock_response = mock.Mock()
mock_response.json.return_value = self.mock_project_data
mock_response.status_code = 200
mock_get.return_value = mock_response
with self.assertRaises(TestRailError) as e:
self.client.project_with_id(300)
err_msg = "Project ID '300' was not found"
self.assertEqual(err_msg, str(e.exception))
class TestSuite(unittest.TestCase):
def setUp(self):
self.client = API()
self.client.set_project_id(1)
self.mock_suites_data_1 = [
{
"description": "..",
"id": 1,
"name": "Setup & Installation",
"project_id": 1,
"url": "http://<server>/index.php?/suites/view/1"
},
{
"description": "..",
"id": 2,
"name": "Setup & Installation",
"project_id": 1,
"url": "http://<server>/index.php?/suites/view/2"
}]
self.mock_suites_data_2 = [
{
"description": "..",
"id": 3,
"name": "Setup & Installation",
"project_id": 2,
"url": "http://<server>/index.php?/suites/view/1"
},
{
"description": "..",
"id": 4,
"name": "Setup & Installation",
"project_id": 2,
"url": "http://<server>/index.php?/suites/view/2"
}
]
self.suites_1 = copy.deepcopy(self.mock_suites_data_1)
self.suites_2 = copy.deepcopy(self.mock_suites_data_2)
def tearDown(self):
util.reset_shared_state(self.client)
@mock.patch('testrail.api.requests.get')
def test_get_suites(self, mock_get):
mock_response = mock.Mock()
expected_response = self.suites_1
url = 'https://<server>/index.php?/api/v2/get_suites/1'
mock_response.json.return_value = self.mock_suites_data_1
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.suites()
actual_response = self.client.suites() # verify cache hit
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_suites_with_project(self, mock_get):
mock_response = mock.Mock()
expected_response = self.suites_2
url = 'https://<server>/index.php?/api/v2/get_suites/2'
mock_response.json.return_value = self.mock_suites_data_2
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.suites(2)
actual_response = self.client.suites(2) # verify cache hit
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_suites_invalid_project(self, mock_get):
mock_response = mock.Mock()
mock_response.json.return_value = {}
mock_response.status_code = 400
mock_get.return_value = mock_response
with self.assertRaises(TestRailError):
self.client.suites(20)
@mock.patch('testrail.api.requests.get')
def test_get_suites_cache_timeout(self, mock_get):
mock_response = mock.Mock()
expected_response = self.suites_1
url = 'https://<server>/index.php?/api/v2/get_suites/1'
mock_response.json.return_value = self.mock_suites_data_1
mock_response.status_code = 200
mock_get.return_value = mock_response
self.client.suites()
timeout = self.client._timeout
self.client._suites[1]['ts'] = datetime.now() - timedelta(
seconds=timeout)
actual_response = self.client.suites() # verify cache timeout
c = mock.call(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
mock_get.assert_has_calls([c, mock.call().json()] * 2)
self.assertEqual(2, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_suites_different_projects_no_cache_hit(self, mock_get):
mock_response = mock.Mock()
expected_response = self.suites_1
url = 'https://<server>/index.php?/api/v2/get_suites/1'
url2 = 'https://<server>/index.php?/api/v2/get_suites/2'
mock_response.json.return_value = self.mock_suites_data_1
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.suites()
actual_response = self.client.suites(2) # verify cache not hit
c1 = mock.call(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
c2 = mock.call(
url2,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
mock_get.assert_has_calls(
[c1, mock.call().json(), c2, mock.call().json()])
self.assertEqual(2, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_suite_with_id(self, mock_get):
mock_response = mock.Mock()
expected_response = next(filter(
lambda x: x if x['id'] == 2 else None, self.suites_1))
url = 'https://<server>/index.php?/api/v2/get_suites/1'
mock_response.json.return_value = self.mock_suites_data_1
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.suite_with_id(2)
actual_response = self.client.suite_with_id(2) # verify cache hit
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_suites_invalid_suite_id(self, mock_get):
mock_response = mock.Mock()
mock_response.json.return_value = {}
mock_response.status_code = 200
mock_get.return_value = mock_response
with self.assertRaises(TestRailError) as e:
self.client.suite_with_id(20)
self.assertEqual(str(e.exception), "Suite ID '20' was not found")
class TestPlan(unittest.TestCase):
def setUp(self):
self.client = API()
self.client.set_project_id(1)
self.mock_plans_data_1 = [
{
"id": 1,
"name": "Test Plan #1",
"is_completed": False,
"description": "..",
"project_id": 1,
"milestone_id": 1,
"url": "http://<server>/index.php?/plans/view/1",
"assignedto_id": None,
"blocked_count": 1,
"completed_on": None,
"created_by": 1,
"created_on": 1393845644,
"untested_count": 6,
"passed_count": 5,
"failed_count": 2,
"entries": []
},
{
"id": 2,
"name": "Test Plan #2",
"is_completed": False,
"description": "..",
"project_id": 1,
"milestone_id": 2,
"url": "http://<server>/index.php?/plans/view/2",
"assignedto_id": None,
"blocked_count": 1,
"completed_on": None,
"created_by": 1,
"created_on": 1393845644,
"untested_count": 6,
"passed_count": 5,
"failed_count": 2,
"entries": []
}
]
self.mock_plans_data_2 = [
{
"id": 3,
"name": "Test Plan #3",
"is_completed": False,
"description": "..",
"project_id": 2,
"milestone_id": 3,
"url": "http://<server>/index.php?/plans/view/3",
"assignedto_id": 1,
"blocked_count": 2,
"completed_on": None,
"created_by": 2,
"created_on": 1393843644,
"untested_count": 6,
"passed_count": 5,
"failed_count": 2,
"entries": []
},
{
"id": 4,
"name": "Test Plan #4",
"is_completed": False,
"description": "..",
"project_id": 2,
"milestone_id": 3,
"url": "http://<server>/index.php?/plans/view/4",
"assignedto_id": 1,
"blocked_count": 2,
"completed_on": None,
"created_by": 2,
"created_on": 1393843644,
"untested_count": 6,
"passed_count": 5,
"failed_count": 2,
"entries": []
}
]
self.plans_1 = copy.deepcopy(self.mock_plans_data_1)
self.plans_2 = copy.deepcopy(self.mock_plans_data_2)
def tearDown(self):
util.reset_shared_state(self.client)
@mock.patch('testrail.api.requests.get')
def test_get_plans(self, mock_get):
mock_response = mock.Mock()
expected_response = self.plans_1
url = "https://<server>/index.php?/api/v2/get_plans/1"
mock_response.json.return_value = self.mock_plans_data_1
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.plans()
actual_response = self.client.plans() # verify cache hit
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_plans_with_project(self, mock_get):
mock_response = mock.Mock()
expected_response = self.plans_2
url = "https://<server>/index.php?/api/v2/get_plans/2"
mock_response.json.return_value = self.mock_plans_data_2
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.plans(2)
actual_response = self.client.plans(2) # verify cache hit
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_plans_invalid_project(self, mock_get):
mock_response = mock.Mock()
mock_response.json.return_value = {}
mock_response.status_code = 400
mock_get.return_value = mock_response
with self.assertRaises(TestRailError):
self.client.plans(20)
@mock.patch('testrail.api.requests.get')
def test_get_plan_with_id(self, mock_get):
mock_response = mock.Mock()
expected_response = next(filter(
lambda x: x if x['id'] == 2 else None, self.plans_1))
url = 'https://<server>/index.php?/api/v2/get_plans/1'
mock_response.json.return_value = self.mock_plans_data_1
mock_response.status_code = 200
mock_get.return_value = mock_response
actual_response = self.client.plan_with_id(2)
actual_response = self.client.plan_with_id(2) # verify cache hit
mock_get.assert_called_once_with(
url,
headers={'Content-Type': 'application/json'},
params=None,
verify=True,
auth=('user@yourdomain.com', 'your_api_key')
)
self.assertEqual(1, mock_response.json.call_count)
self.assertEqual(expected_response, actual_response)
@mock.patch('testrail.api.requests.get')
def test_get_plans_invalid_suite_id(self, mock_get):
mock_response = mock.Mock()
mock_response.json.return_value = {}
mock_response.status_code = 200
mock_get.return_value = mock_response
with self.assertRaises(TestRailError) as e:
self.client.plan_with_id(30)
self.assertEqual(str(e.exception), "Plan ID '30' was not found")
if __name__ == "__main__":
unittest.main()
| 38.615099 | 87 | 0.589789 | 3,614 | 31,201 | 4.849751 | 0.058661 | 0.078736 | 0.037428 | 0.039368 | 0.892338 | 0.858333 | 0.830205 | 0.816055 | 0.776573 | 0.751298 | 0 | 0.015069 | 0.291721 | 31,201 | 807 | 88 | 38.662949 | 0.778044 | 0.010705 | 0 | 0.700947 | 0 | 0 | 0.181011 | 0.024053 | 0 | 0 | 0 | 0 | 0.121786 | 1 | 0.063599 | false | 0.008119 | 0.018945 | 0 | 0.092016 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1f978694c4c6cd23d02a67057cc4e126d2e0da40 | 15,737 | py | Python | src/py_dss_interface/models/Loads/LoadsF.py | davilamds/py_dss_interface | a447c97787aeac962381db88dd622ccb235eef4b | [
"MIT"
] | null | null | null | src/py_dss_interface/models/Loads/LoadsF.py | davilamds/py_dss_interface | a447c97787aeac962381db88dd622ccb235eef4b | [
"MIT"
] | null | null | null | src/py_dss_interface/models/Loads/LoadsF.py | davilamds/py_dss_interface | a447c97787aeac962381db88dd622ccb235eef4b | [
"MIT"
] | null | null | null | # -*- encoding: utf-8 -*-
"""
Created by eniocc at 11/10/2020
"""
import ctypes
from py_dss_interface.models.Base import Base
class LoadsF(Base):
"""
This interface can be used to read/modify the properties of the Loads Class where the values are floating point
numbers (double).
The structure of the interface is as follows:
double DSSLoadsF(int32_t Parameter,double Argument);
    This interface returns a double (IEEE 754, 64 bits). The variable “parameter” (integer) selects the property of
    the class to operate on, and the variable “argument” (double) supplies the new value when a property is being
    modified. Reading and writing a property are separate operations, each identified by its own parameter number.
The properties (parameter) are integer numbers and are described as follows.
"""
def loads_read_kw(self) -> float:
"""Allows to read the kW property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(0), ctypes.c_double(0)))
def loads_write_kw(self, argument) -> float:
"""Allows to write the kW property of the active load. The parameter argument must contain the new value in
kW for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(1), ctypes.c_double(argument)))
def loads_read_kv(self) -> float:
"""Allows to read the kV property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(2), ctypes.c_double(0)))
def loads_write_kv(self, argument) -> float:
"""Allows to write the kV property of the active load. The parameter argument must contain the new value in
kV for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(3), ctypes.c_double(argument)))
def loads_read_kvar(self) -> float:
"""Allows to read the kvar property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(4), ctypes.c_double(0)))
def loads_write_kvar(self, argument) -> float:
"""Allows to write the kvar property of the active load. The parameter argument must contain the new value in
kvar for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(5), ctypes.c_double(argument)))
def loads_read_pf(self) -> float:
"""Allows to read the pf property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(6), ctypes.c_double(0)))
def loads_write_pf(self, argument) -> float:
"""Allows to write the pf property of the active load. The parameter argument must contain the new value in
pf for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(7), ctypes.c_double(argument)))
def loads_read_pct_mean(self) -> float:
"""Allows to read the PctMean property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(8), ctypes.c_double(0)))
def loads_write_pct_mean(self, argument) -> float:
"""Allows to write the PctMean property of the active load. The parameter argument must contain the new value
in PctMean for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(9), ctypes.c_double(argument)))
def loads_read_pct_std_dev(self) -> float:
"""Allows to read the PctStdDev property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(10), ctypes.c_double(0)))
def loads_write_pct_std_dev(self, argument) -> float:
"""Allows to write the PctStdDev property of the active load. The parameter argument must contain the new
value in PctStdDev for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(11), ctypes.c_double(argument)))
def loads_read_allocation_factor(self) -> float:
"""Allows to read the AllocationFactor property of the active load. The parameter argument can be filled with
a 0. """
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(12), ctypes.c_double(0)))
def loads_write_allocation_factor(self, argument) -> float:
"""Allows to write the AllocationFactor property of the active load. The parameter argument must contain the
new value in AllocationFactor for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(13), ctypes.c_double(argument)))
def loads_read_c_factor(self) -> float:
"""Allows to read the CFactor property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(14), ctypes.c_double(0)))
def loads_write_c_factor(self, argument: float) -> float:
"""Allows to write the CFactor property of the active load. The parameter argument must contain the new value
in CFactor for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(15), ctypes.c_double(argument)))
def loads_read_cvr_watts(self) -> float:
"""Allows to read the CVRWatts property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(16), ctypes.c_double(0)))
def loads_write_cvr_watts(self, argument) -> float:
"""Allows to write the CVRWatts property of the active load. The parameter argument must contain the new
value in CVRWatts for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(17), ctypes.c_double(argument)))
def loads_read_cvr_vars(self) -> float:
"""Allows to read the CVRvars property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(18), ctypes.c_double(0)))
def loads_write_cvr_vars(self, argument) -> float:
"""Allows to write the CVRvars property of the active load. The parameter argument must contain the new value
        in CVRvars for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(19), ctypes.c_double(argument)))
def loads_read_kva(self) -> float:
"""Allows to read the kva property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(20), ctypes.c_double(0)))
def loads_write_kva(self, argument) -> float:
"""Allows to write the kva property of the active load. The parameter argument must contain the new value in
kva for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(21), ctypes.c_double(argument)))
def loads_read_kwh(self) -> float:
"""Allows to read the kWh property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(22), ctypes.c_double(0)))
def loads_write_kwh(self, argument) -> float:
"""Allows to write the kWh property of the active load. The parameter argument must contain the new value in
kWh for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(23), ctypes.c_double(argument)))
def loads_read_kwh_days(self) -> float:
"""Allows to read the kWhdays property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(24), ctypes.c_double(0)))
def loads_write_kwh_days(self, argument) -> float:
"""Allows to write the kWhdays property of the active load. The parameter argument must contain the new value
in kWhdays for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(25), ctypes.c_double(argument)))
def loads_read_r_neut(self) -> float:
"""Allows to read the RNeut (neutral resistance for wye connected loads) property of the active load. The
parameter argument can be filled with a 0. """
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(26), ctypes.c_double(0)))
def loads_write_r_neut(self, argument) -> float:
"""Allows to write the RNeut (neutral resistance for wye connected loads) property of the active load. The
parameter argument must contain the new value in RNeut for the desired active load. The return value will be
equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(27), ctypes.c_double(argument)))
def loads_read_vmax_pu(self) -> float:
"""Allows to read the VMaxpu property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(28), ctypes.c_double(0)))
def loads_write_vmax_pu(self, argument) -> float:
"""Allows to write the VMaxpu property of the active load. The parameter argument must contain the new value
in VMaxpu for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(29), ctypes.c_double(argument)))
def loads_read_vmin_emerg(self) -> float:
"""Allows to read the VMinemerg property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(30), ctypes.c_double(0)))
def loads_write_vmin_emerg(self, argument) -> float:
"""Allows to write the VMinemerg property of the active load. The parameter argument must contain the new
value in VMinemerg for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(31), ctypes.c_double(argument)))
def loads_read_vmin_norm(self) -> float:
"""Allows to read the VMinnorm property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(32), ctypes.c_double(0)))
def loads_write_vmin_norm(self, argument) -> float:
"""Allows to write the VMinnorm property of the active load. The parameter argument must contain the new
value in VMinnorm for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(33), ctypes.c_double(argument)))
def loads_read_vmin_pu(self) -> float:
"""Allows to read the VMinpu property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(34), ctypes.c_double(0)))
def loads_write_vmin_pu(self, argument) -> float:
"""Allows to write the VMinpu property of the active load. The parameter argument must contain the new value
in VMinpu for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(35), ctypes.c_double(argument)))
def loads_read_xfkva(self) -> float:
"""Allows to read the xfKVA (Rated service transformer KVA for load allocation, using Allocationfactor.
Affects kW, kvar and pf.) property of the active load. The parameter argument can be filled with a 0. """
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(36), ctypes.c_double(0)))
def loads_write_xfkva(self, argument) -> float:
"""Allows to write the xfKVA (Rated service transformer KVA for load allocation, using Allocationfactor.
Affects kW, kvar and pf.) property of the active load. The parameter argument must contain the new value in
xfKVA for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(37), ctypes.c_double(argument)))
def loads_read_x_neut(self) -> float:
"""Allows to read the Xneut property of the active load. The parameter argument can be filled with a 0."""
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(38), ctypes.c_double(0)))
def loads_write_x_neut(self, argument) -> float:
"""Allows to write the Xneut property of the active load. The parameter argument must contain the new value
in Xneut for the desired active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(39), ctypes.c_double(argument)))
def loads_read_pct_series_rl(self) -> float:
"""allows to read the PctSeriesRL (Percent of Load that is modeled as series R-L for harmonic studies)
property of the active load. The parameter argument can be filled with a 0. """
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(40), ctypes.c_double(0)))
def loads_write_pct_series_rl(self, argument) -> float:
"""allows to write the PctSeriesRL (Percent of Load that is modeled as series R-L for harmonic studies)
property of the active load. The parameter argument must contain the new value in PctSeriesRL for the desired
active load. The return value will be equal to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(41), ctypes.c_double(argument)))
def loads_read_rel_weight(self) -> float:
"""Allows to read the RelWeight (relative weighting factor) property of the active load. The parameter
argument can be filled with a 0. """
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(42), ctypes.c_double(0)))
def loads_write_rel_weight(self, argument) -> float:
"""Allows to write the RelWeight (relative weighting factor) property of the active load. The parameter
argument must contain the new value in RelWeight for the desired active load. The return value will be equal
to 0. """
argument = Base.check_float_param(argument)
return float(self.dss_obj.DSSLoadsF(ctypes.c_int32(43), ctypes.c_double(argument)))
| 61.956693 | 118 | 0.712779 | 2,413 | 15,737 | 4.523829 | 0.079983 | 0.056431 | 0.0786 | 0.076585 | 0.902803 | 0.901154 | 0.870374 | 0.746977 | 0.673873 | 0.673873 | 0 | 0.019725 | 0.201055 | 15,737 | 253 | 119 | 62.201581 | 0.848485 | 0.476965 | 0 | 0.19469 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.389381 | false | 0 | 0.017699 | 0 | 0.80531 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
c814258b404cbf8a7f62f8747036b0344d62c89c | 7,561 | py | Python | tests/test_neuralNetRegression.py | DSAAR/amorf | 5cc5e346e6d4f918d588ff527aaa45136f036851 | [
"MIT"
] | 13 | 2020-03-24T12:03:51.000Z | 2022-03-25T09:15:58.000Z | tests/test_neuralNetRegression.py | DSAAR/amorf | 5cc5e346e6d4f918d588ff527aaa45136f036851 | [
"MIT"
] | null | null | null | tests/test_neuralNetRegression.py | DSAAR/amorf | 5cc5e346e6d4f918d588ff527aaa45136f036851 | [
"MIT"
] | 2 | 2020-08-14T11:30:02.000Z | 2022-03-10T12:28:47.000Z | import unittest
import amorf.neuralNetRegression as nnRegressor
import amorf.datasets as ds
from sklearn.model_selection import train_test_split
from amorf.metrics import average_relative_root_mean_squared_error
import numpy
import torch
import os
class TestLinearNeuralNet(unittest.TestCase):
def setUp(self):
X, y = ds.EDM().get_numpy()
self.X_train, self.X_test, self.y_train, self.y_test = train_test_split(
X, y, test_size=0.1)
self.selectors = ['mean', 'max', 'doubleInput']
self.input_dim = len(self.X_train[0, :])
self.target_dim = len(self.y_train[0, :])
def test_predict_without_GPU(self):
model = nnRegressor.Linear_NN_Model(
self.input_dim, self.target_dim, 'mean')
fittedReg = nnRegressor.NeuralNetRegressor(model=model, patience=1).fit(
self.X_train, self.y_train)
result = fittedReg.predict(self.X_test)
self.assertEqual(next(fittedReg.model.parameters()).is_cuda, False)
self.assertEqual(
result.shape, (len(self.X_test), len(self.y_test[0, :])))
self.assertTrue(type(result) is numpy.ndarray)
self.assertTrue(result.dtype is numpy.dtype('float32') or
result.dtype is numpy.dtype('float64'))
def test_predict_without_GPU_training_limit(self):
model = nnRegressor.Linear_NN_Model(
self.input_dim, self.target_dim, 'mean')
fittedReg = nnRegressor.NeuralNetRegressor(model=model, patience=100, training_limit=1).fit(
self.X_train, self.y_train)
result = fittedReg.predict(self.X_test)
self.assertEqual(next(fittedReg.model.parameters()).is_cuda, False)
self.assertEqual(
result.shape, (len(self.X_test), len(self.y_test[0, :])))
self.assertTrue(type(result) is numpy.ndarray)
self.assertTrue(result.dtype is numpy.dtype('float32') or
result.dtype is numpy.dtype('float64'))
def test_predict_without_GPU_default_model(self):
fittedReg = nnRegressor.NeuralNetRegressor(patience=1).fit(
self.X_train, self.y_train)
result = fittedReg.predict(self.X_test)
self.assertEqual(next(fittedReg.model.parameters()).is_cuda, False)
self.assertEqual(
result.shape, (len(self.X_test), len(self.y_test[0, :])))
self.assertTrue(type(result) is numpy.ndarray)
self.assertTrue(result.dtype is numpy.dtype('float32') or
result.dtype is numpy.dtype('float64'))
def test_predict_with_GPU(self):
if torch.cuda.is_available():
model = nnRegressor.Linear_NN_Model(
self.input_dim, self.target_dim, 'mean')
fittedReg = nnRegressor.NeuralNetRegressor(model=model, patience=1, use_gpu=True).fit(
self.X_train, self.y_train)
result = fittedReg.predict(self.X_test)
self.assertEqual(next(fittedReg.model.parameters()).is_cuda, True)
self.assertEqual(
result.shape, (len(self.X_test), len(self.y_test[0, :])))
self.assertTrue(type(result) is numpy.ndarray)
self.assertTrue(result.dtype is numpy.dtype('float32') or
result.dtype is numpy.dtype('float64'))
def test_save_load(self):
model = nnRegressor.Linear_NN_Model(
self.input_dim, self.target_dim, 'mean')
reg = nnRegressor.NeuralNetRegressor(
model=model, patience=1, use_gpu=True)
reg.save('test')
self.assertTrue(os.path.exists('test'))
newReg = nnRegressor.NeuralNetRegressor()
newReg.load('test')
        self.assertEqual(newReg.model.fc1.in_features, self.input_dim)
        self.assertEqual(newReg.model.fc3.out_features, self.target_dim)
    # TODO: add test for scoring method
def test_score(self):
model = nnRegressor.Linear_NN_Model(
self.input_dim, self.target_dim, 'mean')
reg = nnRegressor.NeuralNetRegressor(
model=model, patience=1, use_gpu=True)
fitted = reg.fit(self.X_train, self.y_train)
score = fitted.score(self.X_test, self.y_test)
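        # Minimal sanity check (the TODO above covers a fuller scoring test);
        # this only assumes score() returns a value.
        self.assertIsNotNone(score)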
class TestConvolutionalNeuralNet(unittest.TestCase):
def setUp(self):
X, y = ds.EDM().get_numpy()
self.X_train, self.X_test, self.y_train, self.y_test = train_test_split(
X, y, test_size=0.1)
self.input_dim = len(self.X_train[0, :])
self.target_dim = len(self.y_train[0, :])
def test_predict_without_GPU(self):
input_dim = len(self.X_train[0, :])
target_dim = len(self.y_train[0, :])
model = nnRegressor.Convolutional_NN_Model(input_dim, target_dim)
fittedReg = nnRegressor.NeuralNetRegressor(model=model, patience=1).fit(
self.X_train, self.y_train)
result = fittedReg.predict(self.X_test)
self.assertEqual(next(fittedReg.model.parameters()).is_cuda, False)
self.assertEqual(
result.shape, (len(self.X_test), len(self.y_test[0, :])))
self.assertTrue(type(result) is numpy.ndarray)
self.assertTrue(result.dtype is numpy.dtype('float32') or
result.dtype is numpy.dtype('float64'))
def test_predict_without_GPU_default_model(self):
fittedReg = nnRegressor.NeuralNetRegressor(patience=1).fit(
self.X_train, self.y_train)
result = fittedReg.predict(self.X_test)
self.assertEqual(next(fittedReg.model.parameters()).is_cuda, False)
self.assertEqual(
result.shape, (len(self.X_test), len(self.y_test[0, :])))
self.assertTrue(type(result) is numpy.ndarray)
self.assertTrue(result.dtype is numpy.dtype('float32')
or result.dtype is numpy.dtype('float64'))
def test_predict_with_GPU(self):
if torch.cuda.is_available():
input_dim = len(self.X_train[0, :])
target_dim = len(self.y_train[0, :])
model = nnRegressor.Convolutional_NN_Model(input_dim, target_dim)
fittedReg = nnRegressor.NeuralNetRegressor(model=model, patience=1, use_gpu=True).fit(
self.X_train, self.y_train)
result = fittedReg.predict(self.X_test)
self.assertEqual(next(fittedReg.model.parameters()).is_cuda, True)
self.assertEqual(
result.shape, (len(self.X_test), len(self.y_test[0, :])))
self.assertTrue(type(result) is numpy.ndarray)
self.assertTrue(result.dtype is numpy.dtype('float32') or
result.dtype is numpy.dtype('float64'))
def test_save_load(self):
model = nnRegressor.Convolutional_NN_Model(
self.input_dim, self.target_dim)
reg = nnRegressor.NeuralNetRegressor(
model=model, patience=1, use_gpu=True)
reg.save('testCNN')
self.assertTrue(os.path.exists('testCNN'))
newReg = nnRegressor.NeuralNetRegressor()
newReg.load('testCNN')
        self.assertEqual(newReg.model.input_dim, self.input_dim)
        self.assertEqual(newReg.model.output_dim, self.target_dim)
def test_score(self):
input_dim = len(self.X_train[0, :])
target_dim = len(self.y_train[0, :])
model = nnRegressor.Convolutional_NN_Model(input_dim, target_dim)
fittedReg = nnRegressor.NeuralNetRegressor(model=model, patience=1).fit(
self.X_train, self.y_train)
score = fittedReg.score(self.X_test, self.y_test)
| 42.240223 | 100 | 0.646872 | 955 | 7,561 | 4.934031 | 0.105759 | 0.0382 | 0.03438 | 0.05348 | 0.882216 | 0.849321 | 0.849321 | 0.823005 | 0.809211 | 0.809211 | 0 | 0.011244 | 0.235419 | 7,561 | 178 | 101 | 42.477528 | 0.80384 | 0.004365 | 0 | 0.79021 | 0 | 0 | 0.022458 | 0 | 0 | 0 | 0 | 0.005618 | 0.237762 | 1 | 0.090909 | false | 0 | 0.055944 | 0 | 0.160839 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c06f104ce6ce390d8aab25984597d5a02f898c63 | 9,447 | py | Python | api/test/test_deploy.py | diego-hermida/ClimateChangeApp | 576d49ec5b76f709cc86874ffb03f4a38dbbbbfd | [
"MIT"
] | 2 | 2018-07-01T20:36:46.000Z | 2019-11-01T22:47:06.000Z | api/test/test_deploy.py | diego-hermida/ClimateChangeApp | 576d49ec5b76f709cc86874ffb03f4a38dbbbbfd | [
"MIT"
] | 1 | 2021-06-10T20:28:53.000Z | 2021-06-10T20:28:53.000Z | api/test/test_deploy.py | diego-hermida/ClimateChangeApp | 576d49ec5b76f709cc86874ffb03f4a38dbbbbfd | [
"MIT"
] | null | null | null | from unittest import TestCase, mock
from unittest.mock import Mock
from pymongo.errors import DuplicateKeyError
import api.deploy as deploy
from api.config.config import API_CONFIG
@mock.patch('sys.argv', ['deploy.py'])
@mock.patch('api.deploy.environ', {})
@mock.patch('api.deploy.recursive_makedir', Mock())
class TestDeploy(TestCase):
def tearDown(self):
from os import environ
try:
del environ['SKIP_DEPLOY']
except KeyError:
pass
@mock.patch('api.deploy.TextTestRunner')
@mock.patch('api.deploy.TestLoader', Mock())
@mock.patch('api.deploy.bulk_create_authorized_users')
@mock.patch('api.deploy.create_user', Mock())
@mock.patch('api.deploy.ping_database', Mock())
@mock.patch('argparse.ArgumentParser')
def test_all_with_tests_everything_ok(self, mock_args, mock_auth_users, mock_test_runner):
import yaml
mock_args.return_value.parse_args.return_value = args = Mock()
args.skip_all = False
        args.with_tests = True
        args.with_test_reports = False
args.all = True
args.db_user = False
args.add_users = False
args.remove_files = False
mock_test_runner.return_value.run = results = Mock()
results.return_value.wasSuccessful.return_value = True
with open(API_CONFIG['AUTHORIZED_USERS_FILEPATH'], 'r', encoding='utf-8') as f:
            users = yaml.safe_load(f)
mock_auth_users.return_value = len(users['authorized_users'].keys())
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertTrue(mock_args.called)
self.assertTrue(mock_auth_users.called)
@mock.patch('api.deploy._execute_tests', Mock(return_value=True))
@mock.patch('api.deploy.bulk_create_authorized_users')
@mock.patch('api.deploy.create_user', Mock())
@mock.patch('api.deploy.ping_database', Mock())
@mock.patch('api.deploy.environ', {'DEPLOY_ARGS': '--all --with-tests'})
def test_all_with_tests_everything_ok_with_args_from_env_variable(self, mock_auth_users):
import yaml
with open(API_CONFIG['AUTHORIZED_USERS_FILEPATH'], 'r', encoding='utf-8') as f:
            users = yaml.safe_load(f)
mock_auth_users.return_value = len(users['authorized_users'].keys())
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertTrue(mock_auth_users.called)
@mock.patch('api.deploy.create_user', Mock(side_effect=DuplicateKeyError('User already exists')))
@mock.patch('api.deploy.ping_database', Mock())
@mock.patch('argparse.ArgumentParser')
def test_create_api_user_when_it_does_already_exist(self, mock_args):
mock_args.return_value.parse_args.return_value = args = Mock()
args.skip_all = False
args.with_tests = False
args.with_test_reports = False
args.all = False
args.db_user = True
args.add_users = False
args.remove_files = False
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertTrue(mock_args.called)
@mock.patch('api.deploy._execute_tests', Mock(return_value=False))
@mock.patch('api.deploy.ping_database', Mock())
@mock.patch('argparse.ArgumentParser')
def test_tests_failed(self, mock_args):
mock_args.return_value.parse_args.return_value = args = Mock()
args.skip_all = False
args.with_tests = True
args.with_test_reports = False
args.all = False
args.db_user = False
args.add_users = False
args.remove_files = False
with self.assertRaises(SystemExit) as e:
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertEqual(1, e.exception.code)
self.assertTrue(mock_args.called)
@mock.patch('argparse.ArgumentParser',
Mock(side_effect=Exception('Test error (to verify anomalous exit). This is OK.')))
def test_anomalous_exit(self):
with self.assertRaises(SystemExit) as e:
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertEqual(1, e.exception.code)
@mock.patch('argparse.ArgumentParser')
def test_skip_all(self, mock_args):
mock_args.return_value.parse_args.return_value = args = Mock()
args.skip_all = True
args.with_tests = False
args.with_test_reports = False
args.all = False
args.db_user = False
args.add_users = False
args.remove_files = False
with self.assertRaises(SystemExit) as e:
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertEqual(0, e.exception.code)
self.assertTrue(mock_args.called)
@mock.patch('api.deploy.ping_database',
Mock(side_effect=EnvironmentError('Test error to verify deploy is aborted.')))
def test_deploy_aborts_if_database_down(self):
with self.assertRaises(SystemExit) as e:
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertEqual(1, e.exception.code)
@mock.patch('api.deploy.bulk_create_authorized_users', Mock(return_value=0))
@mock.patch('api.deploy.ping_database', Mock())
@mock.patch('argparse.ArgumentParser')
def test_deploy_aborts_if_not_all_authorized_users_are_inserted(self, mock_args):
mock_args.return_value.parse_args.return_value = args = Mock()
args.skip_all = False
args.with_tests = False
args.with_test_reports = False
args.all = False
args.db_user = False
args.add_users = True
args.remove_files = False
with self.assertRaises(SystemExit) as e:
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertEqual(1, e.exception.code)
self.assertTrue(mock_args.called)
@mock.patch('yaml.load', Mock(return_value={'authorized_users': {'user1': {'token': 'test_token'}}}))
@mock.patch('api.deploy.ping_database', Mock())
@mock.patch('argparse.ArgumentParser')
def test_deploy_aborts_if_authorized_users_do_not_have_scope(self, mock_args):
mock_args.return_value.parse_args.return_value = args = Mock()
args.skip_all = False
args.with_tests = False
args.with_test_reports = False
args.all = False
args.db_user = False
args.add_users = True
args.remove_files = False
with self.assertRaises(SystemExit) as e:
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertEqual(1, e.exception.code)
self.assertTrue(mock_args.called)
@mock.patch('yaml.load', Mock(return_value={'authorized_users': {'user1': {'scope': 1}}}))
@mock.patch('api.deploy.ping_database', Mock())
@mock.patch('argparse.ArgumentParser')
def test_deploy_aborts_if_authorized_users_do_not_have_token(self, mock_args):
mock_args.return_value.parse_args.return_value = args = Mock()
args.skip_all = False
args.with_tests = False
args.with_test_reports = False
args.all = False
args.db_user = False
args.add_users = True
args.remove_files = False
with self.assertRaises(SystemExit) as e:
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertEqual(1, e.exception.code)
self.assertTrue(mock_args.called)
@mock.patch('coverage.Coverage')
@mock.patch('api.deploy._execute_tests', Mock(return_value=True))
@mock.patch('api.deploy.ping_database', Mock())
@mock.patch('argparse.ArgumentParser')
def test_coverage_report_is_generated_if_tests_ok(self, mock_args, mock_coverage):
mock_args.return_value.parse_args.return_value = args = Mock()
args.skip_all = False
args.with_tests = False
args.with_test_reports = True
args.all = False
args.db_user = False
args.add_users = False
args.remove_files = False
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertEqual(1, mock_coverage.return_value.start.call_count)
self.assertEqual(1, mock_coverage.return_value.stop.call_count)
self.assertEqual(1, mock_coverage.return_value.save.call_count)
@mock.patch('coverage.Coverage')
@mock.patch('api.deploy._execute_tests', Mock(return_value=False))
@mock.patch('api.deploy.ping_database', Mock())
@mock.patch('argparse.ArgumentParser')
def test_coverage_report_is_not_generated_if_tests_fail(self, mock_args, mock_coverage):
mock_args.return_value.parse_args.return_value = args = Mock()
args.skip_all = False
args.with_tests = False
args.with_test_reports = True
args.all = False
args.db_user = False
args.add_users = False
args.remove_files = False
with self.assertRaises(SystemExit) as e:
deploy.deploy(log_to_file=False, log_to_stdout=False, log_to_telegram=False)
self.assertEqual(1, e.exception.code)
self.assertEqual(1, mock_coverage.return_value.start.call_count)
self.assertEqual(1, mock_coverage.return_value.stop.call_count)
self.assertEqual(0, mock_coverage.return_value.save.call_count)
| 45.200957 | 105 | 0.69186 | 1,275 | 9,447 | 4.843922 | 0.101961 | 0.065576 | 0.048575 | 0.072863 | 0.843912 | 0.830311 | 0.825615 | 0.802137 | 0.796308 | 0.784812 | 0 | 0.002641 | 0.198476 | 9,447 | 208 | 106 | 45.418269 | 0.812995 | 0 | 0 | 0.753927 | 0 | 0 | 0.130835 | 0.092834 | 0 | 0 | 0 | 0 | 0.162304 | 1 | 0.068063 | false | 0.005236 | 0.041885 | 0 | 0.115183 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f19d7fe5f8d795aa3e97f3f56b5608aa8f466898 | 131 | py | Python | src/__init__.py | jordansilva/raspberry-f1-dashboard | 96446a348d036a75f4699bab4459eabec16705f8 | [
"Apache-2.0"
] | null | null | null | src/__init__.py | jordansilva/raspberry-f1-dashboard | 96446a348d036a75f4699bab4459eabec16705f8 | [
"Apache-2.0"
] | null | null | null | src/__init__.py | jordansilva/raspberry-f1-dashboard | 96446a348d036a75f4699bab4459eabec16705f8 | [
"Apache-2.0"
] | null | null | null | from .telemetry import Telemetry
from .context import Context
from .services import F12019Socket
from .services import F12020Socket | 32.75 | 34 | 0.854962 | 16 | 131 | 7 | 0.4375 | 0.214286 | 0.321429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086207 | 0.114504 | 131 | 4 | 35 | 32.75 | 0.87931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
f1cdeed64a1bcd37a5f3756a4c40ccbf6b866a61 | 7,444 | py | Python | admin_/forms.py | kells4real/school_management | c6b2dad8c7dea0e4bb1a66bddc0fea1934ce4053 | [
"MIT"
] | null | null | null | admin_/forms.py | kells4real/school_management | c6b2dad8c7dea0e4bb1a66bddc0fea1934ce4053 | [
"MIT"
] | null | null | null | admin_/forms.py | kells4real/school_management | c6b2dad8c7dea0e4bb1a66bddc0fea1934ce4053 | [
"MIT"
] | null | null | null | from django import forms
import re
from management.models import User, School
class StudentRegisterForm(forms.ModelForm):
username = forms.RegexField(regex=re.compile(r'^[A-Za-z0-9_.-]{4,30}$'), required=True,
error_messages={'invalid': "Username must be between 4 and 30 alphanumeric characters including _ or ."}, label="Username")
mother_number = forms.RegexField(regex=re.compile(r'^[0-9+]{10,15}$'), required=False,
error_messages={'invalid': "Enter a valid phone number"}, label="Phone No")
father_number = forms.RegexField(regex=re.compile(r'^[0-9+]{10,15}$'), required=False,
error_messages={'invalid': "Enter a valid phone number"}, label="Phone No")
admission_date = forms.DateField(
widget=forms.TextInput(
attrs={'type': 'date', 'title': 'Date of birth'}
)
)
date_of_birth = forms.DateField(
widget=forms.TextInput(
attrs={'type': 'date', 'title': 'Date of birth'}
)
)
email = forms.EmailField(max_length=200)
password = forms.CharField(label='Password', widget=forms.PasswordInput)
password2 = forms.CharField(label='Repeat password', widget=forms.PasswordInput)
class Meta:
model = User
fields = ('username', 'email', 'first_name', 'last_name', 'date_of_birth', 'student_class', 'gender',
'academic_year', 'image', 'password', 'password2', 'admission_date', 'mother_name', 'mother_number', 'father_name',
'father_number', 'address', 'religion')
labels = {
'first_name': "First name(s)",
}
def clean_password2(self):
cd = self.cleaned_data
if cd['password'] != cd['password2']:
raise forms.ValidationError('Passwords don\'t match.')
return cd['password2']
def clean_email(self):
email = self.cleaned_data['email']
if User.objects.filter(email=email).exists():
raise forms.ValidationError('Please use another Email, this Email has already been used by another user.')
return email
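# A hedged usage sketch (kept as comments so nothing executes at import time): how a
# view might validate and persist a StudentRegisterForm. The payload names and the
# set_password() call assume the custom User model inherits Django's password helpers;
# the values are illustrative, not taken from this project.
#
#   form = StudentRegisterForm(data=request.POST, files=request.FILES)
#   if form.is_valid():
#       student = form.save(commit=False)
#       student.set_password(form.cleaned_data['password'])
#       student.save()
#   else:
#       errors = form.errors  # e.g. mismatched passwords or a reused email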
class AdminRegisterForm(forms.ModelForm):
username = forms.RegexField(regex=re.compile(r'^[A-Za-z0-9_.-]{4,30}$'), required=True,
error_messages={'invalid': "Username must be between 4 and 30 alphanumeric characters including _ or ."}, label="Username")
email = forms.EmailField(max_length=200, help_text='Required')
password = forms.CharField(label='Password', widget=forms.PasswordInput)
password2 = forms.CharField(label='Repeat password', widget=forms.PasswordInput)
class Meta:
model = User
fields = ('username', 'email', 'first_name', 'last_name', 'school', 'image', 'password', 'password2')
def clean_password2(self):
cd = self.cleaned_data
if cd['password'] != cd['password2']:
raise forms.ValidationError('Passwords don\'t match.')
return cd['password2']
def clean_email(self):
email = self.cleaned_data['email']
if User.objects.filter(email=email).exists():
raise forms.ValidationError('Please use another Email, this Email has already been used by another user.')
return email
class TeacherRegisterForm(forms.ModelForm):
username = forms.RegexField(regex=re.compile(r'^[A-Za-z0-9_.-]{4,30}$'), required=True,
error_messages={'invalid': "Username must be between 4 and 30 alphanumeric characters including _ or ."}, label="Username")
email = forms.EmailField(max_length=200, help_text='Required')
phone_no = forms.RegexField(regex=re.compile(r'^[0-9+]{10,15}$'), required=False,
error_messages={'invalid': "Enter a valid phone number"}, label="Phone No")
password = forms.CharField(label='Password', widget=forms.PasswordInput)
password2 = forms.CharField(label='Repeat password', widget=forms.PasswordInput)
date_employed = forms.DateField(
widget=forms.TextInput(
attrs={'type': 'date', 'title': 'Date employed'}
)
)
class Meta:
model = User
fields = ('username', 'email', 'first_name', 'last_name', 'religion', 'subject',
'salary', 'image', 'password', 'password2', 'address', 'date_employed',
'phone_no', 'gender', 'classes')
labels = {
"subject": "Subject(s)"
}
def clean_password2(self):
cd = self.cleaned_data
if cd['password'] != cd['password2']:
raise forms.ValidationError('Passwords don\'t match.')
return cd['password2']
def clean_email(self):
email = self.cleaned_data['email']
if User.objects.filter(email=email).exists():
raise forms.ValidationError('Please use another Email, this Email has already been used by another user.')
return email
class UpdateStudentRegisterForm(forms.ModelForm):
mother_number = forms.RegexField(regex=re.compile(r'^[0-9+]{10,15}$'), required=False,
error_messages={'invalid': "Enter a valid phone number"}, label="Phone No")
father_number = forms.RegexField(regex=re.compile(r'^[0-9+]{10,15}$'), required=False,
error_messages={'invalid': "Enter a valid phone number"}, label="Phone No")
admission_date = forms.DateField(
widget=forms.TextInput(
attrs={'type': 'date', 'title': 'Date of birth'}
)
)
date_of_birth = forms.DateField(
widget=forms.TextInput(
attrs={'type': 'date', 'title': 'Date of birth'}
)
)
email = forms.EmailField(max_length=200)
class Meta:
model = User
fields = ('email', 'first_name', 'last_name', 'date_of_birth', 'student_class', 'gender',
'academic_year', 'image', 'admission_date', 'mother_name', 'mother_number', 'father_name',
'father_number', 'address', 'religion',)
labels = {
'first_name': "First name(s)",
}
class UpdateTeacherRegisterForm(forms.ModelForm):
email = forms.EmailField(max_length=200, help_text='Required')
phone_no = forms.RegexField(regex=re.compile(r'^[0-9+]{10,15}$'), required=False,
error_messages={'invalid': "Enter a valid phone number"}, label="Phone No")
date_employed = forms.DateField(
widget=forms.TextInput(
attrs={'type': 'date', 'title': 'Date employed'}
)
)
class Meta:
model = User
fields = ('email', 'first_name', 'last_name', 'religion', 'subject',
'salary', 'image', 'address', 'date_employed',
'phone_no', 'gender', 'classes')
labels = {
"subject": "Subject(s)"
}
class SettingsForm(forms.ModelForm):
class Meta:
model = School
fields = ["name", "student_can_send_message", "teacher_can_send_student_message",
"student_can_view_student_details", "teacher_can_view_student_details"]
labels = {
"name": "School Name",
"student_can_view_student_details": "Students can view other students details",
"student_can_send_message": "Students can send other students messages",
"teacher_can_view_student_details": "Teacher can view student details",
"teacher_can_send_student_message": "Teacher can send students messages"
}
| 42.056497 | 144 | 0.621709 | 839 | 7,444 | 5.381406 | 0.152563 | 0.029236 | 0.039867 | 0.043854 | 0.890587 | 0.872647 | 0.872647 | 0.868882 | 0.868882 | 0.835659 | 0 | 0.015851 | 0.237238 | 7,444 | 176 | 145 | 42.295455 | 0.779324 | 0 | 0 | 0.6875 | 0 | 0 | 0.316765 | 0.041107 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0.145833 | 0.020833 | 0 | 0.368056 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
f1f00629c973f60b4326a1e9a3a0741543ce1e9f | 96 | py | Python | python/seldon_core/__init__.py | jklaise/seldon-core | 99ff974f83c81358b27432ee2d5b6f939034e201 | [
"Apache-2.0"
] | null | null | null | python/seldon_core/__init__.py | jklaise/seldon-core | 99ff974f83c81358b27432ee2d5b6f939034e201 | [
"Apache-2.0"
] | null | null | null | python/seldon_core/__init__.py | jklaise/seldon-core | 99ff974f83c81358b27432ee2d5b6f939034e201 | [
"Apache-2.0"
] | null | null | null | from seldon_core.version import __version__
#from seldon_core.seldon_client import SeldonClient
| 32 | 51 | 0.885417 | 13 | 96 | 6 | 0.538462 | 0.25641 | 0.358974 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 96 | 2 | 52 | 48 | 0.886364 | 0.520833 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
9e7a9e690e14413ba97cd1cb8489fbc2029a6257 | 5,700 | py | Python | computational-linguistics/ass-2/viterbi.py | sangeet2020/ws-20-21 | 316d2c1495cd540ce390eced52d41c57efed1f64 | [
"Apache-2.0"
] | null | null | null | computational-linguistics/ass-2/viterbi.py | sangeet2020/ws-20-21 | 316d2c1495cd540ce390eced52d41c57efed1f64 | [
"Apache-2.0"
] | 1 | 2020-12-03T00:05:20.000Z | 2020-12-03T00:05:20.000Z | computational-linguistics/ass-2/viterbi.py | sangeet2020/WS-20-21 | 316d2c1495cd540ce390eced52d41c57efed1f64 | [
"Apache-2.0"
] | 2 | 2021-01-05T06:55:52.000Z | 2021-01-17T00:19:30.000Z | #!/usr/bin/python3
# -*- coding: utf-8 -*-
# author : Sangeet Sagar
# e-mail : sasa00001@stud.uni-saarland.de
# Organization: Universität des Saarlandes
import numpy as np
def viterbi(O,S,Y, pi, A, B):
"""Generates a path which is a sequence of most likely states that generates the given observation Y.
Args:
O (numpy.ndarray): observation space. Size: 1 X N
S (numpy.ndarray): state space. Size: 1 X K
Y (list): observation sequence. Size: 1 X T
        pi (numpy.ndarray): initial probabilities. Size: 1 X K
A (numpy.ndarray): transition matrix. Size: K X K
B (numpy.ndarray): emission matrix Size: N X K
Returns:
list: list of most likely sequence of POS tags
"""
# Reference: https://en.wikipedia.org/wiki/Viterbi_algorithm#Pseudocode
#**************************************************************************
## Example data for trial
# input
# O = np.arange(1,7) # observation space # uniq words # Size = 1 X N
# S = np.asarray([0, 1, 2]) # State space # uniq POS tags # Size = 1 X K
    # Y = np.array([0, 2, 0, 2, 2, 1]).astype(np.int32) # Observation sequence T
    # # Size = 1 X T
    # pi = np.array([0.6, 0.2, 0.2]) # Initial probability # Size = 1 X K
# A = np.array([[0.8, 0.1, 0.1],
# [0.2, 0.7, 0.1],
# [0.1, 0.3, 0.6]]) # transition matrix # Size = K X K
# B = np.array([[0.7, 0.0, 0.3],
# [0.1, 0.9, 0.0],
# [0.0, 0.2, 0.8]]) # emission matrix # Size = K X N
# print("O",O)
# print("S",S)
# print("pi",pi)
# print("Y",Y)
# print("A",A,'\n')
# print("B",B)
# output
# X = [0, 0, 0, 2, 2, 1] # Most likely path/sequence
#**************************************************************************
N = len(O)
K = len(S)
T = len(Y)
T1 = np.zeros(shape=(K,T))
T2 = np.zeros(shape=(K,T))
for i in range(K):
T1[i,0] = pi[i] * B[i, Y[0]]
T2[i,0] = 0
for j in range(1, T):
for i in range(K):
if Y[j] == -1:
                # Unknown word handling. Set B[i, Y[j]] = 1 for all tags if Y[j] == -1
# aka word not found in train set.
next_prob = T1[:,j-1] * A[:, i] * 1
else:
next_prob = T1[:,j-1] * A[:, i] * B[i, Y[j]]
T1[i,j] = np.max(next_prob)
T2[i,j] = np.argmax(next_prob)
Z = [None] * T
X = [None] * T
# Backpointer
Z[T-1] = np.argmax(T1[:,T-1])
X[T-1] = S[Z[T-1]]
for j in reversed(range(1, T)):
Z[j-1] = T2[int(Z[j]),j]
X[j-1] = S[int(Z[j-1])]
return X # Most likely tags
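# A small, self-contained sketch that exercises viterbi() on the example data already
# listed in the docstring comments above (expected path: [0, 0, 0, 2, 2, 1]). It is
# wrapped in a helper so importing this module stays side-effect free.
def _toy_viterbi_example():
    O = np.arange(1, 7)                                  # observation space (unique words)
    S = np.asarray([0, 1, 2])                            # state space (unique POS tags)
    Y = np.array([0, 2, 0, 2, 2, 1]).astype(np.int32)    # observation sequence
    pi = np.array([0.6, 0.2, 0.2])                       # initial probabilities
    A = np.array([[0.8, 0.1, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.3, 0.6]])                      # transition matrix
    B = np.array([[0.7, 0.0, 0.3],
                  [0.1, 0.9, 0.0],
                  [0.0, 0.2, 0.8]])                      # emission matrix
    return viterbi(O, S, Y, pi, A, B)                    # expected: [0, 0, 0, 2, 2, 1]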
def viterbi_log(O,S,Y, pi, A, B):
"""Generates a path which is a sequence of most likely states that generates the given observation Y.
Args:
O (numpy.ndarray): observation space. Size: 1 X N
S (numpy.ndarray): state space. Size: 1 X K
Y (list): observation sequence. Size: 1 X T
        pi (numpy.ndarray): initial probabilities. Size: 1 X K
A (numpy.ndarray): transition matrix. Size: K X K
B (numpy.ndarray): emission matrix Size: N X K
Returns:
list: list of most likely sequence of POS tags
"""
# Reference: https://en.wikipedia.org/wiki/Viterbi_algorithm#Pseudocode
#**************************************************************************
## Example data for trial
# input
# O = np.arange(1,7) # observation space # uniq words # Size = 1 X N
# S = np.asarray([0, 1, 2]) # State space # uniq POS tags # Size = 1 X K
    # Y = np.array([0, 2, 0, 2, 2, 1]).astype(np.int32) # Observation sequence T
    # # Size = 1 X T
    # pi = np.array([0.6, 0.2, 0.2]) # Initial probability # Size = 1 X K
# A = np.array([[0.8, 0.1, 0.1],
# [0.2, 0.7, 0.1],
# [0.1, 0.3, 0.6]]) # transition matrix # Size = K X K
# B = np.array([[0.7, 0.0, 0.3],
# [0.1, 0.9, 0.0],
# [0.0, 0.2, 0.8]]) # emission matrix # Size = K X N
# print("O",O)
# print("S",S)
# print("pi",pi)
# print("Y",Y)
# print("A",A,'\n')
# print("B",B)
# output
# X = [0, 0, 0, 2, 2, 1] # Most likely path/sequence
#**************************************************************************
tiny = np.finfo(0.).tiny # limits for floating point types.
N = len(O)
K = len(S)
T = len(Y)
# pi = np.log(pi + tiny)
# A = np.log(A + tiny)
# B = np.log(B + tiny)
T1 = np.zeros(shape=(K,T))
T2 = np.zeros(shape=(K,T))
for i in range(K):
        # Initialize in log space so the first column is consistent with the
        # log-domain recursion below; `tiny` guards against log(0).
        T1[i,0] = np.log(pi[i] * B[i, Y[0]] + tiny)
        T2[i,0] = 0
for j in range(1, T):
for i in range(K):
if Y[j] == -1:
                # Unknown word handling. Set B[i, Y[j]] = 1 for all tags if Y[j] == -1
# aka word not found in train set.
next_prob = T1[:,j-1] + np.log(A[:, i])
else:
next_prob = T1[:,j-1] + np.log(A[:, i]) + np.log(B[i, Y[j]])
T1[i,j] = np.max(next_prob)
T2[i,j] = np.argmax(next_prob)
Z = [None] * T
X = [None] * T
# Backpointer
Z[T-1] = np.argmax(T1[:,T-1])
X[T-1] = S[Z[T-1]]
for j in reversed(range(1, T)):
Z[j-1] = T2[int(Z[j]),j]
X[j-1] = S[int(Z[j-1])]
return X # Most likely tags | 32.571429 | 105 | 0.439825 | 877 | 5,700 | 2.846066 | 0.150513 | 0.014423 | 0.038462 | 0.022436 | 0.907853 | 0.907853 | 0.904647 | 0.898237 | 0.898237 | 0.875801 | 0 | 0.054776 | 0.349825 | 5,700 | 175 | 106 | 32.571429 | 0.618726 | 0.595088 | 0 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0 | 0.019231 | 0 | 0.096154 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9e80b355e1fc068981134e03c0e5fe3a3e1c11dc | 2,395 | py | Python | crypto_compare/apis/histo.py | konsh/crypto_compare | 48511ca22c217a30a7aa3945550853bd3e91a0c7 | [
"MIT"
] | 28 | 2017-08-30T18:05:20.000Z | 2022-03-31T10:28:38.000Z | crypto_compare/apis/histo.py | konsh/crypto_compare | 48511ca22c217a30a7aa3945550853bd3e91a0c7 | [
"MIT"
] | 2 | 2017-12-25T22:11:07.000Z | 2018-11-24T08:19:07.000Z | crypto_compare/apis/histo.py | konsh/crypto_compare | 48511ca22c217a30a7aa3945550853bd3e91a0c7 | [
"MIT"
] | 9 | 2017-11-15T19:01:54.000Z | 2021-06-19T11:04:27.000Z |
def histo_day(self, **kwargs):
"""
https://min-api.cryptocompare.com/
Keyword arguments:
inside kwargs
fsym - From Symbol
tsym - To Symbol
extraParams - Name of your application
sign - If set to true, the server will sign the requests.
limit - default 30, max 2000, min 1
tryConversion - If set to false, it will try to get values without using any conversion at all
e - market list
aggregate - default 1, max 30, min 1
allData - default false
toTs - timestamp
"""
fsym, tsym, querystring = self._get_querystring(kwargs)
self._is_params_valid(fsym=fsym, tsym=tsym)
return self._fetch_data(self.HISTO_DAY_URL+querystring)
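# A hedged usage sketch (comment only): assuming these methods are exposed on the
# package's client object, a daily BTC/USD history limited to 60 candles could be
# requested roughly as shown; fsym and tsym are required (enforced by
# _is_params_valid), the remaining keyword arguments are optional query parameters.
#
#   data = client.histo_day(fsym='BTC', tsym='USD', limit=60, aggregate=1)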
def histo_hour(self, **kwargs):
"""
https://min-api.cryptocompare.com/
Keyword arguments:
inside kwargs
fsym - From Symbol
tsym - To Symbol
extraParams - Name of your application
sign - If set to true, the server will sign the requests.
limit - default 168, max 2000, min 1
tryConversion - If set to false, it will try to get values without using any conversion at all
e - market list
aggregate - default 1, max 30, min 1
allData - default false
toTs - timestamp
"""
fsym, tsym, querystring = self._get_querystring(kwargs)
self._is_params_valid(fsym=fsym, tsym=tsym)
return self._fetch_data(self.HISTO_HOUR_URL+querystring)
def histo_minute(self, **kwargs):
"""
https://min-api.cryptocompare.com/
Keyword arguments:
inside kwargs
fsym - From Symbol
tsym - To Symbol
extraParams - Name of your application
sign - If set to true, the server will sign the requests.
limit - default 1440, max 2000, min 1
tryConversion - If set to false, it will try to get values without using any conversion at all
e - market list
aggregate - default 1, max 30, min 1
allData - default false
toTs - timestamp
"""
fsym, tsym, querystring = self._get_querystring(kwargs)
self._is_params_valid(fsym=fsym, tsym=tsym)
return self._fetch_data(self.HISTO_MINUTE_URL+querystring) | 27.528736 | 100 | 0.602088 | 297 | 2,395 | 4.754209 | 0.228956 | 0.021246 | 0.029745 | 0.038244 | 0.92847 | 0.92847 | 0.92847 | 0.92847 | 0.92847 | 0.92847 | 0 | 0.022444 | 0.330271 | 2,395 | 87 | 101 | 27.528736 | 0.857855 | 0.648434 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7ba62203e056098f948b70521e44d6badae27952 | 123 | py | Python | scripts/field/Curbrock_Scene1.py | G00dBye/YYMS | 1de816fc842b6598d5b4b7896b6ab0ee8f7cdcfb | [
"MIT"
] | 54 | 2019-04-16T23:24:48.000Z | 2021-12-18T11:41:50.000Z | scripts/field/Curbrock_Scene1.py | G00dBye/YYMS | 1de816fc842b6598d5b4b7896b6ab0ee8f7cdcfb | [
"MIT"
] | 3 | 2019-05-19T15:19:41.000Z | 2020-04-27T16:29:16.000Z | scripts/field/Curbrock_Scene1.py | G00dBye/YYMS | 1de816fc842b6598d5b4b7896b6ab0ee8f7cdcfb | [
"MIT"
] | 49 | 2020-11-25T23:29:16.000Z | 2022-03-26T16:20:24.000Z | # Curbrock Scene 2
sm.showFieldEffect("Map/Effect.img/Curbrock1/frame")
sm.showFieldEffect("Map/Effect.img/Curbrock1/002") | 30.75 | 52 | 0.796748 | 17 | 123 | 5.764706 | 0.647059 | 0.346939 | 0.408163 | 0.530612 | 0.77551 | 0.77551 | 0 | 0 | 0 | 0 | 0 | 0.051282 | 0.04878 | 123 | 4 | 53 | 30.75 | 0.786325 | 0.130081 | 0 | 0 | 0 | 0 | 0.54717 | 0.54717 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c87069ae7c6b2d45081e34a87feeac0921f3bca0 | 27,492 | py | Python | src/genie/libs/parser/nxos/tests/ShowIpv6MrouteVrfAll/cli/equal/golden_output_1_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 204 | 2018-06-27T00:55:27.000Z | 2022-03-06T21:12:18.000Z | src/genie/libs/parser/nxos/tests/ShowIpv6MrouteVrfAll/cli/equal/golden_output_1_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 468 | 2018-06-19T00:33:18.000Z | 2022-03-31T23:23:35.000Z | src/genie/libs/parser/nxos/tests/ShowIpv6MrouteVrfAll/cli/equal/golden_output_1_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 309 | 2019-01-16T20:21:07.000Z | 2022-03-30T12:56:41.000Z |
expected_output = {
'vrf':
{'VRF':
{'address_family':
{'ipv6': {}}},
'VRF1':
{'address_family':
{'ipv6':
{'multicast_group':
{'ff1e:1111::1:0/128':
{'source_address':
{'*':
{'flags': 'ipv6 ' 'mld '
'pim6', 'incoming_interface_list': {'loopback10': {'rpf_nbr': '2001:db8:4401:9999::1'}},
'oil_count': '3',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:02:58'},
'Ethernet1/33.11': {'oil_flags': 'mld',
'oil_uptime': '00:04:03'},
'port-channel1001': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'}},
'uptime': '00:04:03'},
'2001::222:1:1:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.10': {'rpf_nbr': '2001::222:1:1:1234',
'internal': True}},
'oil_count': '3',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:02:58'},
'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'},
'port-channel1001': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'}},
'uptime': '00:04:03'},
'2001::222:1:2:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.11': {'rpf_nbr': '2001::222:1:2:1234',
'internal': True}},
'oil_count': '3',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:02:58'},
'Ethernet1/33.11': {'oif_rpf': '(RPF)',
'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'},
'port-channel1001': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'}},
'uptime': '00:04:03'},
'2001::222:2:3:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10',
'internal': True}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:2:44:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10',
'internal': True}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'}}},
'ff1e:1111:ffff::/128': {'source_address': {'*': {'flags': 'ipv6 '
'mld '
'pim6',
'incoming_interface_list': {'Ethernet1/33.10': {'rpf_nbr': '2001::222:1:1:1'}},
'oil_count': '2',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'},
'Ethernet1/33.11': {'oil_flags': 'mld',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:1:1:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.10': {'rpf_nbr': '2001::222:1:1:1234',
'internal': True}},
'oil_count': '3',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:02:58'},
'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'},
'port-channel1001': {'oil_flags': 'pim6',
'oil_uptime': '00:04:00'}},
'uptime': '00:04:03'},
'2001::222:1:2:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.11': {'rpf_nbr': '2001::222:1:2:1234',
'internal': True}},
'oil_count': '2',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'},
'Ethernet1/33.11': {'oif_rpf': '(RPF)',
'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:2:3:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10',
'internal': True}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:2:44:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10',
'internal': True}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'}}},
'ff1e:2222:ffff::/128': {'source_address': {'*': {'flags': 'ipv6 '
'mld '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10'}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'mld',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:1:1:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.10': {'rpf_nbr': '2001::222:1:1:1234'}},
'oil_count': '2',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'},
'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:1:2:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.11': {'rpf_nbr': '2001::222:1:2:1234'}},
'oil_count': '2',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'},
'Ethernet1/33.11': {'oif_rpf': '(RPF)',
'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:2:3:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10'}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:02'}},
'uptime': '00:04:02'},
'2001::222:2:44:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10'}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:02'}},
'uptime': '00:04:02'}}},
'ff1e:2222:ffff::1:0/128': {'source_address': {'*': {'flags': 'ipv6 '
'mld '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10'}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'mld',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:1:1:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.10': {'rpf_nbr': '2001::222:1:1:1234'}},
'oil_count': '3',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:02:58'},
'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'},
'port-channel1001': {'oil_flags': 'pim6',
'oil_uptime': '00:04:02'}},
'uptime': '00:04:03'},
'2001::222:1:2:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.11': {'rpf_nbr': '2001::222:1:2:1234'}},
'oil_count': '2',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:04:02'},
'Ethernet1/33.11': {'oif_rpf': '(RPF)',
'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'}}},
'ff1e:3333::1:0/128': {'source_address': {'*': {'flags': 'ipv6 ' 'mld '
'pim6', 'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10'}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'mld',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:1:1:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.10': {'rpf_nbr': '2001::222:1:1:1234'}},
'oil_count': '2',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'},
'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:1:2:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.11': {'rpf_nbr': '2001::222:1:2:1234'}},
'oil_count': '3',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:02:58'},
'Ethernet1/33.11': {'oif_rpf': '(RPF)',
'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'},
'port-channel1001': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'}},
'uptime': '00:04:03'}}},
'ff1e:3333:ffff::/128': {'source_address': {'*': {'flags': 'ipv6 '
'mld '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10'}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'mld',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:1:1:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.10': {'rpf_nbr': '2001::222:1:1:1234'}},
'oil_count': '3',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:02:58'},
'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'},
'port-channel1001': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'}},
'uptime': '00:04:03'},
'2001::222:1:2:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/33.11': {'rpf_nbr': '2001::222:1:2:1234'}},
'oil_count': '2',
'outgoing_interface_list': {'Ethernet1/26': {'oil_flags': 'pim6',
'oil_uptime': '00:04:01'},
'Ethernet1/33.11': {'oif_rpf': '(RPF)',
'oil_flags': 'm6rib',
'oil_uptime': '00:04:03'}},
'uptime': '00:04:03'},
'2001::222:2:3:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10'}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:01'}},
'uptime': '00:04:01'},
'2001::222:2:44:1234/128': {'flags': 'ipv6 '
'm6rib '
'pim6',
'incoming_interface_list': {'Ethernet1/26': {'rpf_nbr': 'fe80::10'}},
'oil_count': '1',
'outgoing_interface_list': {'Ethernet1/33.11': {'oil_flags': 'm6rib',
'oil_uptime': '00:04:00'}},
'uptime': '00:04:00'}}},
'ff30::/12': {'source_address': {'*': {'flags': 'ipv6 '
'pim6',
'incoming_interface_list': {'Null': {'rpf_nbr': '0::'}},
'oil_count': '0',
'uptime': '19:55:47'}}}}}}},
'default': {'address_family': {'ipv6': {'multicast_group': {'ff30::/12': {'source_address': {'*': {'flags': 'ipv6 '
'pim6',
'incoming_interface_list': {'Null': {'rpf_nbr': '0::'}},
'oil_count': '0',
'uptime': '00:11:23'}}}}}}}}}
| 95.790941 | 241 | 0.214353 | 1,403 | 27,492 | 4.002138 | 0.047042 | 0.105432 | 0.117542 | 0.094034 | 0.977738 | 0.966696 | 0.964559 | 0.964559 | 0.959216 | 0.959216 | 0 | 0.17311 | 0.681034 | 27,492 | 286 | 242 | 96.125874 | 0.467214 | 0 | 0 | 0.900709 | 0 | 0 | 0.23413 | 0.062934 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
c88e3142a9e4f2be19912b87235613e208e03376 | 107 | py | Python | chapter10/test_digits_ann.py | ankona/Learning-OpenCV-4-Computer-Vision-with-Python-Third-Edition | caa9326e310253fba1aab624b46ea899ce16a21f | [
"BSD-3-Clause"
] | 286 | 2019-06-29T11:47:40.000Z | 2022-03-29T08:41:28.000Z | chapter10/test_digits_ann.py | chihhao428/Learning-OpenCV-4-Computer-Vision-with-Python-Third-Edition | ee29cfefb4f21ba5acf6222aa69ef1c05c8fc05d | [
"BSD-3-Clause"
] | 8 | 2020-10-01T17:48:04.000Z | 2022-03-26T04:27:06.000Z | chapter10/test_digits_ann.py | chihhao428/Learning-OpenCV-4-Computer-Vision-with-Python-Third-Edition | ee29cfefb4f21ba5acf6222aa69ef1c05c8fc05d | [
"BSD-3-Clause"
] | 153 | 2019-07-01T02:53:02.000Z | 2022-03-28T08:43:44.000Z | from digits_ann import create_ann, train, test
ann, test_data = train(create_ann())
test(ann, test_data)
| 17.833333 | 46 | 0.766355 | 18 | 107 | 4.277778 | 0.444444 | 0.272727 | 0.285714 | 0.38961 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130841 | 107 | 5 | 47 | 21.4 | 0.827957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
c8b039190ebba47122dd9b992d5756b647bfb47c | 966 | py | Python | number1.py | aimejayd/python | b10fa2c3274e9a68b32f7f0a162db5ad0f52e2f1 | [
"MIT"
] | null | null | null | number1.py | aimejayd/python | b10fa2c3274e9a68b32f7f0a162db5ad0f52e2f1 | [
"MIT"
] | null | null | null | number1.py | aimejayd/python | b10fa2c3274e9a68b32f7f0a162db5ad0f52e2f1 | [
"MIT"
] | null | null | null | #Names: Manishimwe Jean De Dieu
#The program below is code in order to print out the first letters of my names in big letters
print ("MMM MMM JJJJJJJJJJJJJJJJJJJ")
print ("M M M M J ")
print ("M M M M J ")
print ("M M M M J ")
print ("M M M M J ")
print ("M M M M J ")
print ("M M M M J ")
print ("M M J ")
print ("M M J ")
print ("M M J ")
print ("M M J ")
print ("M M J ")
print ("M M JJJJJJJJJJJJJJ ")
| 43.909091 | 92 | 0.258799 | 87 | 966 | 2.873563 | 0.287356 | 0.192 | 0.336 | 0.352 | 0.428 | 0.428 | 0.428 | 0.428 | 0.428 | 0.428 | 0 | 0 | 0.684265 | 966 | 21 | 93 | 46 | 0.819672 | 0.125259 | 0 | 0.846154 | 0 | 0 | 0.82283 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
a8085a2020f37ae02547e333f43ca23632515194 | 8,028 | py | Python | hybrid/solvers.py | nodonoughue/emitter-detection-python | ebff19acebcc1edfd941280e05f8ddf2ff20c974 | [
"MIT"
] | null | null | null | hybrid/solvers.py | nodonoughue/emitter-detection-python | ebff19acebcc1edfd941280e05f8ddf2ff20c974 | [
"MIT"
] | null | null | null | hybrid/solvers.py | nodonoughue/emitter-detection-python | ebff19acebcc1edfd941280e05f8ddf2ff20c974 | [
"MIT"
] | null | null | null | import utils
from utils import solvers
from . import model
def max_likelihood(x_aoa, x_tdoa, x_fdoa, v_fdoa, zeta, cov, x_ctr, search_size, epsilon=None, tdoa_ref_idx=None, fdoa_ref_idx=None):
"""
Construct the ML Estimate by systematically evaluating the log
likelihood function at a series of coordinates, and returning the index
of the maximum. Optionally returns the full set of evaluated
coordinates, as well.
:param x_aoa: AOA sensor positions [m]
:param x_tdoa: TDOA sensor positions [m]
:param x_fdoa: FDOA sensor positions [m]
:param v_fdoa: FDOA sensor velocities [m/s]
:param zeta: Combined measurement vector
:param cov: Measurement error covariance matrix
:param x_ctr: Center of search grid [m]
:param search_size: 2-D vector of search grid sizes [m]
:param epsilon: Desired resolution of search grid [m]
:param tdoa_ref_idx: Scalar index of reference sensor, or nDim x nPair matrix of sensor pairings for TDOA
:param fdoa_ref_idx: Scalar index of reference sensor, or nDim x nPair matrix of sensor pairings for FDOA
:return x_est: Estimated source position [m]
:return likelihood: Likelihood computed across the entire set of candidate source positions
:return x_grid: Candidate source positions
"""
# Set up function handle
def ell(x):
return model.log_likelihood(x_aoa, x_tdoa, x_fdoa, v_fdoa, zeta, cov, x, tdoa_ref_idx, fdoa_ref_idx)
# Call the util function
x_est, likelihood, x_grid = solvers.ml_solver(ell, x_ctr, search_size, epsilon)
return x_est, likelihood, x_grid
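# A hedged calling sketch (comment only). The sensor geometry below is an illustrative
# placeholder, not data from this project; arrays follow the nDim x nSensor convention
# used by the parameter descriptions above, and zeta/cov stand for the measured hybrid
# vector and its covariance.
#
#   import numpy as np
#   x_aoa = np.array([[0.], [0.]])                 # one 2-D AOA sensor at the origin
#   x_tdoa = np.array([[0., 1e3], [0., 0.]])       # two TDOA sensors [m]
#   x_fdoa = np.array([[0., 1e3], [1e3, 1e3]])     # two FDOA sensors [m]
#   v_fdoa = np.array([[100., 100.], [0., 0.]])    # FDOA sensor velocities [m/s]
#   x_est, likelihood, x_grid = max_likelihood(x_aoa, x_tdoa, x_fdoa, v_fdoa,
#                                              zeta, cov,
#                                              x_ctr=np.array([5e3, 5e3]),
#                                              search_size=np.array([10e3, 10e3]),
#                                              epsilon=100)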
def gradient_descent(x_aoa, x_tdoa, x_fdoa, v_fdoa, zeta, cov, x_init, alpha, beta, epsilon=None,
max_num_iterations=None, force_full_calc=False, plot_progress=False, tdoa_ref_idx=None,
fdoa_ref_idx=None):
"""
    Computes the gradient descent solution for hybrid AOA/TDOA/FDOA processing.
Ported from MATLAB code.
Nicholas O'Donoughue
21 February 2021
:param x_aoa: AOA sensor positions [m]
:param x_tdoa: TDOA sensor positions [m]
:param x_fdoa: FDOA sensor positions [m]
:param v_fdoa: FDOA sensor velocities [m/s]
:param zeta: Combined measurement vector
:param cov: FDOA error covariance matrix
:param x_init: Initial estimate of source position [m]
:param alpha: Backtracking line search parameter
:param beta: Backtracking line search parameter
:param epsilon: Desired position error tolerance (stopping condition)
:param max_num_iterations: Maximum number of iterations to perform
:param force_full_calc: Boolean flag to force all iterations (up to max_num_iterations) to be computed, regardless
of convergence (DEFAULT = False)
    :param plot_progress: Boolean flag dictating whether to plot intermediate solutions as they are derived
(DEFAULT = False).
:param tdoa_ref_idx: Scalar index of reference sensor, or nDim x nPair matrix of sensor pairings, for TDOA
:param fdoa_ref_idx: Scalar index of reference sensor, or nDim x nPair matrix of sensor pairings, for FDOA
:return x: Estimated source position
:return x_full: Iteration-by-iteration estimated source positions
"""
# Initialize measurement error and jacobian functions
def y(this_x):
return zeta - model.measurement(x_aoa, x_tdoa, x_fdoa, v_fdoa, this_x, tdoa_ref_idx, fdoa_ref_idx)
def jacobian(this_x):
return model.jacobian(x_aoa, x_tdoa, x_fdoa, v_fdoa, this_x, tdoa_ref_idx, fdoa_ref_idx)
# Call generic Gradient Descent solver
x, x_full = solvers.gd_solver(y, jacobian, cov, x_init, alpha, beta, epsilon, max_num_iterations, force_full_calc,
plot_progress)
return x, x_full
def least_square(x_aoa, x_tdoa, x_fdoa, v_fdoa, zeta, cov, x_init, epsilon=None, max_num_iterations=None,
force_full_calc=False, plot_progress=False, tdoa_ref_idx=None, fdoa_ref_idx=None):
"""
    Computes the least squares solution for hybrid AOA/TDOA/FDOA processing.
Ported from MATLAB Code
Nicholas O'Donoughue
21 February 2021
:param x_aoa: AOA sensor positions [m]
:param x_tdoa: TDOA sensor positions [m]
:param x_fdoa: FDOA sensor positions [m]
:param v_fdoa: FDOA sensor velocities [m/s]
:param zeta: Combined measurement vector
    :param cov: Combined measurement error covariance matrix
:param x_init: Initial estimate of source position [m]
:param epsilon: Desired estimate resolution [m]
:param max_num_iterations: Maximum number of iterations to perform
:param force_full_calc: Boolean flag to force all iterations (up to max_num_iterations) to be computed, regardless
of convergence (DEFAULT = False)
:param plot_progress: Boolean flag dictating whether to plot intermediate solutions as they are derived
(DEFAULT = False).
:param tdoa_ref_idx: Scalar index of reference sensor, or nDim x nPair matrix of sensor pairings for TDOA
:param fdoa_ref_idx: Scalar index of reference sensor, or nDim x nPair matrix of sensor pairings for FDOA
:return x: Estimated source position
:return x_full: Iteration-by-iteration estimated source positions
"""
# Initialize measurement error and Jacobian function handles
def y(this_x):
return zeta - model.measurement(x_aoa, x_tdoa, x_fdoa, v_fdoa, this_x, tdoa_ref_idx, fdoa_ref_idx)
def jacobian(this_x):
return model.jacobian(x_aoa, x_tdoa, x_fdoa, v_fdoa, this_x, tdoa_ref_idx, fdoa_ref_idx)
# Call the generic Least Square solver
x, x_full = solvers.ls_solver(y, jacobian, cov, x_init, epsilon, max_num_iterations, force_full_calc, plot_progress)
return x, x_full
def bestfix(x_aoa, x_tdoa, x_fdoa, v_fdoa, zeta, cov, x_ctr, search_size, epsilon, tdoa_ref_idx=None, fdoa_ref_idx=None,
pdftype=None):
"""
Construct the BestFix estimate by systematically evaluating the PDF at
a series of coordinates, and returning the index of the maximum.
Optionally returns the full set of evaluated coordinates, as well.
Assumes a multi-variate Gaussian distribution with covariance matrix C,
and unbiased estimates at each sensor. Note that the BestFix algorithm
implicitly assumes each measurement is independent, so any cross-terms in
the covariance matrix C are ignored.
Ref:
Eric Hodson, "Method and arrangement for probabilistic determination of
a target location," U.S. Patent US5045860A, 1990, https://patents.google.com/patent/US5045860A
Ported from MATLAB Code
Nicholas O'Donoughue
21 February 2021
:param x_aoa: AOA sensor positions [m]
:param x_tdoa: TDOA sensor positions [m]
:param x_fdoa: FDOA sensor positions [m]
:param v_fdoa: FDOA sensor velocities [m/s]
:param zeta: Combined measurement vector
:param cov: Measurement error covariance matrix
:param x_ctr: Center of search grid [m]
:param search_size: 2-D vector of search grid sizes [m]
:param epsilon: Desired resolution of search grid [m]
:param tdoa_ref_idx: Scalar index of reference sensor, or nDim x nPair matrix of sensor pairings for TDOA
:param fdoa_ref_idx: Scalar index of reference sensor, or nDim x nPair matrix of sensor pairings for FDOA
:param pdftype: String indicating the type of distribution to use. See +utils/makePDFs.m for options.
:return x_est: Estimated source position [m]
:return likelihood: Likelihood computed across the entire set of candidate source positions
:return x_grid: Candidate source positions
"""
# Generate the PDF
def msmt(this_x):
return model.measurement(x_aoa, x_tdoa, x_fdoa, v_fdoa, this_x, tdoa_ref_idx, fdoa_ref_idx)
pdfs = utils.make_pdfs(msmt, zeta, pdftype, cov)
# Call the util function
x_est, likelihood, x_grid = solvers.bestfix(pdfs, x_ctr, search_size, epsilon)
return x_est, likelihood, x_grid
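# A hedged calling sketch (comment only), mirroring max_likelihood() above but scoring
# candidate positions with the measurement PDFs built by utils.make_pdfs instead of the
# log-likelihood. The pdftype string is passed straight through to make_pdfs; the value
# shown is purely illustrative (see +utils/makePDFs.m for the supported options).
#
#   x_est, likelihood, x_grid = bestfix(x_aoa, x_tdoa, x_fdoa, v_fdoa, zeta, cov,
#                                       x_ctr, search_size, epsilon, pdftype='mvn')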
| 45.874286 | 133 | 0.723219 | 1,193 | 8,028 | 4.699078 | 0.168483 | 0.029968 | 0.024973 | 0.044952 | 0.832858 | 0.79629 | 0.784873 | 0.784873 | 0.774884 | 0.770781 | 0 | 0.006177 | 0.213503 | 8,028 | 174 | 134 | 46.137931 | 0.881691 | 0.66866 | 0 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.30303 | false | 0 | 0.090909 | 0.181818 | 0.69697 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 9 |
b5819cb4427ead702eb029a17757a7403a04edf1 | 9,670 | py | Python | tests/unittests/test_query_history.py | ZPascal/grafana_api_sdk | 97c347790200e8e9a2aafd47e322297aa97b964c | [
"Apache-2.0"
] | 2 | 2022-02-01T20:18:48.000Z | 2022-02-02T01:22:14.000Z | tests/unittests/test_query_history.py | ZPascal/grafana_api_sdk | 97c347790200e8e9a2aafd47e322297aa97b964c | [
"Apache-2.0"
] | 5 | 2022-01-12T06:55:54.000Z | 2022-03-26T13:35:50.000Z | tests/unittests/test_query_history.py | ZPascal/grafana_api_sdk | 97c347790200e8e9a2aafd47e322297aa97b964c | [
"Apache-2.0"
] | null | null | null | from unittest import TestCase
from unittest.mock import MagicMock, Mock, patch
from src.grafana_api.model import APIModel, QueryObject, QueryDatasourceObject
from src.grafana_api.query_history import QueryHistory
class QueryHistoryTestCase(TestCase):
@patch("src.grafana_api.api.Api.call_the_api")
def test_add_query_to_history(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
query_datasource: QueryDatasourceObject = QueryDatasourceObject("test", "test")
query: QueryObject = QueryObject("test", "test", "test", query_datasource)
mock: Mock = Mock()
mock.json = Mock(return_value=dict({"result": "test"}))
call_the_api_mock.return_value = mock
self.assertEqual(
dict({"result": "test"}),
query_history.add_query_to_history("test", [query]),
)
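    # The test above doubles as a usage sketch: outside of unit tests the same call
    # chain would look roughly like the (hypothetical) snippet below, with a real host
    # and token instead of mocks; see QueryObject/QueryDatasourceObject in
    # src.grafana_api.model for the meaning of their positional fields.
    #
    #   model = APIModel(host="https://grafana.example.com", token="<api token>")
    #   datasource = QueryDatasourceObject("<...>", "<...>")
    #   query = QueryObject("<...>", "<...>", "<...>", datasource)
    #   QueryHistory(grafana_api_model=model).add_query_to_history("<datasource uid>", [query])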
@patch("src.grafana_api.api.Api.call_the_api")
def test_add_query_to_history_no_datasource_uid(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(ValueError):
query_history.add_query_to_history("", [])
@patch("src.grafana_api.api.Api.call_the_api")
def test_add_query_to_history_no_result(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
query_datasource: QueryDatasourceObject = QueryDatasourceObject("test", "test")
query: QueryObject = QueryObject("test", "test", "test", query_datasource)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(Exception):
query_history.add_query_to_history("test", [query])
@patch("src.grafana_api.api.Api.call_the_api")
def test_search_query_history(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict({"result": "test"}))
call_the_api_mock.return_value = mock
self.assertEqual(
dict({"result": "test"}),
query_history.search_query_history(["test", "test"], "test"),
)
@patch("src.grafana_api.api.Api.call_the_api")
def test_search_query_history_no_datasource_uids(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(ValueError):
query_history.search_query_history([], "")
@patch("src.grafana_api.api.Api.call_the_api")
def test_search_query_history_no_result(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(Exception):
query_history.search_query_history(["test"], "test")
@patch("src.grafana_api.api.Api.call_the_api")
def test_delete_query_history(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict({"message": "Query deleted"}))
call_the_api_mock.return_value = mock
self.assertEqual(None, query_history.delete_query_history("test"))
@patch("src.grafana_api.api.Api.call_the_api")
def test_delete_query_history_no_datasource_uid(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(ValueError):
query_history.delete_query_history("")
@patch("src.grafana_api.api.Api.call_the_api")
def test_delete_query_history_no_result(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict({"message": "test"}))
call_the_api_mock.return_value = mock
with self.assertRaises(Exception):
query_history.delete_query_history("test")
@patch("src.grafana_api.api.Api.call_the_api")
def test_update_query_history(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict({"result": "test"}))
call_the_api_mock.return_value = mock
self.assertEqual(
dict({"result": "test"}), query_history.update_query_history("test", "test")
)
@patch("src.grafana_api.api.Api.call_the_api")
def test_update_query_history_no_uid(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(ValueError):
query_history.update_query_history("", "")
@patch("src.grafana_api.api.Api.call_the_api")
def test_update_query_history_no_result(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(Exception):
query_history.update_query_history("test", "test")
@patch("src.grafana_api.api.Api.call_the_api")
def test_star_query_history(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict({"result": "test"}))
call_the_api_mock.return_value = mock
self.assertEqual(
dict({"result": "test"}), query_history.star_query_history("test")
)
@patch("src.grafana_api.api.Api.call_the_api")
def test_star_query_history_no_uid(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(ValueError):
query_history.star_query_history("")
@patch("src.grafana_api.api.Api.call_the_api")
def test_star_query_history_no_result(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(Exception):
query_history.star_query_history("test")
@patch("src.grafana_api.api.Api.call_the_api")
def test_unstar_query_history(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict({"result": "test"}))
call_the_api_mock.return_value = mock
self.assertEqual(
dict({"result": "test"}), query_history.unstar_query_history("test")
)
@patch("src.grafana_api.api.Api.call_the_api")
def test_unstar_query_history_no_uid(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(ValueError):
query_history.unstar_query_history("")
@patch("src.grafana_api.api.Api.call_the_api")
def test_unstar_query_history_no_result(self, call_the_api_mock):
model: APIModel = APIModel(host=MagicMock(), token=MagicMock())
query_history: QueryHistory = QueryHistory(grafana_api_model=model)
mock: Mock = Mock()
mock.json = Mock(return_value=dict())
call_the_api_mock.return_value = mock
with self.assertRaises(Exception):
query_history.unstar_query_history("test")
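# A hedged usage sketch (added for illustration, not part of the original test suite):
# it mirrors the call shapes exercised by the tests above against a real Grafana
# instance. The host, token and positional string values are placeholders whose exact
# meaning should be checked against the library documentation.
if __name__ == "__main__":
    model = APIModel(host="https://grafana.example.com", token="example-token")
    query_history = QueryHistory(grafana_api_model=model)
    datasource = QueryDatasourceObject("example-uid", "example-type")
    query = QueryObject("example", "example", "example", datasource)
    print(query_history.add_query_to_history("example-uid", [query]))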
| 38.070866 | 88 | 0.686763 | 1,192 | 9,670 | 5.235738 | 0.041946 | 0.128826 | 0.086525 | 0.080756 | 0.960904 | 0.956417 | 0.950328 | 0.945842 | 0.945201 | 0.938151 | 0 | 0 | 0.199069 | 9,670 | 253 | 89 | 38.221344 | 0.80581 | 0 | 0 | 0.728814 | 0 | 0 | 0.091727 | 0.067011 | 0 | 0 | 0 | 0 | 0.101695 | 1 | 0.101695 | false | 0 | 0.022599 | 0 | 0.129944 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a9231218c85fc527a5caa68e7d92e545b8916e72 | 75,536 | py | Python | LeetCode127-WordLadder.py | hannah6392wang/LeetCode | 3868dd090491a666d23796625c23d7de92a528d9 | [
"MIT"
] | null | null | null | LeetCode127-WordLadder.py | hannah6392wang/LeetCode | 3868dd090491a666d23796625c23d7de92a528d9 | [
"MIT"
] | null | null | null | LeetCode127-WordLadder.py | hannah6392wang/LeetCode | 3868dd090491a666d23796625c23d7de92a528d9 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# coding: utf-8
# this code was exported from a Jupyter notebook
# In[1]:
import time
# In[2]:
class Node:
index = -1
via = -1
dist = 6000
next = None
def __str__(self) -> str:
return "{}{}{}{}{}{}".format("index=", self.index, ", via=", self.via, ", dist=", self.dist)
def __repr__(self):
return "{"+str(self)+"}"
def __init__(self, index, via, dist):
self.index = index
self.via = via
self.dist = dist
self.next = None
# In[3]:
class Node2:
word = ""
dist = 6000
next = None
def __str__(self) -> str:
return "{}{}{}{}".format("word=", self.word, ", dist=", self.dist)
def __repr__(self):
return "{"+str(self)+"}"
def __init__(self, word, dist):
self.word = word
self.dist = dist
self.next = None
# In[4]:
class LinkedList:
head = None
tail = None
length = 0
def __init__(self):
self.head = None
self.tail = None
self.length = 0
def __repr__(self):
node = self.head
nodes = []
while node is not None:
nodes.append(str(node))
node = node.next
nodes.append("None")
return " -> ".join(nodes)
def append(self, n: Node):
if(self.head == None):
self.head = n
self.tail = n
self.length = 1
else:
self.tail.next = n
self.tail = n
self.length += 1
def pop(self):
if(self.head == None):
return None
else:
n = self.head
self.head = self.head.next
self.length -= 1
return n
def contains(self, index) -> bool:
if(self.head == None):
return False
else:
curr = self.head
while(curr != None):
if(curr.index == index): return True
curr = curr.next
return False
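# Illustrative check (added here, not in the original notebook): the LinkedList above
# behaves as a FIFO queue via append/pop, which is how the search methods below consume
# nodes. The _demo_* name is used only for this example.
_demo_queue = LinkedList()
_demo_queue.append(Node(0, -1, 1))
_demo_queue.append(Node(1, 0, 2))
print(_demo_queue.pop().index, _demo_queue.length)  # expected output: 0 1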
# In[5]:
# use a Python built-in list to implement a list
# and implement methods so as to maintain it as a sorted list
class SortedList:
theList = []
def __init__(self):
self.theList = []
def insert(self, n: Node):
done = False
for i in range(len(self.theList)):
if(self.theList[i].dist >= n.dist):
self.theList.insert(i, n)
done = True
break
if(not done): self.theList.append(n)
def append(self, n: Node):
self.theList.append(n)
def length(self) -> int:
return len(self.theList)
def pop(self, i: int) -> Node:
return self.theList.pop(i)
def findIndex(self, index: int) -> int:
for i in range(len(self.theList)-1, -1, -1):
# improvement: find from the end to the beginning
if(self.theList[i].index == index): return i
return -1
def get(self, i:int) -> Node:
return self.theList[i]
def __str__(self) -> str:
return str(self.theList)
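# Illustrative check (added here, not in the original notebook): SortedList.insert keeps
# nodes ordered by ascending dist, while append simply places them at the end. The
# _demo_* name is used only for this example.
_demo_sorted = SortedList()
_demo_sorted.insert(Node(0, -1, 3))
_demo_sorted.insert(Node(1, -1, 1))
_demo_sorted.insert(Node(2, -1, 2))
print([n.dist for n in _demo_sorted.theList])  # expected output: [1, 2, 3]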
# In[6]:
class Solution:
def diff(self, word1: str, word2: str) -> int:
l = len(word1)
d = 0
for i in range(l):
if(word1[i] != word2[i]):
d += 1
if(d > 1):
break
return d
def dist(self, word1: str, word2: str) -> int:
l = len(word1)
d = 0
for i in range(l):
if(word1[i] != word2[i]):
d += 1
return d
def ladderLengthList(self, beginWord: str, endWord: str, wordList: []) -> int:
endWordExists = False
sortedList = SortedList()
        # dijkstra/bfs
        # initialize the first batch of nodes connected to beginWord
for i, word in enumerate(wordList, start=0):
if(self.diff(endWord, word) == 0):
endWordExists = True
if(self.diff(beginWord, word) == 1):
sortedList.append(Node(i, 0, 1))
if(not endWordExists): return 0
        if(sortedList.length() == 0): return 0
neighborDict = {}
processedDist = {}
count = len(wordList)
# find the distance(connection) between each pair
for i in range(count):
for j in range(i):
dist = self.diff(wordList[i], wordList[j])
if(dist == 1):
if(i not in neighborDict): neighborDict[i] = set()
neighborDict[i].add(j)
if(j not in neighborDict): neighborDict[j] = set()
neighborDict[j].add(i)
# work through the sorted list
while(sortedList.length() > 0):
currNode = sortedList.pop(0)
            # in case the node has been visited before,
            # its distance cannot be smaller than the one already recorded, so simply ignore it and move on
            # discarding it here is better than checking for existence before appending, because this dict lookup is constant time
            # while checking for existence in the queue before appending would be linear time
if(currNode.index in processedDist): continue
            # if we needed to print the trace, we would store the node itself in the dictionary
            # but in this case we only return the distance, so we only store the distance
processedDist[currNode.index] = currNode.dist
if(self.diff(wordList[currNode.index], endWord)==0):
return currNode.dist + 1
if(currNode.index not in neighborDict): continue
for i in neighborDict[currNode.index]:
n = Node(i, currNode.index, currNode.dist + 1)
sortedList.append(n)
neighborDict[i].discard(currNode.index)
#print(sortedLinkedList)
endWordIndex = wordList.index(endWord)
#print(processedDist)
#print(neighborDict)
if(endWordIndex in processedDist): return processedDist[endWordIndex] + 1
else: return 0
    # this method uses a self-implemented linked list
    # it performs somewhat better, though not by a lot, than the version based on Python's built-in list
def ladderLengthLinkedList(self, beginWord: str, endWord: str, wordList: []) -> int:
endWordExists = False
sortedLinkedList = LinkedList()
        # dijkstra/bfs
        # initialize the first batch of nodes connected to beginWord
for i, word in enumerate(wordList, start=0):
if(self.diff(endWord, word) == 0):
endWordExists = True
if(self.diff(beginWord, word) == 1):
sortedLinkedList.append(Node(i, 0, 1))
if(not endWordExists): return 0
if(sortedLinkedList.length == 0): return 0
neighborDict = {}
processedDist = {}
count = len(wordList)
# find the distance(connection) between each pair
for i in range(count):
for j in range(i):
dist = self.diff(wordList[i], wordList[j])
if(dist == 1):
if(i not in neighborDict): neighborDict[i] = set()
neighborDict[i].add(j)
if(j not in neighborDict): neighborDict[j] = set()
neighborDict[j].add(i)
# work through the sorted list
while(sortedLinkedList.length > 0):
currNode = sortedLinkedList.pop()
            # in case the node has been visited before,
            # its distance cannot be smaller than the one already recorded, so simply ignore it and move on
            # discarding it here is better than checking for existence before appending, because this dict lookup is constant time
            # while checking for existence in the queue before appending would be linear time
if(currNode.index in processedDist): continue
            # if we needed to print the trace, we would store the node itself in the dictionary
            # but in this case we only return the distance, so we only store the distance
processedDist[currNode.index] = currNode.dist
if(self.diff(wordList[currNode.index], endWord)==0): return currNode.dist + 1
if(currNode.index not in neighborDict): continue
for i in neighborDict[currNode.index]:
n = Node(i, currNode.index, currNode.dist + 1)
sortedLinkedList.append(n)
neighborDict[i].discard(currNode.index)
#print(sortedLinkedList)
endWordIndex = wordList.index(endWord)
#print(processedDist)
#print(neighborDict)
if(endWordIndex in processedDist): return processedDist[endWordIndex] + 1
else: return 0
    # this method tries to cut the processing time down further
    # we do not need to add the begin word to the word list
    # we can just add the begin word to the sorted linked list
    # to kick off the process
def ladderLengthLinkedListImproved(self, beginWord: str, endWord: str, wordList: []) -> int:
endWordExists = False
sortedLinkedList = LinkedList()
wordLinkedList = LinkedList()
        # dijkstra / bfs
        # initialize the first batch of nodes connected to beginWord
#print(sortedLinkedList)
sortedLinkedList.append(Node2(beginWord, 0))
for word in wordList: wordLinkedList.append(Node2(word,0))
processedDist = {}
# work through the sorted list
while(sortedLinkedList.length > 0):
currNode = sortedLinkedList.pop()
            # in case the node has been visited before,
            # its distance cannot be smaller than the one already recorded, so simply ignore it and move on
            # discarding it here is better than checking for existence before appending, because this dict lookup is constant time
            # while checking for existence in the queue before appending would be linear time
if(currNode.word in processedDist): continue
processedDist[currNode.word] = currNode.dist
#if(self.diff(currNode.word, endWord)==0): return currNode.dist + 1
prev = None
curr = wordLinkedList.head
while(curr is not None):
if(self.diff(currNode.word, curr.word) != 1):
prev = curr
curr = curr.next
else:
n = Node2(curr.word, currNode.dist + 1)
sortedLinkedList.append(n)
if(curr.word == endWord): return n.dist + 1
# remove curr node
if(prev is not None): # not at head
prev.next = curr.next
curr = curr.next
else: # at head
wordLinkedList.head = curr.next
curr = curr.next
#print(wordLinkedList.head)
#print(sortedLinkedList)
#print(neighborDict)
if(endWord in processedDist): return processedDist[endWord] + 1
else: return 0
    # use hashing to map words into buckets, replacing the need to compare every pair of words
    # this is the solution that was eventually accepted
    # all previous solutions compare every pair of words, which hurts performance as the word list grows
def ladderLengthPattern(self, beginWord: str, endWord: str, wordList: []) -> int:
endWordExists = False
sortedLinkedList = LinkedList()
pattern = {}
        # dijkstra/breadth-first search
        # initialize the first batch of nodes connected to beginWord
for i in range(len(beginWord)):
pat = beginWord[:i]+"*"+beginWord[i+1:]
if(pat not in pattern): pattern[pat] = set()
pattern[pat].add(beginWord)
for word in wordList:
for i in range(len(word)):
pat = word[:i]+"*"+word[i+1:]
if(pat not in pattern): pattern[pat] = set()
pattern[pat].add(word)
sortedLinkedList.append(Node2(beginWord, 0))
processedDist = {}
# work through the sorted list
while(sortedLinkedList.length > 0):
currNode = sortedLinkedList.pop()
            # in case the node has been visited before,
            # its distance cannot be smaller than the one already recorded, so simply ignore it and move on
            # discarding it here is better than checking for existence before appending, because this dict lookup is constant time
            # while checking for existence in the queue before appending would be linear time
if(currNode.word in processedDist): continue
processedDist[currNode.word] = currNode.dist
if(currNode.word == endWord): return currNode.dist + 1
for i in range(len(currNode.word)):
patternset = pattern[currNode.word[:i]+"*"+currNode.word[i+1:]]
patternset.remove(currNode.word)
for word in patternset:
n = Node2(word, currNode.dist + 1)
sortedLinkedList.append(n)
if(word == endWord): return n.dist + 1
#print(sortedLinkedList)
if(endWord in processedDist): return processedDist[endWord] + 1
else: return 0
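# Illustrative example (added, not in the original notebook): the wildcard-pattern idea
# behind ladderLengthPattern maps every word into len(word) buckets; distinct words that
# share a bucket differ in exactly one position, so neighbours are found without the
# O(n^2) pairwise comparison. The _demo_* names are used only for this example.
_demo_pattern = {}
for _word in ["hot", "dot", "dog"]:
    for _i in range(len(_word)):
        _demo_pattern.setdefault(_word[:_i] + "*" + _word[_i + 1:], set()).add(_word)
print(_demo_pattern["*ot"])  # {'hot', 'dot'} (set order may vary)
print(_demo_pattern["do*"])  # {'dot', 'dog'} (set order may vary)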
# In[400]:
s = Solution()
print(s.ladderLengthPattern("hit","cog",["hot","dot","dog","lot","log","cog"]))
print(s.ladderLengthPattern("a","b",["a","b","c"]))
print(s.ladderLengthPattern("a", "c", ["a", "b","c"]))
# In[402]:
s = Solution()
beginWord = "charge"
endWord = "comedo"
wordList = ["shanny","shinny","whinny","whiney","shiver","sharer","scarer","scaler","render","fluxes","teases","starks","clinks","messrs","crewed","donner","blurts","bettye","powell","castes","hackee","hackle","heckle","deckle","decile","defile","define","refine","repine","rapine","ravine","raving","roving","chased","roping","coping","coming","homing","pointy","hominy","homily","homely","comely","comedy","comedo","vagues","crocus","spiked","bobbed","dourer","smells","feared","wooden","stings","loafer","pleads","gaiter","meeter","denser","bather","deaves","wetted","pleats","cadger","curbed","grover","hinged","budget","gables","larked","flunks","fibbed","bricks","bowell","yonder","grimes","clewed","triads","legion","lacier","ridden","bogied","camper","damien","spokes","flecks","goosed","snorer","choked","choler","leakey","vagued","flumes","scanty","bugger","tablet","nilled","julies","roomed","ridges","snared","singes","slicks","toiled","verged","shitty","clicks","farmed","stunts","dowsed","brisks","skunks","linens","hammer","naiver","duster","elates","kooked","whacky","mather","loomed","soured","mosses","keeled","drains","drafty","cricks","glower","brayed","jester","mender","burros","arises","barker","father","creaks","prayed","bulges","heaped","called","volley","girted","forded","huffed","bergen","grated","douses","jagger","grovel","lashes","creeds","bonier","snacks","powder","curled","milker","posers","ribbed","tracts","stoked","russel","bummer","cusses","gouged","nailed","lobbed","novels","stands","caches","swanks","jutted","zinged","wigged","lunges","divers","cranny","pinter","guides","tigers","traces","berber","purges","hoaxer","either","bribed","camped","funked","creaky","noises","paused","splits","morrow","faults","ladies","dinged","smoker","calved","deters","kicker","wisher","ballad","filled","fobbed","tucker","steams","rubber","staled","chived","warred","draped","curfew","chafed","washer","tombed","basket","limned","rapped","swills","gashed","loaner","settee","layers","bootee","rioted","prance","sharps","wigner","ranted","hanker","leaden","groped","dalian","robbed","peeled","larder","spoofs","pushed","hallie","maiden","waller","pashas","grains","pinked","lodged","zipper","sneers","bootie","drives","former","deepen","carboy","snouts","fained","wilmer","trance","bugles","chimps","deeper","bolder","cupped","mauser","pagers","proven","teaser","plucky","curved","shoots","barged","mantes","reefer","coater","clotho","wanner","likens","swamis","troyes","breton","fences","pastas","quirky","boiler","canoes","looted","caries","stride","adorns","dwells","hatred","cloths","rotted","spooks","canyon","lances","denied","beefed","diaper","wiener","rifled","leader","ousted","sprays","ridged","mousey","darken","guiled","gasses","suited","drools","bloody","murals","lassie","babied","fitter","lessee","chiles","wrongs","malian","leaves","redder","funnel","broths","gushes","grants","doyens","simmer","locked","spoors","berger","landed","mosley","scorns","whiten","hurled","routed","careen","chorus","chasms","hopped","cadged","kicked","slewed","shrewd","mauled","saucer","jested","shriek","giblet","gnarls","foaled","roughs","copses","sacked","blends","slurps","cashew","grades","cramps","radius","tamped","truths","cleans","creams","manner","crimps","hauled","cheery","shells","asters","scalps","quotas","clears","clover","weeder","homers","pelted","hugged","marked","moaned","steely","jagged","glades","goshes","masked","ringer","eloped","vortex","gender","spotty","harken","hasten","smiths","mulled","specks","smile
s","vainer","patted","harden","nicked","dooley","begged","belief","bushel","rivers","sealed","neuter","legged","garter","freaks","server","crimea","tossed","wilted","cheers","slides","cowley","snotty","willed","bowled","tortes","pranks","yelped","slaved","silver","swords","miners","fairer","trills","salted","copsed","crusts","hogged","seemed","revert","gusted","pixies","tamika","franks","crowed","rocked","fisher","sheers","pushes","drifts","scouts","sables","sallie","shiner","coupes","napped","drowse","traced","scenes","brakes","steele","beater","buries","turned","luther","bowers","lofted","blazer","serves","cagney","hansel","talker","warmed","flirts","braced","yukked","milken","forged","dodder","strafe","blurbs","snorts","jetted","picket","pistil","valved","pewter","crawls","strews","railed","clunks","smiled","dealer","cussed","hocked","spited","cowers","strobe","donned","brawls","minxes","philby","gavels","renter","losses","packet","defied","hazier","twines","balled","gaoled","esther","narrow","soused","crispy","souped","corned","cooley","rioter","talley","keaton","rocker","spades","billie","mattel","billet","horton","navels","sander","stoker","winded","wilder","cloyed","blazed","itched","docked","greene","boozed","ticket","temped","capons","bravos","rinded","brandi","massed","sobbed","shapes","yippee","script","lesion","mallet","seabed","medals","series","phases","grower","vertex","dented","tushed","barron","toffee","bushes","mouser","zenger","quaked","marley","surfed","harmed","mormon","flints","shamed","forgot","jailor","boater","sparer","shards","master","pistol","tooted","banned","drover","spices","gobbed","corals","chucks","kitten","whales","nickel","scrape","hosted","hences","morays","stomps","marcel","hummed","wonder","stoves","distil","coffer","quaker","curler","nurses","cabbed","jigger","grails","manges","larger","zipped","rovers","stints","nudges","marlin","exuded","storey","pester","longer","creeps","meaner","wallop","dewier","rivera","drones","valued","bugled","swards","cortes","charts","benson","wreaks","glares","levels","smithy","slater","suites","paired","fetter","rutted","levied","menses","wither","woolly","weeded","planed","censer","tested","pulled","hitter","slicer","tartar","chunky","whirrs","mewled","astern","walden","hilton","cached","geller","dolled","chores","sorter","soothe","reused","clumps","fueled","hurler","helled","packed","ripped","tanned","binder","flames","teased","punker","jerked","cannon","joists","whited","sagged","heaven","hansen","grayer","turfed","cranks","stater","bunted","horsey","shakes","brands","faints","barber","gorged","creamy","mowers","scrams","gashes","knacks","aeries","sticks","altars","hostel","pumped","reeves","litter","hoaxed","mushed","guided","ripper","bought","gelled","ranker","jennie","blares","saloon","bomber","mollie","scoops","coolie","hollis","shrunk","tattle","sensed","gasket","dodoes","mapped","strips","dodges","sailed","talked","sorted","lodges","livest","pastel","ladles","graded","thrice","thales","sagger","mellon","ganged","maroon","fluked","raised","nannie","dearer","lither","triked","dorset","clamps","lonnie","spates","larded","condor","sinker","narced","quaver","atones","farted","elopes","winger","mottle","loaned","smears","joanne","boozes","waster","digger","swoops","smokey","nation","drivel","ceased","miffed","faiths","pisses","frames","fooled","milled","dither","crazed","darryl","mulder","posses","sumter","weasel","pedals","brawny","charge","welted","spanks","sallow","joined","shaker","blocks","mattie","swirls","drive
r","belles","chomps","blower","roared","ratted","hailed","taunts","steamy","parrot","deafer","chewed","spaces","cuffed","molded","winked","runnel","hollow","fluted","bedded","crepes","stakes","vested","parley","burton","loiter","massey","carnap","closed","bailed","milder","heists","morale","putter","snyder","damion","conned","little","pooped","ticced","cocked","halves","wishes","francs","goblet","carlin","pecked","julius","raster","shocks","dawned","loosen","swears","buried","peters","treats","noshed","hedges","trumps","rabies","ronnie","forces","ticked","bodies","proved","dadoes","halved","warner","divest","thumbs","fettle","ponies","testis","ranked","clouts","slates","tauted","stools","dodged","chancy","trawls","things","sorrow","levies","glides","battle","sauced","doomed","seller","strove","ballet","bumper","gooses","foiled","plowed","glints","chanel","petals","darted","seared","trunks","hatter","yokels","vanned","tweedy","rubles","crones","nettie","roofed","dusted","dicker","fakers","rusted","bedder","darrin","bigger","baylor","crocks","niches","tented","cashed","splats","quoted","soloed","tessie","stiles","bearer","hissed","soiled","adored","bowery","snakes","wagers","rafter","crests","plaids","cordon","listed","lawson","scared","brazos","horded","greens","marred","mushes","hooper","halter","ration","calked","erodes","plumed","mummer","pinged","curios","slated","ranter","pillow","frills","whaled","bathos","madden","totted","reamed","bellow","golfer","seaman","barred","merger","hipped","silken","hastes","strays","slinks","hooted","convex","singed","leased","bummed","leaner","molted","naught","caters","tidied","forges","sealer","gulled","plumps","racket","fitted","rafted","drapes","nasser","tamara","winced","juliet","ledger","bettie","howell","reeved","spiced","thebes","apices","dorsey","welled","feeler","warded","reader","folded","lepers","cranky","bosses","ledges","player","yellow","lunged","mattes","confer","malign","shared","brandy","filmed","rhinos","pulsed","rouses","stones","mixers","cooped","joiner","papped","liston","capote","salvos","wicker","ciders","hoofed","wefted","locket","picker","nougat","limpid","hooter","jailer","peaces","mashes","custer","wallis","purees","trends","irater","honied","wavers","tanner","change","hinges","tatted","cookie","catnap","carton","crimed","betted","veined","surges","rumped","merlin","convey","placid","harped","dianna","hookey","nobles","carted","elided","whined","glover","bleats","stales","husker","hearer","tartan","weaker","skewer","lumbar","temper","gigged","gawked","mayors","pigged","gather","valves","mitten","largos","boreas","judges","cozens","censor","frilly","dumbed","downer","jogger","scolds","danced","floras","funded","lumped","dashes","azores","quites","chunks","washed","duller","bilges","cruels","brooks","fishes","smoked","leaped","hotter","trials","heaves","rouges","kissed","sleety","manses","spites","starts","banded","clings","titted","vetoed","mister","mildew","wailed","sheets","peeked","passer","felted","broken","lieges","ruffed","bracts","buster","muffed","lanker","breaks","coffey","sighed","charms","balded","kisser","booths","leaven","cheeps","billed","lauder","bumped","career","stocks","airier","limped","jeanie","roamed","carves","lilted","router","bonnie","denver","briggs","steeps","nerves","oinked","bucked","hooves","dancer","burris","parked","swells","collie","perked","cooler","fopped","wedder","malted","sabers","lidded","conner","rogues","fought","dapper","purled","crowds","barnes","bonner","globed","goners","yankee","probe
s","trains","sayers","jersey","valley","vatted","tauter","dulled","mucked","jotted","border","genres","banked","filter","hitler","dipper","dollie","sieves","joliet","tilted","checks","sports","soughs","ported","causes","gelded","mooter","grills","parred","tipped","placer","slayer","glided","basked","rinses","tamper","bunged","nabbed","climbs","faeces","hanson","brainy","wicket","crowns","calmed","tarred","spires","deanne","gravel","messes","snides","tugged","denier","moslem","erased","mutter","blahed","hunker","fasten","garbed","cracks","braked","rasped","ravens","mutton","tester","tories","pinker","titled","arisen","softer","woolen","disses","likest","dicier","nagged","lipton","plumbs","manged","faulty","sacred","whiter","erases","padres","haired","captor","metals","cardin","yowled","trusts","revels","boxers","toured","spouts","sodded","judged","holley","figged","pricey","lapses","harper","beaned","sewers","caused","willie","farmer","pissed","bevies","bolled","bugler","votive","person","linton","senses","supped","mashed","pincer","wetter","tangos","sticky","lodger","loader","daunts","peaked","moused","sleeps","lasted","tasked","awards","lovely","gushed","spurts","canter","mantis","coaled","groans","dannie","oopses","sneaky","vogues","mobile","plumes","chides","theses","marcia","parser","flexed","stayed","fouler","tusked","quartz","daubed","clancy","rouged","flaked","norton","dunner","corded","shelly","hester","fucker","polled","rodger","yeager","zinced","livens","browne","gonged","pubbed","sapped","thrive","placed","jensen","moises","scopes","stumpy","stocky","heller","levers","morals","wheres","gasped","jobber","leaved","champs","rosier","pallet","shooed","parses","bender","closet","pureed","routes","verges","bulled","foster","rummer","molten","condos","better","cotter","lassos","grafts","vendor","thrace","codded","tinker","bullet","beaker","garden","spiels","popper","skills","plated","farrow","flexes","esters","brains","handel","puller","dickey","creeks","ballot","singer","sicker","spayed","spoils","rubier","missed","framed","bonnet","molder","mugger","waived","taster","robles","tracks","nearer","lister","horsed","drakes","lopped","lubber","busied","button","eluded","ceases","sought","realer","lasers","pollen","crisps","binned","darrel","crafty","gleams","lonely","gordon","harley","damian","whiles","wilton","lesser","mallow","kenyon","wimped","scened","risked","hunter","rooter","ramses","inches","goaded","ferber","freaky","nerved","spoken","lovers","letter","marrow","bulbed","braver","sloped","breads","cannes","bassos","orated","clever","darren","bredes","gouger","servos","trites","troths","flunky","jammed","bugged","watter","motive","humped","writer","pestle","rilled","packer","foists","croats","floury","napier","floors","scotty","sevens","harrow","welter","quacks","daybed","lorded","pulses","pokier","fatten","midges","joints","snoopy","looter","monies","canted","riffed","misses","bunker","piston","yessed","earner","hawked","wedged","brewer","nested","graver","hoaxes","slaves","pricks","magpie","bernie","rapier","roster","poohed","corner","trysts","rogers","whirls","bathed","teasel","opener","minced","sister","dreamy","worker","rinked","panted","triton","mervin","snowed","leafed","thinks","lesson","millet","larson","lagged","likely","stormy","fortes","hordes","wovens","kinked","mettle","seated","shirts","solver","giants","jilted","leaded","mendez","lowers","bidder","greats","pepped","flours","versus","canton","weller","cowper","tapped","dueled","mussed","rubies","bonged","steals","forme
d","smalls","sculls","docket","ouster","gunned","thumps","curred","withes","putted","buttes","bloats","parsed","galley","preses","tagged","hanger","planes","chords","shafts","carson","posits","zinger","solves","tensed","tastes","rinsed","kenned","bitten","leslie","chanty","candor","daises","baggie","wedded","paints","moored","haloed","hornet","lifted","fender","guiles","swifts","flicks","lancer","spares","pellet","passed","finked","joanna","bidden","swamps","lapped","leered","served","shirrs","choker","limper","marker","nudged","triter","thanks","peered","bruins","loaves","fabled","lathes","pipers","hooped","orates","burned","swines","sprats","warder","colder","crazes","reined","prized","majors","darrow","waifed","rooked","rickey","patter","shrive","gropes","gassed","throve","region","weaken","hettie","walton","galled","convoy","wesson","exudes","tinted","clanks","blinks","slacks","stilts","franny","socket","wished","kidded","knotty","turves","cashes","geared","sunned","glowed","sadden","harlem","testes","sweets","becket","blazes","batter","fellow","clovis","copier","shaped","husked","gimlet","rooney","taints","sashes","bossed","cootie","franck","probed","bagged","smocks","batten","spared","chills","relics","meyers","grader","tromps","dimmer","pasted","pepper","capped","played","junket","easier","palmed","pander","vaguer","bulged","dissed","borges","raises","wallow","jigged","bogged","burped","neater","rammed","fibers","castor","skirts","cancer","tilled","spored","dander","denims","budges","trucks","sowers","yapped","cadges","wrists","hacker","graved","vipers","noshes","minted","lessor","cassia","wrecks","hidden","brando","honeys","chilli","ragged","breded","punier","stacey","sisses","jocked","croaks","dinned","walker","heston","flares","coined","cannot","chocks","leases","wander","balder","warmer","bawled","donnie","damson","header","chilly","models","simper","watery","milked","poises","combed","toilet","gallop","sonnet","loosed","yawned","splays","pauses","bother","graphs","shrews","scones","manuel","milers","hotels","bennie","flores","spells","grimed","tenses","staged","puffer","posies","motion","fudged","fainer","tatter","seraph","nansen","months","muppet","tamera","shaman","falser","becker","lisbon","clefts","weeper","mendel","girder","takers","torsos","forked","dances","stated","yelled","scants","frothy","rolled","yodels","listen","craned","brooms","suffer","easter","shills","craves","bleeps","belled","dished","bordon","zither","jacket","lammer","kirked","shaved","atoned","frumpy","nosier","vender","graced","clingy","chants","wrests","cursed","prunes","tarter","stripe","coffee","veiled","tweeds","shrine","spines","kegged","melvin","gasser","market","marten","peeped","sanger","somber","spider","netted","radium","slings","scarfs","mended","creels","shaves","payers","bunked","movers","beings","conked","cozies","benton","codger","prints","gusset","longed","burner","jambed","mullet","fogged","scores","carbon","sleeks","helped","waxier","gilded","harlot","winces","tenser","lowell","ramsey","kennan","booted","beaver","rested","shouts","hickey","looped","swings","wonted","dilled","defers","lolled","pupped","cruets","solved","romper","defter","chokes","kithed","garnet","bookie","stared","stares","latter","lazies","fanned","wagged","dunces","corked","cloned","prided","baxter","pusses","boomed","masses","warren","weaves","delves","handed","merton","lusher","hepper","gibber","sender","parsec","snares","masher","seamed","sweats","welles","gagged","curter","mother","beeped","vealed","shoved","slave
r","hacked","gutted","ranged","bashed","closer","storks","meshed","cortex","copper","severn","gripes","carlos","scares","crates","boiled","ginned","mouses","raided","greyed","verier","slopes","fenced","sniper","priced","flawed","buffed","spacey","favors","platen","miller","walled","cutter","skated","holier","beamed","waiter","drowns","clomps","quarks","bested","frisks","purged","scalds","marian","flower","howled","plover","bikers","trails","hagged","smirks","sitter","carmen","lanced","plants","nobler","yakked","thesis","lassen","margin","wagner","sifter","houses","screws","booker","dormer","meters","padded","loaded","cartel","sutton","willis","chatty","dunked","dreamt","dalton","fables","coveys","muller","shanty","adders","tailor","helper","liters","butted","maiman","hollie","gallon","xavier","shrank","mickey","rather","powers","keened","doused","kisses","flanks","dotted","phased","dumped","linger","kramer","spaced","soften","strife","rowers","hovers","crimes","crooks","carrel","braces","lander","shrove","skulks","banker","itches","dropsy","misted","pulped","cloche","fawned","states","teared","beeper","raider","groves","livery","aerier","keenan","severe","sabres","bogies","coated","harlow","tanked","mellow","cozier","shanks","spooky","blamed","tricks","sleets","punted","jumped","caxton","warped","halley","frisky","shines","skater","lumber","truces","sliced","gibbet","narked","chives","graves","gummed","holler","glazes","nieves","hushed","nought","prated","chored","cloudy","kidder","huston","straws","twined","gifted","rodney","haloes","france","wirier","mercia","rubbed","coaxed","sumner","snipes","nipper","leiden","madman","margie","footed","firmed","budded","froths","senior","hoover","tailed","glider","straps","stalks","billow","racked","javier","zoomed","shades","whores","braids","roused","sudden","dogies","fencer","snaked","flings","traded","gunner","snider","staten","levees","lathed","sailor","waited","muster","clothe","lulled","cargos","revved","sooths","flamed","borers","feller","bladed","oliver","collin","wusses","murder","parted","jailed","frayed","doored","cheeks","misled","belted","winter","merges","shaven","fudges","tabbed","forget","sloths","cachet","mealed","sassed","salter","haunts","ranger","rivets","deeded","reaped","damped","crated","youths","whacks","tamers","misery","seeped","eerier","tiller","busses","gloved","hushes","cronus","pruned","casket","direst","guilds","motley","spools","fevers","snores","greece","elides","waists","rattle","trader","juster","rashes","stoney","pipped","solder","sinner","prides","rugged","steers","gnarly","titter","cities","walter","stolen","steaks","hawker","weaned","jobbed","jacked","pikers","hipper","spoilt","beeves","craved","gotten","balked","sherry","looney","crisis","callie","swiped","fished","rooted","bopped","bowler","escher","chumps","jerrod","lefter","snooty","fillet","scales","comets","lisped","decked","clowns","horned","robber","bottle","reeled","crapes","banter","martel","dowels","brandt","sweeps","heeled","tabled","manors","danger","dionne","prayer","decker","millie","boated","damned","horses","globes","failed","lammed","nigher","joyner","sobers","chided","tipper","parcel","flakes","fugger","elated","hinder","hopper","crafts","wipers","badder","jessie","matted","wafted","pealed","cheats","elites","torres","bushed","sneaks","tidies","brings","stalls","payees","zonked","danker","poshes","smelts","stoops","warden","chicks","ramsay","budged","firmer","glazed","heated","slices","hovels","belied","shifts","pauper","tinges","weston","caste
d","titles","droves","roomer","modals","seamen","wearer","blonde","berlin","libbed","tensor","hokier","lambed","graped","headed","copped","eroses","fagged","filler","keener","stages","civets","spills","tithed","sullen","sucked","briton","whaler","hooded","tittle","bucket","furled","darned","planet","clucks","batted","dagger","brides","severs","pathos","grainy","relied","carpel","makers","lancet","slowed","messed","ravels","faster","gabbed","chance","grayed","santos","spends","chinos","saints","swirly","dories","wilson","milton","clangs","manual","nodded","signer","stript","etched","vaster","wastes","stored","minces","purred","marvin","pinned","skulls","heaved","wadded","fowled","hashed","mullen","relief","hatted","primed","chaffs","canned","lackey","showed","shandy","chases","maggie","deafen","bussed","differ","worked","marted","ducked","socked","fussed","greyer","herder","trusty","follow","samson","babies","whorls","stanks","manson","cranes","murrow","shrink","genius","holder","lenses","yucked","termed","ruined","junker","belies","joshed","cooled","basted","greeks","fuller","healer","carver","havens","drunks","sucker","lotion","glared","healed","pocked","rifles","weaved","canoed","punter","hinton","settle","boobed","hinted","scored","harder","status","sloven","hayden","golfed","scoots","bloods","slaked","jugged","louses","cassie","shaded","rushed","pitied","barked","honked","rasher","forced","shaver","vowels","holden","pelvis","blades","chests","preyer","floods","deanna","cation","mapper","falter","dabbed","mocker","nestle","shucks","heeded","ticker","binges","summer","slumps","lusted","scampi","crofts","gorges","pardon","torses","smokes","lashed","bailey","jabbed","calmer","preset","forbes","hasted","wormed","winged","minors","banner","grazed","hewers","kernel","jolted","sniped","clunky","ratios","blinds","ganges","misers","spikes","riders","hallow","grumpy","barren","summed","infers","places","jarred","killer","plaint","goofed","subbed","prudes","sipped","kookie","whines","droopy","palled","cherry","proves","mobbed","spaded","cheese","pluses","bathes","motels","spewed","soaked","howler","puffed","malled","shrike","slided","fulled","pouted","shames","lessen","ringed","teemed","grands","linked","wooten","feuded","deaden","scents","flutes","salton"]
timeb = time.perf_counter()
print(s.ladderLengthList(beginWord,endWord,wordList))
timee = time.perf_counter()
print(timee-timeb)
timeb = time.perf_counter()
print(s.ladderLengthLinkedList(beginWord,endWord,wordList))
timee = time.perf_counter()
print(timee-timeb)
timeb = time.perf_counter()
print(s.ladderLengthLinkedListImproved(beginWord,endWord,wordList))
timee = time.perf_counter()
print(timee-timeb)
timeb = time.perf_counter()
print(s.ladderLengthPattern(beginWord,endWord,wordList))
timee = time.perf_counter()
print(timee-timeb)
# In[404]:
beginWord = "charge"
endWord = "comedo"
wordList = ["shanny","shinny","whinny","whiney","shiver","sharer","scarer","scaler","render","fluxes","teases","starks","clinks","messrs","crewed","donner","blurts","bettye","powell","castes","hackee","hackle","heckle","deckle","decile","defile","define","refine","repine","rapine","ravine","raving","roving","chased","roping","coping","coming","homing","pointy","hominy","homily","homely","comely","comedy","comedo","vagues","crocus","spiked","bobbed","dourer","smells","feared","wooden","stings","loafer","pleads","gaiter","meeter","denser","bather","deaves","wetted","pleats","cadger","curbed","grover","hinged","budget","gables","larked","flunks","fibbed","bricks","bowell","yonder","grimes","clewed","triads","legion","lacier","ridden","bogied","camper","damien","spokes","flecks","goosed","snorer","choked","choler","leakey","vagued","flumes","scanty","bugger","tablet","nilled","julies","roomed","ridges","snared","singes","slicks","toiled","verged","shitty","clicks","farmed","stunts","dowsed","brisks","skunks","linens","hammer","naiver","duster","elates","kooked","whacky","mather","loomed","soured","mosses","keeled","drains","drafty","cricks","glower","brayed","jester","mender","burros","arises","barker","father","creaks","prayed","bulges","heaped","called","volley","girted","forded","huffed","bergen","grated","douses","jagger","grovel","lashes","creeds","bonier","snacks","powder","curled","milker","posers","ribbed","tracts","stoked","russel","bummer","cusses","gouged","nailed","lobbed","novels","stands","caches","swanks","jutted","zinged","wigged","lunges","divers","cranny","pinter","guides","tigers","traces","berber","purges","hoaxer","either","bribed","camped","funked","creaky","noises","paused","splits","morrow","faults","ladies","dinged","smoker","calved","deters","kicker","wisher","ballad","filled","fobbed","tucker","steams","rubber","staled","chived","warred","draped","curfew","chafed","washer","tombed","basket","limned","rapped","swills","gashed","loaner","settee","layers","bootee","rioted","prance","sharps","wigner","ranted","hanker","leaden","groped","dalian","robbed","peeled","larder","spoofs","pushed","hallie","maiden","waller","pashas","grains","pinked","lodged","zipper","sneers","bootie","drives","former","deepen","carboy","snouts","fained","wilmer","trance","bugles","chimps","deeper","bolder","cupped","mauser","pagers","proven","teaser","plucky","curved","shoots","barged","mantes","reefer","coater","clotho","wanner","likens","swamis","troyes","breton","fences","pastas","quirky","boiler","canoes","looted","caries","stride","adorns","dwells","hatred","cloths","rotted","spooks","canyon","lances","denied","beefed","diaper","wiener","rifled","leader","ousted","sprays","ridged","mousey","darken","guiled","gasses","suited","drools","bloody","murals","lassie","babied","fitter","lessee","chiles","wrongs","malian","leaves","redder","funnel","broths","gushes","grants","doyens","simmer","locked","spoors","berger","landed","mosley","scorns","whiten","hurled","routed","careen","chorus","chasms","hopped","cadged","kicked","slewed","shrewd","mauled","saucer","jested","shriek","giblet","gnarls","foaled","roughs","copses","sacked","blends","slurps","cashew","grades","cramps","radius","tamped","truths","cleans","creams","manner","crimps","hauled","cheery","shells","asters","scalps","quotas","clears","clover","weeder","homers","pelted","hugged","marked","moaned","steely","jagged","glades","goshes","masked","ringer","eloped","vortex","gender","spotty","harken","hasten","smiths","mulled","specks","smile
s","vainer","patted","harden","nicked","dooley","begged","belief","bushel","rivers","sealed","neuter","legged","garter","freaks","server","crimea","tossed","wilted","cheers","slides","cowley","snotty","willed","bowled","tortes","pranks","yelped","slaved","silver","swords","miners","fairer","trills","salted","copsed","crusts","hogged","seemed","revert","gusted","pixies","tamika","franks","crowed","rocked","fisher","sheers","pushes","drifts","scouts","sables","sallie","shiner","coupes","napped","drowse","traced","scenes","brakes","steele","beater","buries","turned","luther","bowers","lofted","blazer","serves","cagney","hansel","talker","warmed","flirts","braced","yukked","milken","forged","dodder","strafe","blurbs","snorts","jetted","picket","pistil","valved","pewter","crawls","strews","railed","clunks","smiled","dealer","cussed","hocked","spited","cowers","strobe","donned","brawls","minxes","philby","gavels","renter","losses","packet","defied","hazier","twines","balled","gaoled","esther","narrow","soused","crispy","souped","corned","cooley","rioter","talley","keaton","rocker","spades","billie","mattel","billet","horton","navels","sander","stoker","winded","wilder","cloyed","blazed","itched","docked","greene","boozed","ticket","temped","capons","bravos","rinded","brandi","massed","sobbed","shapes","yippee","script","lesion","mallet","seabed","medals","series","phases","grower","vertex","dented","tushed","barron","toffee","bushes","mouser","zenger","quaked","marley","surfed","harmed","mormon","flints","shamed","forgot","jailor","boater","sparer","shards","master","pistol","tooted","banned","drover","spices","gobbed","corals","chucks","kitten","whales","nickel","scrape","hosted","hences","morays","stomps","marcel","hummed","wonder","stoves","distil","coffer","quaker","curler","nurses","cabbed","jigger","grails","manges","larger","zipped","rovers","stints","nudges","marlin","exuded","storey","pester","longer","creeps","meaner","wallop","dewier","rivera","drones","valued","bugled","swards","cortes","charts","benson","wreaks","glares","levels","smithy","slater","suites","paired","fetter","rutted","levied","menses","wither","woolly","weeded","planed","censer","tested","pulled","hitter","slicer","tartar","chunky","whirrs","mewled","astern","walden","hilton","cached","geller","dolled","chores","sorter","soothe","reused","clumps","fueled","hurler","helled","packed","ripped","tanned","binder","flames","teased","punker","jerked","cannon","joists","whited","sagged","heaven","hansen","grayer","turfed","cranks","stater","bunted","horsey","shakes","brands","faints","barber","gorged","creamy","mowers","scrams","gashes","knacks","aeries","sticks","altars","hostel","pumped","reeves","litter","hoaxed","mushed","guided","ripper","bought","gelled","ranker","jennie","blares","saloon","bomber","mollie","scoops","coolie","hollis","shrunk","tattle","sensed","gasket","dodoes","mapped","strips","dodges","sailed","talked","sorted","lodges","livest","pastel","ladles","graded","thrice","thales","sagger","mellon","ganged","maroon","fluked","raised","nannie","dearer","lither","triked","dorset","clamps","lonnie","spates","larded","condor","sinker","narced","quaver","atones","farted","elopes","winger","mottle","loaned","smears","joanne","boozes","waster","digger","swoops","smokey","nation","drivel","ceased","miffed","faiths","pisses","frames","fooled","milled","dither","crazed","darryl","mulder","posses","sumter","weasel","pedals","brawny","charge","welted","spanks","sallow","joined","shaker","blocks","mattie","swirls","drive
r","belles","chomps","blower","roared","ratted","hailed","taunts","steamy","parrot","deafer","chewed","spaces","cuffed","molded","winked","runnel","hollow","fluted","bedded","crepes","stakes","vested","parley","burton","loiter","massey","carnap","closed","bailed","milder","heists","morale","putter","snyder","damion","conned","little","pooped","ticced","cocked","halves","wishes","francs","goblet","carlin","pecked","julius","raster","shocks","dawned","loosen","swears","buried","peters","treats","noshed","hedges","trumps","rabies","ronnie","forces","ticked","bodies","proved","dadoes","halved","warner","divest","thumbs","fettle","ponies","testis","ranked","clouts","slates","tauted","stools","dodged","chancy","trawls","things","sorrow","levies","glides","battle","sauced","doomed","seller","strove","ballet","bumper","gooses","foiled","plowed","glints","chanel","petals","darted","seared","trunks","hatter","yokels","vanned","tweedy","rubles","crones","nettie","roofed","dusted","dicker","fakers","rusted","bedder","darrin","bigger","baylor","crocks","niches","tented","cashed","splats","quoted","soloed","tessie","stiles","bearer","hissed","soiled","adored","bowery","snakes","wagers","rafter","crests","plaids","cordon","listed","lawson","scared","brazos","horded","greens","marred","mushes","hooper","halter","ration","calked","erodes","plumed","mummer","pinged","curios","slated","ranter","pillow","frills","whaled","bathos","madden","totted","reamed","bellow","golfer","seaman","barred","merger","hipped","silken","hastes","strays","slinks","hooted","convex","singed","leased","bummed","leaner","molted","naught","caters","tidied","forges","sealer","gulled","plumps","racket","fitted","rafted","drapes","nasser","tamara","winced","juliet","ledger","bettie","howell","reeved","spiced","thebes","apices","dorsey","welled","feeler","warded","reader","folded","lepers","cranky","bosses","ledges","player","yellow","lunged","mattes","confer","malign","shared","brandy","filmed","rhinos","pulsed","rouses","stones","mixers","cooped","joiner","papped","liston","capote","salvos","wicker","ciders","hoofed","wefted","locket","picker","nougat","limpid","hooter","jailer","peaces","mashes","custer","wallis","purees","trends","irater","honied","wavers","tanner","change","hinges","tatted","cookie","catnap","carton","crimed","betted","veined","surges","rumped","merlin","convey","placid","harped","dianna","hookey","nobles","carted","elided","whined","glover","bleats","stales","husker","hearer","tartan","weaker","skewer","lumbar","temper","gigged","gawked","mayors","pigged","gather","valves","mitten","largos","boreas","judges","cozens","censor","frilly","dumbed","downer","jogger","scolds","danced","floras","funded","lumped","dashes","azores","quites","chunks","washed","duller","bilges","cruels","brooks","fishes","smoked","leaped","hotter","trials","heaves","rouges","kissed","sleety","manses","spites","starts","banded","clings","titted","vetoed","mister","mildew","wailed","sheets","peeked","passer","felted","broken","lieges","ruffed","bracts","buster","muffed","lanker","breaks","coffey","sighed","charms","balded","kisser","booths","leaven","cheeps","billed","lauder","bumped","career","stocks","airier","limped","jeanie","roamed","carves","lilted","router","bonnie","denver","briggs","steeps","nerves","oinked","bucked","hooves","dancer","burris","parked","swells","collie","perked","cooler","fopped","wedder","malted","sabers","lidded","conner","rogues","fought","dapper","purled","crowds","barnes","bonner","globed","goners","yankee","probe
s","trains","sayers","jersey","valley","vatted","tauter","dulled","mucked","jotted","border","genres","banked","filter","hitler","dipper","dollie","sieves","joliet","tilted","checks","sports","soughs","ported","causes","gelded","mooter","grills","parred","tipped","placer","slayer","glided","basked","rinses","tamper","bunged","nabbed","climbs","faeces","hanson","brainy","wicket","crowns","calmed","tarred","spires","deanne","gravel","messes","snides","tugged","denier","moslem","erased","mutter","blahed","hunker","fasten","garbed","cracks","braked","rasped","ravens","mutton","tester","tories","pinker","titled","arisen","softer","woolen","disses","likest","dicier","nagged","lipton","plumbs","manged","faulty","sacred","whiter","erases","padres","haired","captor","metals","cardin","yowled","trusts","revels","boxers","toured","spouts","sodded","judged","holley","figged","pricey","lapses","harper","beaned","sewers","caused","willie","farmer","pissed","bevies","bolled","bugler","votive","person","linton","senses","supped","mashed","pincer","wetter","tangos","sticky","lodger","loader","daunts","peaked","moused","sleeps","lasted","tasked","awards","lovely","gushed","spurts","canter","mantis","coaled","groans","dannie","oopses","sneaky","vogues","mobile","plumes","chides","theses","marcia","parser","flexed","stayed","fouler","tusked","quartz","daubed","clancy","rouged","flaked","norton","dunner","corded","shelly","hester","fucker","polled","rodger","yeager","zinced","livens","browne","gonged","pubbed","sapped","thrive","placed","jensen","moises","scopes","stumpy","stocky","heller","levers","morals","wheres","gasped","jobber","leaved","champs","rosier","pallet","shooed","parses","bender","closet","pureed","routes","verges","bulled","foster","rummer","molten","condos","better","cotter","lassos","grafts","vendor","thrace","codded","tinker","bullet","beaker","garden","spiels","popper","skills","plated","farrow","flexes","esters","brains","handel","puller","dickey","creeks","ballot","singer","sicker","spayed","spoils","rubier","missed","framed","bonnet","molder","mugger","waived","taster","robles","tracks","nearer","lister","horsed","drakes","lopped","lubber","busied","button","eluded","ceases","sought","realer","lasers","pollen","crisps","binned","darrel","crafty","gleams","lonely","gordon","harley","damian","whiles","wilton","lesser","mallow","kenyon","wimped","scened","risked","hunter","rooter","ramses","inches","goaded","ferber","freaky","nerved","spoken","lovers","letter","marrow","bulbed","braver","sloped","breads","cannes","bassos","orated","clever","darren","bredes","gouger","servos","trites","troths","flunky","jammed","bugged","watter","motive","humped","writer","pestle","rilled","packer","foists","croats","floury","napier","floors","scotty","sevens","harrow","welter","quacks","daybed","lorded","pulses","pokier","fatten","midges","joints","snoopy","looter","monies","canted","riffed","misses","bunker","piston","yessed","earner","hawked","wedged","brewer","nested","graver","hoaxes","slaves","pricks","magpie","bernie","rapier","roster","poohed","corner","trysts","rogers","whirls","bathed","teasel","opener","minced","sister","dreamy","worker","rinked","panted","triton","mervin","snowed","leafed","thinks","lesson","millet","larson","lagged","likely","stormy","fortes","hordes","wovens","kinked","mettle","seated","shirts","solver","giants","jilted","leaded","mendez","lowers","bidder","greats","pepped","flours","versus","canton","weller","cowper","tapped","dueled","mussed","rubies","bonged","steals","forme
d","smalls","sculls","docket","ouster","gunned","thumps","curred","withes","putted","buttes","bloats","parsed","galley","preses","tagged","hanger","planes","chords","shafts","carson","posits","zinger","solves","tensed","tastes","rinsed","kenned","bitten","leslie","chanty","candor","daises","baggie","wedded","paints","moored","haloed","hornet","lifted","fender","guiles","swifts","flicks","lancer","spares","pellet","passed","finked","joanna","bidden","swamps","lapped","leered","served","shirrs","choker","limper","marker","nudged","triter","thanks","peered","bruins","loaves","fabled","lathes","pipers","hooped","orates","burned","swines","sprats","warder","colder","crazes","reined","prized","majors","darrow","waifed","rooked","rickey","patter","shrive","gropes","gassed","throve","region","weaken","hettie","walton","galled","convoy","wesson","exudes","tinted","clanks","blinks","slacks","stilts","franny","socket","wished","kidded","knotty","turves","cashes","geared","sunned","glowed","sadden","harlem","testes","sweets","becket","blazes","batter","fellow","clovis","copier","shaped","husked","gimlet","rooney","taints","sashes","bossed","cootie","franck","probed","bagged","smocks","batten","spared","chills","relics","meyers","grader","tromps","dimmer","pasted","pepper","capped","played","junket","easier","palmed","pander","vaguer","bulged","dissed","borges","raises","wallow","jigged","bogged","burped","neater","rammed","fibers","castor","skirts","cancer","tilled","spored","dander","denims","budges","trucks","sowers","yapped","cadges","wrists","hacker","graved","vipers","noshes","minted","lessor","cassia","wrecks","hidden","brando","honeys","chilli","ragged","breded","punier","stacey","sisses","jocked","croaks","dinned","walker","heston","flares","coined","cannot","chocks","leases","wander","balder","warmer","bawled","donnie","damson","header","chilly","models","simper","watery","milked","poises","combed","toilet","gallop","sonnet","loosed","yawned","splays","pauses","bother","graphs","shrews","scones","manuel","milers","hotels","bennie","flores","spells","grimed","tenses","staged","puffer","posies","motion","fudged","fainer","tatter","seraph","nansen","months","muppet","tamera","shaman","falser","becker","lisbon","clefts","weeper","mendel","girder","takers","torsos","forked","dances","stated","yelled","scants","frothy","rolled","yodels","listen","craned","brooms","suffer","easter","shills","craves","bleeps","belled","dished","bordon","zither","jacket","lammer","kirked","shaved","atoned","frumpy","nosier","vender","graced","clingy","chants","wrests","cursed","prunes","tarter","stripe","coffee","veiled","tweeds","shrine","spines","kegged","melvin","gasser","market","marten","peeped","sanger","somber","spider","netted","radium","slings","scarfs","mended","creels","shaves","payers","bunked","movers","beings","conked","cozies","benton","codger","prints","gusset","longed","burner","jambed","mullet","fogged","scores","carbon","sleeks","helped","waxier","gilded","harlot","winces","tenser","lowell","ramsey","kennan","booted","beaver","rested","shouts","hickey","looped","swings","wonted","dilled","defers","lolled","pupped","cruets","solved","romper","defter","chokes","kithed","garnet","bookie","stared","stares","latter","lazies","fanned","wagged","dunces","corked","cloned","prided","baxter","pusses","boomed","masses","warren","weaves","delves","handed","merton","lusher","hepper","gibber","sender","parsec","snares","masher","seamed","sweats","welles","gagged","curter","mother","beeped","vealed","shoved","slave
r","hacked","gutted","ranged","bashed","closer","storks","meshed","cortex","copper","severn","gripes","carlos","scares","crates","boiled","ginned","mouses","raided","greyed","verier","slopes","fenced","sniper","priced","flawed","buffed","spacey","favors","platen","miller","walled","cutter","skated","holier","beamed","waiter","drowns","clomps","quarks","bested","frisks","purged","scalds","marian","flower","howled","plover","bikers","trails","hagged","smirks","sitter","carmen","lanced","plants","nobler","yakked","thesis","lassen","margin","wagner","sifter","houses","screws","booker","dormer","meters","padded","loaded","cartel","sutton","willis","chatty","dunked","dreamt","dalton","fables","coveys","muller","shanty","adders","tailor","helper","liters","butted","maiman","hollie","gallon","xavier","shrank","mickey","rather","powers","keened","doused","kisses","flanks","dotted","phased","dumped","linger","kramer","spaced","soften","strife","rowers","hovers","crimes","crooks","carrel","braces","lander","shrove","skulks","banker","itches","dropsy","misted","pulped","cloche","fawned","states","teared","beeper","raider","groves","livery","aerier","keenan","severe","sabres","bogies","coated","harlow","tanked","mellow","cozier","shanks","spooky","blamed","tricks","sleets","punted","jumped","caxton","warped","halley","frisky","shines","skater","lumber","truces","sliced","gibbet","narked","chives","graves","gummed","holler","glazes","nieves","hushed","nought","prated","chored","cloudy","kidder","huston","straws","twined","gifted","rodney","haloes","france","wirier","mercia","rubbed","coaxed","sumner","snipes","nipper","leiden","madman","margie","footed","firmed","budded","froths","senior","hoover","tailed","glider","straps","stalks","billow","racked","javier","zoomed","shades","whores","braids","roused","sudden","dogies","fencer","snaked","flings","traded","gunner","snider","staten","levees","lathed","sailor","waited","muster","clothe","lulled","cargos","revved","sooths","flamed","borers","feller","bladed","oliver","collin","wusses","murder","parted","jailed","frayed","doored","cheeks","misled","belted","winter","merges","shaven","fudges","tabbed","forget","sloths","cachet","mealed","sassed","salter","haunts","ranger","rivets","deeded","reaped","damped","crated","youths","whacks","tamers","misery","seeped","eerier","tiller","busses","gloved","hushes","cronus","pruned","casket","direst","guilds","motley","spools","fevers","snores","greece","elides","waists","rattle","trader","juster","rashes","stoney","pipped","solder","sinner","prides","rugged","steers","gnarly","titter","cities","walter","stolen","steaks","hawker","weaned","jobbed","jacked","pikers","hipper","spoilt","beeves","craved","gotten","balked","sherry","looney","crisis","callie","swiped","fished","rooted","bopped","bowler","escher","chumps","jerrod","lefter","snooty","fillet","scales","comets","lisped","decked","clowns","horned","robber","bottle","reeled","crapes","banter","martel","dowels","brandt","sweeps","heeled","tabled","manors","danger","dionne","prayer","decker","millie","boated","damned","horses","globes","failed","lammed","nigher","joyner","sobers","chided","tipper","parcel","flakes","fugger","elated","hinder","hopper","crafts","wipers","badder","jessie","matted","wafted","pealed","cheats","elites","torres","bushed","sneaks","tidies","brings","stalls","payees","zonked","danker","poshes","smelts","stoops","warden","chicks","ramsay","budged","firmer","glazed","heated","slices","hovels","belied","shifts","pauper","tinges","weston","caste
d","titles","droves","roomer","modals","seamen","wearer","blonde","berlin","libbed","tensor","hokier","lambed","graped","headed","copped","eroses","fagged","filler","keener","stages","civets","spills","tithed","sullen","sucked","briton","whaler","hooded","tittle","bucket","furled","darned","planet","clucks","batted","dagger","brides","severs","pathos","grainy","relied","carpel","makers","lancet","slowed","messed","ravels","faster","gabbed","chance","grayed","santos","spends","chinos","saints","swirly","dories","wilson","milton","clangs","manual","nodded","signer","stript","etched","vaster","wastes","stored","minces","purred","marvin","pinned","skulls","heaved","wadded","fowled","hashed","mullen","relief","hatted","primed","chaffs","canned","lackey","showed","shandy","chases","maggie","deafen","bussed","differ","worked","marted","ducked","socked","fussed","greyer","herder","trusty","follow","samson","babies","whorls","stanks","manson","cranes","murrow","shrink","genius","holder","lenses","yucked","termed","ruined","junker","belies","joshed","cooled","basted","greeks","fuller","healer","carver","havens","drunks","sucker","lotion","glared","healed","pocked","rifles","weaved","canoed","punter","hinton","settle","boobed","hinted","scored","harder","status","sloven","hayden","golfed","scoots","bloods","slaked","jugged","louses","cassie","shaded","rushed","pitied","barked","honked","rasher","forced","shaver","vowels","holden","pelvis","blades","chests","preyer","floods","deanna","cation","mapper","falter","dabbed","mocker","nestle","shucks","heeded","ticker","binges","summer","slumps","lusted","scampi","crofts","gorges","pardon","torses","smokes","lashed","bailey","jabbed","calmer","preset","forbes","hasted","wormed","winged","minors","banner","grazed","hewers","kernel","jolted","sniped","clunky","ratios","blinds","ganges","misers","spikes","riders","hallow","grumpy","barren","summed","infers","places","jarred","killer","plaint","goofed","subbed","prudes","sipped","kookie","whines","droopy","palled","cherry","proves","mobbed","spaded","cheese","pluses","bathes","motels","spewed","soaked","howler","puffed","malled","shrike","slided","fulled","pouted","shames","lessen","ringed","teemed","grands","linked","wooten","feuded","deaden","scents","flutes","salton"]
timeb = time.perf_counter()
s = Solution()
print(s.ladderLengthLinkedListImproved(beginWord,endWord,wordList))
timee = time.perf_counter()
print(timee-timeb)
timeb = time.perf_counter()
print(s.ladderLengthPattern(beginWord,endWord,wordList))
timee = time.perf_counter()
print(timee-timeb)
# In[405]:
beginWord = "zings"
endWord = "brown"
wordList = ["chump","sours","mcgee","piers","match","folds","rinse","films","small","umbel","assad","morin","plied","basin","moots","blurb","suits","solve","sooty","fluky","bombs","nurse","ceres","lopes","yucky","ricks","goads","loses","coyly","marcy","bonds","niece","cures","sonic","crows","dicey","gaped","buggy","riles","homer","fakir","hello","riper","makes","laked","sinus","fangs","acton","spiky","salts","boots","skiff","maker","pence","fells","cedar","kited","raved","flake","jiffy","tanks","barns","sized","gluts","amman","jumps","cavil","quaff","rents","looms","toner","gibes","aside","drawn","karin","torte","haded","psych","hacks","jesus","fumed","lisle","spays","sumps","beats","tunas","naked","bathe","gulfs","karma","snuff","boast","grins","turds","plant","spicy","risen","tints","tomas","stand","noses","toxin","sheep","paddy","abase","jeeps","dated","tough","timid","forty","kusch","pones","smack","token","havel","vanes","repay","chums","paved","chimp","spinx","smirk","pupas","bares","mites","egged","palsy","gyros","wolfe","chips","pouts","johns","barbs","slunk","hires","seals","rally","tromp","roads","writs","aches","corny","fiats","hench","gilts","blake","phony","drams","skimp","suing","horus","hewer","barfs","hewed","needs","epsom","knots","tided","befit","eager","melva","coves","plush","pawed","zebra","gales","blots","foggy","rooks","comas","laxly","cries","kirks","monks","magic","fugue","apter","limos","congo","rosin","seder","bones","holes","fated","gamay","snags","wimpy","rites","gilds","slink","staph","sioux","bends","wilma","warts","reeds","yolks","lover","demon","salve","hulas","shard","worst","leach","omits","flint","tines","julio","trots","silly","cocks","gleam","react","camps","nicks","bored","coded","swine","scope","aloes","south","hands","rainy","david","newer","ferns","jelly","index","gibbs","truly","tubes","opera","raven","noyce","whims","titus","hared","vined","dealt","slats","erick","rolls","breed","udder","oozed","prays","tsars","harry","shelf","norms","larks","hazes","brice","gifts","units","veeps","dumas","mommy","spock","dotty","molls","slobs","diane","buddy","boost","ginny","rends","marks","timur","bands","genes","slews","leeds","karyn","mobil","mixes","ronny","sadly","rinks","smash","baled","pulpy","toils","yards","piing","dried","veils","spook","snaky","sizer","spout","percy","sheol","blank","waxes","herod","attar","doped","polls","banes","penny","knelt","laded","manic","acids","squat","jerry","stony","woofs","idles","bruin","carla","sheik","hodge","goody","merge","nicer","scums","evens","lames","wends","midge","jives","tuner","reins","boars","fryer","realm","dyson","narks","torts","yawed","waked","cress","curvy","bongs","fared","jilts","liens","ducat","shaft","pesos","dulls","donna","potty","winks","marsh","giddy","tiffs","scoot","nifty","daisy","slots","stacy","colby","skims","malls","sifts","jinns","flank","molar","hatch","wiped","taped","clink","brims","credo","fezes","molds","finds","quids","terra","damns","dusky","wanes","musty","barer","snare","honey","piked","wiser","elvin","dolly","fetal","ships","reign","cause","caved","mecca","blink","close","birth","pints","reefs","amado","comae","waite","willy","lorry","nixed","quire","napes","voted","eldon","nappy","myles","laser","pesky","leant","septa","mucks","agree","sworn","lofty","slush","holst","tevet","wases","cheer","torah","treks","purge","class","popes","roans","curve","quads","magma","drier","hales","chess","prigs","sivan","romes","finch","peels","mousy","atria","offer","coals","crash","tauts","oinks"
,"dazed","flaps","truck","treed","colas","petty","marty","cadet","clips","zones","wooed","haves","grays","gongs","minis","macaw","horde","witch","flows","heady","fuels","conks","lifts","tumid","husks","irony","pines","glops","fonds","covey","chino","riggs","tonya","slavs","caddy","poled","blent","mired","whose","scows","forte","hikes","riped","knobs","wroth","bagel","basks","nines","scams","males","holed","solid","farms","glaxo","poise","drays","ryder","slash","rajas","goons","bowed","shirt","blurs","fussy","rills","loren","helps","feels","fiefs","hines","balms","blobs","fiord","light","dinky","maids","sagas","joked","pyxed","lilly","leers","galls","malts","minos","ionic","lower","peale","ratty","tuber","newed","whirl","eases","wests","herds","clods","floes","skate","weeds","tones","rangy","kings","adder","pitts","smith","coats","lenny","sorta","floss","looks","angie","peppy","upper","darin","white","lofts","clint","jared","heros","ruler","tonia","sexed","grail","villa","topic","kenny","dopes","hoots","boobs","gerry","eries","lyres","lunch","glove","cumin","harms","races","today","crust","track","mends","snout","shark","iliad","shrew","dorky","monty","dodge","toted","worse","dream","weird","gaunt","damon","rimes","layer","salem","bards","dills","hobby","gives","shall","crazy","brace","faxed","pools","foamy","viral","strop","liver","ceded","jolts","jonah","tight","lilia","hussy","mutts","crate","girls","marge","hypos","mewls","bulls","gazes","wands","avior","sonya","slick","clump","cater","aural","agave","grief","shana","fices","moans","grape","fetid","jenna","humus","poesy","cooks","still","lease","wanda","oddly","areas","frats","imply","files","ramon","seuss","hubby","wakes","rural","nodal","doric","carry","chefs","fails","klaus","shine","filly","yawls","brows","cabby","favor","styli","filed","jinni","ferry","balls","lakes","voled","drone","lusty","tansy","among","trail","liven","slake","madge","steps","donne","sties","picks","lacks","jumpy","meade","bogie","bauer","scene","lubes","brigs","label","fines","grebe","limns","mouse","ensue","swags","bunch","kayla","micky","sneak","bulbs","camus","yours","aisha","dunne","volta","cores","dweeb","libby","flees","shops","bided","satan","socks","draws","golfs","taunt","genus","belts","orbit","taxis","hinds","fakes","chart","wings","words","digit","copse","deena","perry","sanes","huffy","chung","lucks","fills","selma","wafts","pecks","trite","combs","sooth","weary","salty","brews","kooky","robby","loans","props","huang","marry","swabs","tinny","mince","japed","ellis","lowed","newly","loath","drown","loved","joker","lints","kinky","skits","feats","hiker","doles","every","dolby","stirs","lobed","fusty","cozen","vader","byron","dozes","slows","bethe","ploys","misty","binds","bumpy","spurs","wolfs","ernie","nails","prows","seeds","visas","dowse","pores","jocks","cower","hoofs","mined","marat","gorge","souse","clack","liter","jewel","hates","boats","stark","blabs","murks","woken","stomp","peeks","perky","pasta","goats","hocks","kinks","gushy","outdo","gelds","foxes","fives","sybil","upton","taine","helga","mauls","gills","grows","bauds","aloft","cline","payer","pinch","thorn","slits","thumb","biked","cowls","grams","disks","belly","randy","hunts","prize","minty","river","chevy","gages","cysts","years","scoff","becky","inert","abler","bevel","dyers","tonne","glows","ocean","spits","bowen","tings","baths","goals","whiny","merry","fares","leila","cairo","honor","verge","teary","pimps","sarah","meets","tamed","bumps","alias","pings","wears","dante","snore","r
uled","savor","gapes","loony","chaps","froth","fancy","herbs","cutes","crowd","ghana","teddy","abate","scalp","mules","patsy","minks","shuck","billy","helen","stain","moles","jodie","homed","stack","niger","denny","kinds","elves","waled","rover","medan","churn","whizz","green","reach","lajos","mates","ditch","grads","start","press","rimed","hells","vised","slums","notes","canes","taper","camry","weans","sinks","arise","crown","prier","ramps","wotan","chars","mussy","rodes","sonar","cheri","sired","snell","basel","eider","sades","times","ovule","gusto","myrna","gabby","dully","spake","beast","towns","allay","gaged","smell","skids","clone","slack","pooch","vulva","arson","blown","kongo","maize","thick","brags","spore","soles","trial","snort","price","bowel","stoke","pents","hutch","flack","arced","cubic","hiram","tongs","lades","coons","finer","games","unpin","vests","slabs","santa","tamer","asian","tease","miked","lodes","vents","leafy","stats","shuts","bully","edith","bloch","corps","bloom","doses","coins","skips","gains","hided","coops","ninja","pills","raves","hanks","seres","ewing","bests","wrath","burgs","thrum","cabin","daren","imams","junks","brood","bacon","creel","gazed","teats","halos","gypsy","ether","train","tiles","bulks","bolls","added","roger","sites","balmy","tilts","swoop","jules","bawdy","mango","stoop","girts","costs","lemur","yucks","swazi","okays","piped","ticks","tomes","filch","depth","meals","coots","bites","pansy","spelt","leeks","hills","drops","verde","japes","holds","bangs","maxed","plume","frets","lymph","modes","twits","devon","cawed","putty","sowed","likes","quips","board","loxed","slags","dilly","refit","saved","takes","meter","prove","spacy","poach","cilia","pears","lists","gated","verdi","shave","notch","culls","shams","weedy","gaols","hoops","kraft","burro","roles","rummy","click","plots","mitty","yanks","drool","papal","rearm","prose","fucks","berra","salas","tents","flues","loves","poker","parry","polyp","agent","flown","walls","studs","troll","baron","earle","panda","wiley","raged","sexes","berne","vista","rojas","cones","byway","vases","wines","forth","freya","gully","fires","sails","dusts","terse","booed","stung","basic","saver","basis","hmong","brawn","pured","locks","downs","punts","rhine","metes","title","shims","bents","blows","harte","boyle","peach","posts","olson","might","flier","rubes","lingo","tarts","nexus","woman","mains","finis","mikes","pleas","trams","shawl","gunny","sleds","ruder","aries","usher","refed","toady","caper","tries","gimpy","doors","thieu","deere","mucky","rests","mares","cards","bouts","dines","rants","giles","flunk","enact","derek","dover","conan","mooed","fiver","kaput","enrol","payed","feint","miner","shyer","whelk","perch","furor","hayes","tammy","caves","maims","cairn","tract","legal","adler","veldt","basal","spiny","surer","bolds","grove","heaps","noway","pokes","tubed","beaks","loots","drawl","jones","typed","funny","cells","beaus","bayed","rears","seats","hazed","flubs","maura","goths","rumba","morse","fumes","slide","snoot","music","sully","perth","pocks","mills","lopez","sacks","stine","gawks","gavel","rains","wound","hares","guild","leger","foxed","craws","rinds","faced","groom","lully","boded","lends","serge","sword","faked","envoy","stick","tumor","riser","bolts","trued","gasps","thoth","veers","verbs","boles","lunar","taxes","vexes","pucks","welsh","pelts","shift","booth","smote","spied","gnawn","crete","dough","tasha","timed","wired","state","hears","lauds","wills","dummy","basil","belie","calls","crams","matt
s","gybes","limed","snots","moder","faces","sibyl","spare","crops","drips","frown","doggy","pearl","reese","curls","earns","poles","tiara","risks","lethe","titan","tucks","trace","vises","prick","sears","ogled","preps","livid","kicky","candy","weeps","tapes","cokes","foods","wards","coifs","shirk","elsie","ketch","trunk","goofs","kodak","toyed","lance","whale","soups","roars","poxed","tombs","noons","hindi","basie","hoffa","bayou","tests","roots","shove","hoses","doled","tempt","kilos","velma","avers","dorks","comic","fanny","poops","sicks","leary","merer","finks","garbo","cains","mimed","sates","celli","flats","grown","broth","augur","chaos","sangs","chide","barks","guide","mewed","synch","rings","scrap","zings","howls","duded","noemi","geeks","nexis","comte","helot","whams","brand","hogan","moira","trips","loges","baits","winds","marla","never","louis","anted","helix","morns","heeds","crags","rowdy","becks","venue","diary","stoat","feeds","kiths","riled","drags","lucia","deeps","sends","fonts","swing","fence","stout","trice","taker","drugs","babel","plows","pends","sloes","gents","brawl","arabs","leaps","flied","fulls","meats","megan","burch","oscar","evict","betsy","lasts","ethos","mavis","petal","fever","alone","snips","assay","rocks","talon","grass","clive","discs","wrapt","calfs","razed","learn","bruce","midst","swear","merck","meyer","funks","lobby","fears","decay","sedge","alien","reaps","koran","range","enter","lepke","honed","gallo","staid","joist","lines","paler","fined","sorts","piper","highs","busch","dario","north","ashed","sands","songs","rakes","garza","pinks","rival","leann","allow","golds","hilts","berry","hicks","idler","weiss","cider","desks","skies","hulls","warns","datum","brown","leapt","dregs","dozed","stump","reply","finny","clues","diode","dicks","rabid","moors","limbs","gulls","scary","dungs","liege","vicky","nigel","peeps","dolls","blame","sings","wants","fuzes","proud","bungs","seams","bingo","buffs","shire","decks","hosed","scots","pumas","jazzy","books","ellie","hayed","snowy","twill","links","coped","spats","reyes","piles","hovel","reads","wryer","patty","sling","oneal","waves","gorse","ofter","teams","strep","mores","daily","spoil","limes","foots","dells","hakes","danny","furls","flaws","tarot","dusty","potts","tells","pager","claps","serra","josie","award","pewee","snack","lobes","damps","tanya","lures","mushy","hertz","caret","marco","parks","pithy","synge","spoon","troth","drama","bleak","lidia","banns","forms","iambs","crick","patel","mercy","waded"]
s = Solution()
timeb = time.perf_counter()
print(s.ladderLengthList(beginWord,endWord,wordList))
timee = time.perf_counter()
print(timee-timeb)
timeb = time.perf_counter()
print(s.ladderLengthLinkedList(beginWord,endWord,wordList))
timee = time.perf_counter()
print(timee-timeb)
timeb = time.perf_counter()
print(s.ladderLengthLinkedListImproved(beginWord,endWord,wordList))
timee = time.perf_counter()
print(timee-timeb)
timeb = time.perf_counter()
print(s.ladderLengthPattern(beginWord,endWord,wordList))
timee = time.perf_counter()
print(timee-timeb)
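# The four benchmark blocks above repeat the same perf_counter pattern around each
# ladder-length variant. A minimal helper sketch (added here for illustration only,
# not part of the original notebook) that times a single call:
def time_call(func, *args):
    # Return (result, elapsed_seconds) for one invocation of func.
    start = time.perf_counter()
    result = func(*args)
    return result, time.perf_counter() - start
# Hypothetical usage: res, dt = time_call(s.ladderLengthPattern, beginWord, endWord, wordList)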
| 176.485981 | 23,619 | 0.640847 | 8,652 | 75,536 | 5.587957 | 0.518955 | 0.003309 | 0.006205 | 0.00786 | 0.778187 | 0.771423 | 0.768755 | 0.764577 | 0.762839 | 0.761702 | 0 | 0.001557 | 0.064883 | 75,536 | 427 | 23,620 | 176.899297 | 0.682905 | 0.040841 | 0 | 0.610169 | 1 | 0 | 0.550604 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.084746 | false | 0.00678 | 0.00339 | 0.027119 | 0.2 | 0.084746 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
a937664272f2bf1ff5772b2912c46c4f18e78b7a | 32,493 | py | Python | Exams/2020-2021/Exam_1/Exercise_3_tests.py | Szymon-Budziak/ASD_exercises_solutions | 36ccbdae03a6c7e4ad141a2b7b01bef9353574ee | [
"MIT"
] | 7 | 2021-12-28T23:38:42.000Z | 2022-03-29T16:36:16.000Z | Exams/2020-2021/Exam_1/Exercise_3_tests.py | Szymon-Budziak/ASD_exercises_solutions | 36ccbdae03a6c7e4ad141a2b7b01bef9353574ee | [
"MIT"
] | null | null | null | Exams/2020-2021/Exam_1/Exercise_3_tests.py | Szymon-Budziak/ASD_exercises_solutions | 36ccbdae03a6c7e4ad141a2b7b01bef9353574ee | [
"MIT"
] | 4 | 2021-06-29T20:21:52.000Z | 2022-03-12T10:04:17.000Z | import random
def gen_test():
n = 1000
k = 100
max_end = 1000
A = []
R = random.Random(0)
for _ in range(n):
a = R.randint(0, max_end)
b = R.randint(0, max_end)
if a == b:
A.append((a, a + 1))
else:
a, b = min(a, b), max(a, b)
A.append((a, b))
x = {
'A': A,
'k': k,
'res': 668,
'sol': [10, 18, 27, 44, 45, 48, 74, 75, 77, 85]
}
print(x)
exit(0)
return x
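# Note (inferred from runtests below, added for readability): each test case provides
# 'A' - a list of (start, end) intervals, 'k' - how many intervals to pick,
# 'res' - the expected maximal length of the common intersection of the picked
# intervals, and 'sol' - one example list of indices achieving that length.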
tests = [
{
'A': [(0, 4), (1, 10), (6, 7), (2, 8)],
'k': 3,
'res': 2,
'sol': [0, 1, 3]
}, {
'A': [(x, x + 2) for x in range(50)],
'k': 2,
'res': 1,
'sol': [0, 1]
}, {
'A': [(x, x + 10) for x in range(150)],
'k': 10,
'res': 1,
'sol': [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
}, {
'A': [(100 - 5 * x, 100 + 5 * x) for x in range(15)],
'k': 14,
'res': 10,
'sol': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
},
{
# All start at point 0
'A': [(0, 1), (0, 2), (0, 3), (0, 4), (0, 5)],
'k': 3,
'res': 3,
'sol': [2, 3, 4]
},
{
# k = 1
'A': [(0, 1), (0, 2), (0, 3), (0, 4), (0, 5)],
'k': 1,
'res': 5,
'sol': [4]
},
{
# Smaller intervals are fully contained into longer ones
'A': [(10, 11), (9, 12), (8, 13), (7, 14), (6, 15)],
'k': 3,
'res': 5,
'sol': [2, 3, 4]
},
{'A': [(394, 864), (776, 911), (41, 430), (265, 988), (497, 523), (414, 940), (802, 849), (310, 991), (366, 488),
(597, 913), (223, 929), (142, 516),
(143, 288), (97, 773), (633, 818), (256, 931), (545, 722), (616, 829), (150, 923), (101, 317), (75, 747),
(870, 920), (338, 700), (483, 573),
(103, 362), (323, 444), (625, 655), (209, 934), (565, 989), (453, 488), (533, 886), (63, 266), (824, 940),
(561, 937), (14, 95), (736, 860),
(408, 727), (803, 844), (640, 684), (1, 626), (505, 847), (341, 888), (249, 747), (333, 720), (64, 891),
(195, 939), (227, 581), (244, 822),
(145, 990), (556, 822), (93, 458), (82, 327), (520, 896), (501, 955), (111, 308), (298, 564), (127, 723),
(340, 560), (834, 944), (208, 553),
(818, 986), (560, 617), (294, 601), (93, 455), (610, 817), (324, 394), (247, 589), (188, 297), (193, 841),
(33, 191), (627, 672), (266, 487),
(70, 91), (695, 775), (133, 897), (153, 945), (39, 862), (82, 919), (716, 945), (553, 849), (400, 699),
(722, 857), (282, 537), (534, 831),
(241, 869), (220, 916), (603, 695), (845, 972), (429, 593), (281, 461), (504, 676), (656, 717), (812, 938),
(84, 365), (332, 627), (118, 498),
(601, 645), (343, 865), (194, 248), (16, 749)],
'k': 10,
'res': 668,
'sol': [10, 18, 27, 44, 45, 48, 74, 75, 77, 85]
},
{'A': [(394, 864), (776, 911), (41, 430), (265, 988), (497, 523), (414, 940), (802, 849), (310, 991), (366, 488),
(597, 913), (223, 929), (142, 516),
(143, 288), (97, 773), (633, 818), (256, 931), (545, 722), (616, 829), (150, 923), (101, 317), (75, 747),
(870, 920), (338, 700), (483, 573),
(103, 362), (323, 444), (625, 655), (209, 934), (565, 989), (453, 488), (533, 886), (63, 266), (824, 940),
(561, 937), (14, 95), (736, 860),
(408, 727), (803, 844), (640, 684), (1, 626), (505, 847), (341, 888), (249, 747), (333, 720), (64, 891),
(195, 939), (227, 581), (244, 822),
(145, 990), (556, 822), (93, 458), (82, 327), (520, 896), (501, 955), (111, 308), (298, 564), (127, 723),
(340, 560), (834, 944), (208, 553),
(818, 986), (560, 617), (294, 601), (93, 455), (610, 817), (324, 394), (247, 589), (188, 297), (193, 841),
(33, 191), (627, 672), (266, 487),
(70, 91), (695, 775), (133, 897), (153, 945), (39, 862), (82, 919), (716, 945), (553, 849), (400, 699),
(722, 857), (282, 537), (534, 831),
(241, 869), (220, 916), (603, 695), (845, 972), (429, 593), (281, 461), (504, 676), (656, 717), (812, 938),
(84, 365), (332, 627), (118, 498),
(601, 645), (343, 865), (194, 248), (16, 749), (119, 277), (225, 722), (380, 813), (174, 340), (436, 835),
(63, 103), (149, 801), (714, 875),
(46, 224), (587, 836), (649, 931), (547, 958), (616, 696), (27, 75), (127, 650), (193, 620), (589, 850),
(122, 400), (93, 379), (118, 853),
(37, 620), (22, 199), (984, 993), (189, 735), (126, 490), (215, 744), (62, 819), (695, 959), (23, 557),
(435, 635), (103, 855), (71, 266), (73, 226),
(308, 662), (358, 446), (62, 184), (478, 515), (40, 610), (103, 716), (204, 400), (266, 367), (749, 926),
(481, 858), (923, 940), (173, 583),
(688, 714), (208, 989), (59, 785), (692, 807), (162, 865), (165, 350), (256, 542), (120, 611), (452, 943),
(179, 681), (13, 482), (419, 697),
(582, 921), (520, 895), (318, 939), (365, 664), (397, 857), (256, 673), (157, 574), (12, 707), (468, 759),
(80, 343), (46, 756), (287, 557),
(138, 245), (780, 976), (360, 493), (294, 624), (367, 689), (604, 969), (648, 913), (635, 874), (135, 732),
(317, 397), (424, 766), (666, 848),
(1, 82), (196, 608), (342, 715), (163, 245), (228, 652), (387, 458), (727, 896), (581, 689), (424, 895),
(32, 411), (718, 892), (428, 581),
(678, 790), (47, 726), (169, 456), (65, 265), (161, 718), (457, 540), (498, 906), (574, 929), (618, 773),
(0, 905), (39, 506), (319, 333),
(478, 857), (51, 828), (842, 896), (831, 997), (192, 425), (561, 986), (85, 648), (742, 857), (15, 133),
(411, 972), (427, 694), (3, 323), (14, 218),
(734, 772), (2, 842), (541, 691), (100, 626), (121, 195), (622, 664), (203, 894), (286, 309), (186, 705),
(102, 487), (874, 944), (406, 642),
(22, 83), (281, 935), (463, 819), (118, 811), (262, 882), (136, 669), (533, 836), (660, 666), (117, 355),
(158, 892), (285, 871), (19, 43),
(41, 210), (265, 697), (322, 571), (375, 969), (581, 960), (869, 931), (43, 866), (767, 984), (622, 718),
(506, 671), (659, 729), (469, 924),
(445, 655), (381, 892), (182, 550), (212, 384), (298, 601), (9, 141), (154, 277), (341, 345), (376, 808),
(95, 735), (346, 798), (36, 635),
(42, 276), (153, 167), (296, 597), (369, 404), (132, 561), (117, 300), (489, 748), (245, 956), (49, 315),
(183, 877), (535, 746), (72, 309),
(412, 855), (306, 336), (111, 424), (101, 574), (492, 930), (345, 485), (817, 861), (831, 999), (127, 351),
(118, 490), (509, 716), (38, 436),
(309, 343), (703, 752), (159, 915), (170, 941), (578, 641), (384, 825), (654, 997), (67, 89), (86, 827),
(202, 767), (62, 226), (8, 394), (100, 403),
(531, 569), (296, 459), (500, 942), (598, 807), (695, 731), (222, 433), (85, 377), (225, 267), (599, 795),
(170, 441), (196, 367), (65, 117),
(841, 884), (718, 873), (28, 924), (462, 538), (693, 770), (121, 206), (407, 509), (212, 262), (43, 656),
(816, 970), (221, 638), (107, 149),
(202, 469), (370, 387), (559, 846), (107, 154), (499, 610), (151, 577), (415, 653), (433, 696), (533, 898),
(507, 695), (909, 939), (330, 853),
(510, 511), (650, 686), (206, 895), (555, 624), (224, 953), (9, 348), (722, 985), (764, 920), (325, 837),
(36, 329), (151, 537), (263, 895),
(617, 802), (159, 862), (388, 596), (301, 735), (723, 826), (67, 481), (86, 819), (528, 889), (40, 937),
(67, 230), (41, 133), (15, 307), (777, 864),
(338, 459), (164, 882), (152, 819), (671, 889), (471, 991), (380, 517), (391, 922), (514, 542), (34, 587),
(92, 694), (813, 824), (530, 776),
(78, 614), (436, 764), (772, 927), (211, 296), (548, 922), (427, 612), (845, 995), (493, 865), (810, 995),
(397, 622), (239, 600), (871, 885),
(20, 817), (672, 906), (0, 758), (186, 309), (519, 583), (260, 340), (67, 505), (268, 880), (844, 965),
(310, 791), (393, 417), (392, 829),
(63, 167), (656, 957), (130, 244), (293, 746), (342, 849), (56, 964), (36, 492), (144, 427), (503, 911),
(616, 884), (83, 734), (689, 715),
(155, 829), (361, 421), (36, 626), (395, 477), (48, 469), (103, 482), (155, 796), (20, 33), (612, 632),
(135, 645), (107, 331), (562, 716),
(354, 664), (199, 392), (795, 802), (502, 796), (113, 902), (61, 624), (478, 717), (629, 647), (345, 956),
(127, 666), (698, 992), (636, 730),
(303, 807), (130, 869), (933, 981), (396, 818), (300, 938), (763, 893), (697, 980), (124, 829), (531, 881),
(193, 804), (39, 800), (401, 455),
(380, 774), (195, 466), (365, 808), (77, 647), (45, 979), (923, 956), (40, 497), (261, 922), (27, 967),
(532, 682), (582, 585), (221, 896),
(95, 235), (794, 839), (905, 910), (642, 798), (514, 715), (430, 536), (312, 519), (116, 968), (149, 436),
(579, 913), (432, 945), (86, 958),
(107, 425), (64, 101), (425, 792), (159, 751), (31, 977), (457, 810), (441, 702), (30, 427), (508, 941),
(884, 985), (332, 739), (80, 258),
(72, 360), (124, 367), (30, 708), (353, 356), (10, 182), (850, 997), (236, 838), (72, 374), (610, 914),
(146, 212), (3, 209), (674, 689), (749, 960),
(126, 922), (7, 765), (300, 377), (25, 706), (619, 955), (238, 879), (145, 191), (115, 464), (352, 488),
(724, 982), (133, 264), (28, 989),
(213, 370), (343, 484), (299, 983), (303, 959), (899, 981), (566, 651), (188, 334), (82, 607), (105, 546),
(315, 594), (160, 385), (150, 919),
(128, 968), (228, 823), (323, 520), (242, 248), (188, 772), (298, 381), (429, 679), (47, 881), (135, 615),
(21, 403), (79, 719), (74, 135),
(306, 430), (426, 563), (758, 948), (145, 605), (305, 432), (363, 652), (86, 254), (455, 647), (378, 652),
(541, 971), (59, 385), (8, 418),
(427, 985), (745, 922), (328, 451), (208, 380), (300, 975), (93, 482), (189, 973), (111, 815), (114, 283),
(571, 620), (157, 704), (719, 814),
(456, 950), (189, 408), (431, 786), (178, 442), (253, 982), (348, 464), (535, 959), (145, 363), (473, 646),
(88, 652), (494, 773), (208, 301),
(1, 850), (459, 715), (473, 633), (7, 223), (117, 305), (644, 787), (308, 558), (159, 623), (434, 723),
(482, 769), (94, 694), (509, 778),
(237, 983), (556, 780), (286, 415), (22, 647), (123, 276), (684, 904), (0, 41), (262, 407), (538, 911),
(595, 727), (405, 455), (104, 764),
(258, 362), (290, 892), (688, 773), (200, 930), (87, 609), (36, 72), (268, 811), (312, 546), (121, 348),
(542, 880), (255, 912), (780, 942),
(69, 167), (424, 881), (289, 296), (137, 532), (535, 587), (215, 642), (107, 544), (420, 978), (556, 649),
(413, 759), (797, 925), (285, 807),
(299, 452), (380, 581), (141, 643), (126, 160), (123, 713), (390, 410), (479, 605), (142, 573), (306, 684),
(362, 647), (484, 760), (223, 425),
(488, 500), (513, 711), (325, 504), (667, 981), (61, 454), (146, 307), (507, 763), (53, 908), (220, 636),
(26, 363), (400, 482), (10, 909),
(539, 866), (68, 703), (83, 887), (702, 972), (759, 946), (404, 685), (6, 369), (42, 118), (3, 635),
(276, 894), (655, 716), (299, 744), (232, 922),
(144, 769), (294, 586), (107, 195), (444, 471), (338, 733), (172, 393), (338, 431), (663, 918), (445, 703),
(151, 458), (725, 955), (151, 536),
(132, 323), (213, 932), (191, 454), (357, 808), (398, 437), (503, 826), (398, 747), (225, 814), (200, 449),
(209, 962), (600, 727), (50, 927),
(34, 397), (239, 648), (86, 891), (191, 372), (58, 760), (653, 693), (177, 238), (304, 625), (88, 627),
(721, 889), (524, 769), (291, 789),
(898, 904), (361, 421), (55, 469), (647, 713), (528, 681), (664, 979), (560, 977), (752, 952), (440, 956),
(465, 594), (260, 501), (487, 721),
(220, 345), (43, 272), (44, 53), (166, 358), (3, 296), (7, 670), (65, 143), (438, 805), (227, 696),
(623, 993), (406, 571), (226, 943), (197, 464),
(347, 622), (104, 620), (87, 904), (326, 813), (330, 548), (466, 914), (261, 332), (29, 534), (45, 194),
(82, 377), (214, 890), (354, 537),
(192, 857), (206, 257), (688, 746), (308, 753), (319, 529), (393, 880), (260, 493), (352, 892), (245, 729),
(45, 313), (565, 956), (9, 74),
(471, 507), (448, 741), (48, 939), (422, 828), (471, 505), (120, 450), (83, 87), (101, 246), (783, 846),
(157, 423), (905, 941), (218, 451),
(78, 627), (437, 837), (572, 772), (849, 907), (40, 403), (184, 981), (255, 501), (131, 225), (860, 891),
(285, 956), (327, 360), (109, 445),
(570, 921), (292, 624), (554, 807), (206, 728), (303, 796), (452, 526), (473, 619), (549, 649), (267, 279),
(16, 237), (121, 629), (728, 802),
(101, 176), (424, 750), (223, 254), (291, 900), (675, 753), (6, 759), (527, 548), (438, 879), (50, 124),
(393, 660), (121, 279), (754, 994),
(367, 578), (235, 691), (720, 733), (559, 676), (226, 288), (757, 851), (245, 923), (66, 530), (314, 690),
(239, 335), (382, 643), (293, 491),
(175, 596), (140, 829), (15, 566), (335, 516), (375, 599), (25, 650), (132, 831), (405, 897), (158, 999),
(181, 522), (78, 138), (211, 783),
(800, 937), (508, 793), (583, 785), (712, 992), (218, 240), (135, 750), (239, 835), (393, 778), (361, 623),
(135, 605), (510, 644), (922, 939),
(110, 630), (26, 853), (539, 610), (367, 500), (316, 466), (12, 981), (225, 568), (166, 668), (676, 900),
(506, 828), (755, 976), (492, 559),
(321, 721), (80, 868), (140, 265), (411, 618), (195, 720), (324, 841), (298, 804), (393, 966), (60, 213),
(39, 322), (743, 765), (255, 984),
(351, 883), (451, 684), (675, 743), (231, 673), (266, 352), (166, 687), (17, 312), (364, 587), (59, 552),
(645, 749), (154, 361), (22, 502),
(62, 642), (25, 247), (12, 46), (231, 968), (334, 669), (68, 840), (63, 353), (679, 995), (139, 432),
(913, 944), (221, 459), (145, 445), (319, 366),
(181, 664), (336, 745), (765, 804), (391, 418), (9, 419), (270, 912), (544, 546), (753, 823), (703, 721),
(473, 777), (42, 578), (125, 943),
(418, 976), (175, 399), (3, 512), (141, 636), (677, 889), (526, 860), (716, 750), (81, 151), (243, 337),
(848, 860), (180, 840), (252, 968),
(22, 979), (825, 917), (172, 760), (806, 858), (574, 700), (172, 735), (80, 437), (885, 949), (106, 612),
(635, 643), (468, 727), (153, 629),
(41, 616), (258, 348), (755, 825), (385, 750), (27, 640), (910, 954), (37, 508), (91, 366), (299, 687),
(154, 469), (241, 519), (166, 364),
(753, 771), (345, 414), (276, 822), (504, 967), (15, 402), (318, 543), (295, 922), (480, 563), (35, 1000),
(544, 791), (565, 584), (268, 956),
(39, 703), (404, 466), (122, 738), (354, 413), (52, 507), (21, 279), (35, 757), (260, 697), (595, 696),
(719, 794), (296, 951), (702, 780),
(212, 780), (528, 541), (348, 395), (854, 995), (213, 256), (118, 579), (337, 824), (965, 991), (248, 946),
(600, 690), (545, 737), (697, 896),
(166, 361), (888, 938), (156, 921), (338, 866), (765, 839), (9, 598), (52, 835), (159, 578), (352, 975),
(298, 371), (300, 640), (330, 506),
(413, 801), (441, 615), (1, 174), (144, 807), (44, 582), (128, 452), (349, 951), (9, 971), (491, 738),
(931, 969), (680, 838), (678, 797),
(263, 765), (192, 626), (71, 562), (434, 958), (285, 972), (178, 850), (173, 542), (64, 674), (161, 654),
(112, 593), (516, 964), (647, 815),
(556, 619), (393, 772), (272, 445), (291, 318), (13, 438), (797, 839), (287, 735), (264, 968)],
'k': 10,
'res': 889,
'sol': [359, 452, 456, 476, 508, 741, 828, 896, 928, 975]
},
{'A': [(394, 864), (776, 911), (41, 430), (265, 988), (497, 523), (414, 940), (802, 849), (310, 991), (366, 488),
(597, 913), (223, 929), (142, 516),
(143, 288), (97, 773), (633, 818), (256, 931), (545, 722), (616, 829), (150, 923), (101, 317), (75, 747),
(870, 920), (338, 700), (483, 573),
(103, 362), (323, 444), (625, 655), (209, 934), (565, 989), (453, 488), (533, 886), (63, 266), (824, 940),
(561, 937), (14, 95), (736, 860),
(408, 727), (803, 844), (640, 684), (1, 626), (505, 847), (341, 888), (249, 747), (333, 720), (64, 891),
(195, 939), (227, 581), (244, 822),
(145, 990), (556, 822), (93, 458), (82, 327), (520, 896), (501, 955), (111, 308), (298, 564), (127, 723),
(340, 560), (834, 944), (208, 553),
(818, 986), (560, 617), (294, 601), (93, 455), (610, 817), (324, 394), (247, 589), (188, 297), (193, 841),
(33, 191), (627, 672), (266, 487),
(70, 91), (695, 775), (133, 897), (153, 945), (39, 862), (82, 919), (716, 945), (553, 849), (400, 699),
(722, 857), (282, 537), (534, 831),
(241, 869), (220, 916), (603, 695), (845, 972), (429, 593), (281, 461), (504, 676), (656, 717), (812, 938),
(84, 365), (332, 627), (118, 498),
(601, 645), (343, 865), (194, 248), (16, 749), (119, 277), (225, 722), (380, 813), (174, 340), (436, 835),
(63, 103), (149, 801), (714, 875),
(46, 224), (587, 836), (649, 931), (547, 958), (616, 696), (27, 75), (127, 650), (193, 620), (589, 850),
(122, 400), (93, 379), (118, 853),
(37, 620), (22, 199), (984, 993), (189, 735), (126, 490), (215, 744), (62, 819), (695, 959), (23, 557),
(435, 635), (103, 855), (71, 266), (73, 226),
(308, 662), (358, 446), (62, 184), (478, 515), (40, 610), (103, 716), (204, 400), (266, 367), (749, 926),
(481, 858), (923, 940), (173, 583),
(688, 714), (208, 989), (59, 785), (692, 807), (162, 865), (165, 350), (256, 542), (120, 611), (452, 943),
(179, 681), (13, 482), (419, 697),
(582, 921), (520, 895), (318, 939), (365, 664), (397, 857), (256, 673), (157, 574), (12, 707), (468, 759),
(80, 343), (46, 756), (287, 557),
(138, 245), (780, 976), (360, 493), (294, 624), (367, 689), (604, 969), (648, 913), (635, 874), (135, 732),
(317, 397), (424, 766), (666, 848),
(1, 82), (196, 608), (342, 715), (163, 245), (228, 652), (387, 458), (727, 896), (581, 689), (424, 895),
(32, 411), (718, 892), (428, 581),
(678, 790), (47, 726), (169, 456), (65, 265), (161, 718), (457, 540), (498, 906), (574, 929), (618, 773),
(0, 905), (39, 506), (319, 333),
(478, 857), (51, 828), (842, 896), (831, 997), (192, 425), (561, 986), (85, 648), (742, 857), (15, 133),
(411, 972), (427, 694), (3, 323), (14, 218),
(734, 772), (2, 842), (541, 691), (100, 626), (121, 195), (622, 664), (203, 894), (286, 309), (186, 705),
(102, 487), (874, 944), (406, 642),
(22, 83), (281, 935), (463, 819), (118, 811), (262, 882), (136, 669), (533, 836), (660, 666), (117, 355),
(158, 892), (285, 871), (19, 43),
(41, 210), (265, 697), (322, 571), (375, 969), (581, 960), (869, 931), (43, 866), (767, 984), (622, 718),
(506, 671), (659, 729), (469, 924),
(445, 655), (381, 892), (182, 550), (212, 384), (298, 601), (9, 141), (154, 277), (341, 345), (376, 808),
(95, 735), (346, 798), (36, 635),
(42, 276), (153, 167), (296, 597), (369, 404), (132, 561), (117, 300), (489, 748), (245, 956), (49, 315),
(183, 877), (535, 746), (72, 309),
(412, 855), (306, 336), (111, 424), (101, 574), (492, 930), (345, 485), (817, 861), (831, 999), (127, 351),
(118, 490), (509, 716), (38, 436),
(309, 343), (703, 752), (159, 915), (170, 941), (578, 641), (384, 825), (654, 997), (67, 89), (86, 827),
(202, 767), (62, 226), (8, 394), (100, 403),
(531, 569), (296, 459), (500, 942), (598, 807), (695, 731), (222, 433), (85, 377), (225, 267), (599, 795),
(170, 441), (196, 367), (65, 117),
(841, 884), (718, 873), (28, 924), (462, 538), (693, 770), (121, 206), (407, 509), (212, 262), (43, 656),
(816, 970), (221, 638), (107, 149),
(202, 469), (370, 387), (559, 846), (107, 154), (499, 610), (151, 577), (415, 653), (433, 696), (533, 898),
(507, 695), (909, 939), (330, 853),
(510, 511), (650, 686), (206, 895), (555, 624), (224, 953), (9, 348), (722, 985), (764, 920), (325, 837),
(36, 329), (151, 537), (263, 895),
(617, 802), (159, 862), (388, 596), (301, 735), (723, 826), (67, 481), (86, 819), (528, 889), (40, 937),
(67, 230), (41, 133), (15, 307), (777, 864),
(338, 459), (164, 882), (152, 819), (671, 889), (471, 991), (380, 517), (391, 922), (514, 542), (34, 587),
(92, 694), (813, 824), (530, 776),
(78, 614), (436, 764), (772, 927), (211, 296), (548, 922), (427, 612), (845, 995), (493, 865), (810, 995),
(397, 622), (239, 600), (871, 885),
(20, 817), (672, 906), (0, 758), (186, 309), (519, 583), (260, 340), (67, 505), (268, 880), (844, 965),
(310, 791), (393, 417), (392, 829),
(63, 167), (656, 957), (130, 244), (293, 746), (342, 849), (56, 964), (36, 492), (144, 427), (503, 911),
(616, 884), (83, 734), (689, 715),
(155, 829), (361, 421), (36, 626), (395, 477), (48, 469), (103, 482), (155, 796), (20, 33), (612, 632),
(135, 645), (107, 331), (562, 716),
(354, 664), (199, 392), (795, 802), (502, 796), (113, 902), (61, 624), (478, 717), (629, 647), (345, 956),
(127, 666), (698, 992), (636, 730),
(303, 807), (130, 869), (933, 981), (396, 818), (300, 938), (763, 893), (697, 980), (124, 829), (531, 881),
(193, 804), (39, 800), (401, 455),
(380, 774), (195, 466), (365, 808), (77, 647), (45, 979), (923, 956), (40, 497), (261, 922), (27, 967),
(532, 682), (582, 585), (221, 896),
(95, 235), (794, 839), (905, 910), (642, 798), (514, 715), (430, 536), (312, 519), (116, 968), (149, 436),
(579, 913), (432, 945), (86, 958),
(107, 425), (64, 101), (425, 792), (159, 751), (31, 977), (457, 810), (441, 702), (30, 427), (508, 941),
(884, 985), (332, 739), (80, 258),
(72, 360), (124, 367), (30, 708), (353, 356), (10, 182), (850, 997), (236, 838), (72, 374), (610, 914),
(146, 212), (3, 209), (674, 689), (749, 960),
(126, 922), (7, 765), (300, 377), (25, 706), (619, 955), (238, 879), (145, 191), (115, 464), (352, 488),
(724, 982), (133, 264), (28, 989),
(213, 370), (343, 484), (299, 983), (303, 959), (899, 981), (566, 651), (188, 334), (82, 607), (105, 546),
(315, 594), (160, 385), (150, 919),
(128, 968), (228, 823), (323, 520), (242, 248), (188, 772), (298, 381), (429, 679), (47, 881), (135, 615),
(21, 403), (79, 719), (74, 135),
(306, 430), (426, 563), (758, 948), (145, 605), (305, 432), (363, 652), (86, 254), (455, 647), (378, 652),
(541, 971), (59, 385), (8, 418),
(427, 985), (745, 922), (328, 451), (208, 380), (300, 975), (93, 482), (189, 973), (111, 815), (114, 283),
(571, 620), (157, 704), (719, 814),
(456, 950), (189, 408), (431, 786), (178, 442), (253, 982), (348, 464), (535, 959), (145, 363), (473, 646),
(88, 652), (494, 773), (208, 301),
(1, 850), (459, 715), (473, 633), (7, 223), (117, 305), (644, 787), (308, 558), (159, 623), (434, 723),
(482, 769), (94, 694), (509, 778),
(237, 983), (556, 780), (286, 415), (22, 647), (123, 276), (684, 904), (0, 41), (262, 407), (538, 911),
(595, 727), (405, 455), (104, 764),
(258, 362), (290, 892), (688, 773), (200, 930), (87, 609), (36, 72), (268, 811), (312, 546), (121, 348),
(542, 880), (255, 912), (780, 942),
(69, 167), (424, 881), (289, 296), (137, 532), (535, 587), (215, 642), (107, 544), (420, 978), (556, 649),
(413, 759), (797, 925), (285, 807),
(299, 452), (380, 581), (141, 643), (126, 160), (123, 713), (390, 410), (479, 605), (142, 573), (306, 684),
(362, 647), (484, 760), (223, 425),
(488, 500), (513, 711), (325, 504), (667, 981), (61, 454), (146, 307), (507, 763), (53, 908), (220, 636),
(26, 363), (400, 482), (10, 909),
(539, 866), (68, 703), (83, 887), (702, 972), (759, 946), (404, 685), (6, 369), (42, 118), (3, 635),
(276, 894), (655, 716), (299, 744), (232, 922),
(144, 769), (294, 586), (107, 195), (444, 471), (338, 733), (172, 393), (338, 431), (663, 918), (445, 703),
(151, 458), (725, 955), (151, 536),
(132, 323), (213, 932), (191, 454), (357, 808), (398, 437), (503, 826), (398, 747), (225, 814), (200, 449),
(209, 962), (600, 727), (50, 927),
(34, 397), (239, 648), (86, 891), (191, 372), (58, 760), (653, 693), (177, 238), (304, 625), (88, 627),
(721, 889), (524, 769), (291, 789),
(898, 904), (361, 421), (55, 469), (647, 713), (528, 681), (664, 979), (560, 977), (752, 952), (440, 956),
(465, 594), (260, 501), (487, 721),
(220, 345), (43, 272), (44, 53), (166, 358), (3, 296), (7, 670), (65, 143), (438, 805), (227, 696),
(623, 993), (406, 571), (226, 943), (197, 464),
(347, 622), (104, 620), (87, 904), (326, 813), (330, 548), (466, 914), (261, 332), (29, 534), (45, 194),
(82, 377), (214, 890), (354, 537),
(192, 857), (206, 257), (688, 746), (308, 753), (319, 529), (393, 880), (260, 493), (352, 892), (245, 729),
(45, 313), (565, 956), (9, 74),
(471, 507), (448, 741), (48, 939), (422, 828), (471, 505), (120, 450), (83, 87), (101, 246), (783, 846),
(157, 423), (905, 941), (218, 451),
(78, 627), (437, 837), (572, 772), (849, 907), (40, 403), (184, 981), (255, 501), (131, 225), (860, 891),
(285, 956), (327, 360), (109, 445),
(570, 921), (292, 624), (554, 807), (206, 728), (303, 796), (452, 526), (473, 619), (549, 649), (267, 279),
(16, 237), (121, 629), (728, 802),
(101, 176), (424, 750), (223, 254), (291, 900), (675, 753), (6, 759), (527, 548), (438, 879), (50, 124),
(393, 660), (121, 279), (754, 994),
(367, 578), (235, 691), (720, 733), (559, 676), (226, 288), (757, 851), (245, 923), (66, 530), (314, 690),
(239, 335), (382, 643), (293, 491),
(175, 596), (140, 829), (15, 566), (335, 516), (375, 599), (25, 650), (132, 831), (405, 897), (158, 999),
(181, 522), (78, 138), (211, 783),
(800, 937), (508, 793), (583, 785), (712, 992), (218, 240), (135, 750), (239, 835), (393, 778), (361, 623),
(135, 605), (510, 644), (922, 939),
(110, 630), (26, 853), (539, 610), (367, 500), (316, 466), (12, 981), (225, 568), (166, 668), (676, 900),
(506, 828), (755, 976), (492, 559),
(321, 721), (80, 868), (140, 265), (411, 618), (195, 720), (324, 841), (298, 804), (393, 966), (60, 213),
(39, 322), (743, 765), (255, 984),
(351, 883), (451, 684), (675, 743), (231, 673), (266, 352), (166, 687), (17, 312), (364, 587), (59, 552),
(645, 749), (154, 361), (22, 502),
(62, 642), (25, 247), (12, 46), (231, 968), (334, 669), (68, 840), (63, 353), (679, 995), (139, 432),
(913, 944), (221, 459), (145, 445), (319, 366),
(181, 664), (336, 745), (765, 804), (391, 418), (9, 419), (270, 912), (544, 546), (753, 823), (703, 721),
(473, 777), (42, 578), (125, 943),
(418, 976), (175, 399), (3, 512), (141, 636), (677, 889), (526, 860), (716, 750), (81, 151), (243, 337),
(848, 860), (180, 840), (252, 968),
(22, 979), (825, 917), (172, 760), (806, 858), (574, 700), (172, 735), (80, 437), (885, 949), (106, 612),
(635, 643), (468, 727), (153, 629),
(41, 616), (258, 348), (755, 825), (385, 750), (27, 640), (910, 954), (37, 508), (91, 366), (299, 687),
(154, 469), (241, 519), (166, 364),
(753, 771), (345, 414), (276, 822), (504, 967), (15, 402), (318, 543), (295, 922), (480, 563), (35, 1000),
(544, 791), (565, 584), (268, 956),
(39, 703), (404, 466), (122, 738), (354, 413), (52, 507), (21, 279), (35, 757), (260, 697), (595, 696),
(719, 794), (296, 951), (702, 780),
(212, 780), (528, 541), (348, 395), (854, 995), (213, 256), (118, 579), (337, 824), (965, 991), (248, 946),
(600, 690), (545, 737), (697, 896),
(166, 361), (888, 938), (156, 921), (338, 866), (765, 839), (9, 598), (52, 835), (159, 578), (352, 975),
(298, 371), (300, 640), (330, 506),
(413, 801), (441, 615), (1, 174), (144, 807), (44, 582), (128, 452), (349, 951), (9, 971), (491, 738),
(931, 969), (680, 838), (678, 797),
(263, 765), (192, 626), (71, 562), (434, 958), (285, 972), (178, 850), (173, 542), (64, 674), (161, 654),
(112, 593), (516, 964), (647, 815),
(556, 619), (393, 772), (272, 445), (291, 318), (13, 438), (797, 839), (287, 735), (264, 968)],
'k': 100,
'res': 561,
'sol': [10, 15, 18, 27, 44, 45, 47, 48, 68, 74, 75, 76, 77, 84, 85, 119, 126, 130, 146, 149, 202, 206, 219, 224,
239, 248, 273, 275, 292, 293, 298, 317,
341, 343, 352, 357, 359, 365, 366, 388, 405, 412, 428, 437, 443, 452, 456, 459, 467, 471, 476, 490, 497,
502, 508, 520, 521, 522, 528, 551, 561,
569, 581, 596, 603, 636, 640, 643, 653, 667, 675, 677, 680, 713, 717, 725, 727, 741, 756, 793, 800, 805,
807, 817, 824, 828, 836, 846, 862, 864,
883, 894, 895, 896, 928, 952, 958, 962, 975, 985]
}
# gen_test()
]
def runtests(f):
ok = True
problems_count = 0
for t in tests[:]:
A = t['A']
k = t['k']
r = t['res']
s = t['sol']
print("-------------------")
if len(A) < 20:
print("len(A) =", len(A))
print("A :", A)
print("k :", k)
print("oczekiwana dlugosc przeciecia :", r)
print("przykladowe rozwiazanie :", s)
else:
print("len(A) =", len(A))
print("A : <<prefiks>>: ", A[:10], "...")
print("k :", k)
print("oczekiwana dlugosc przeciecia :", r)
print("przykladowe rozwiazanie : <<prefiks>>", s[:5], "...")
SOL = f(A.copy(), k)
if len(A) < 20:
print("uzyskane rozwiaznie :", SOL)
else:
print("uzyskane rozwiaznie : <<prefiks>>", SOL[:5], "...")
if len(SOL) != k:
print("Problem! Niezgodna dlugosc rozwiazania")
ok = False
problems_count += 1
continue
(a, b) = A[SOL[0]]
for i in range(1, k):
a = max(a, A[SOL[i]][0])
b = min(b, A[SOL[i]][1])
RES = b - a
print("uzyskana dlugosc przeciecia :", RES)
if RES != r:
print("Problem! Bledny wynik")
ok = False
problems_count += 1
continue
print("OK")
print("===============================")
if ok:
print('All OK!')
else:
print(f'PROBLEMS! There are {problems_count} of them!')
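# A minimal sketch of a candidate solver wired into the harness above. The name
# choose_k_intervals and the sort-plus-min-heap approach are assumptions added
# for illustration; the intended exam solution may differ.
import heapq

def choose_k_intervals(A, k):
    # Sort interval indices by start point and sweep, keeping the k largest
    # end points seen so far in a min-heap of (end, original_index) pairs.
    order = sorted(range(len(A)), key=lambda i: A[i][0])
    heap = []
    best_len = None
    best_indices = []
    for i in order:
        start, end = A[i]
        heapq.heappush(heap, (end, i))
        if len(heap) > k:
            heapq.heappop(heap)  # discard the interval with the smallest end
        if len(heap) == k:
            # Every interval still in the heap starts at or before `start`, so
            # heap[0][0] - start is a lower bound on their common intersection;
            # the maximum of this bound over the sweep equals the optimum.
            current = heap[0][0] - start
            if best_len is None or current > best_len:
                best_len = current
                best_indices = [idx for _, idx in heap]
    return best_indices

# Hypothetical usage with the harness defined above:
# runtests(choose_k_intervals)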
| 64.727092 | 118 | 0.418921 | 4,748 | 32,493 | 2.864785 | 0.200505 | 0.000882 | 0.000882 | 0.001764 | 0.921335 | 0.914865 | 0.910601 | 0.907808 | 0.907808 | 0.906484 | 0 | 0.552582 | 0.296187 | 32,493 | 501 | 119 | 64.856287 | 0.042197 | 0.002831 | 0 | 0.805439 | 0 | 0 | 0.016885 | 0.000957 | 0 | 0 | 0 | 0 | 0 | 1 | 0.004184 | false | 0 | 0.002092 | 0 | 0.008368 | 0.043933 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
a94563d0416b781ce9872dc95cb9c199408d04f2 | 187,293 | py | Python | fn_aws_guardduty/tests/mock_artifacts.py | nickpartner-goahead/resilient-community-apps | 097c0dbefddbd221b31149d82af9809420498134 | [
"MIT"
] | 65 | 2017-12-04T13:58:32.000Z | 2022-03-24T18:33:17.000Z | fn_aws_guardduty/tests/mock_artifacts.py | nickpartner-goahead/resilient-community-apps | 097c0dbefddbd221b31149d82af9809420498134 | [
"MIT"
] | 48 | 2018-03-02T19:17:14.000Z | 2022-03-09T22:00:38.000Z | fn_aws_guardduty/tests/mock_artifacts.py | nickpartner-goahead/resilient-community-apps | 097c0dbefddbd221b31149d82af9809420498134 | [
"MIT"
] | 95 | 2018-01-11T16:23:39.000Z | 2022-03-21T11:34:29.000Z | # -*- coding: utf-8 -*-
# (c) Copyright IBM Corp. 2010, 2020. All Rights Reserved.
# pragma pylint: disable=unused-argument, no-self-use
"""Generate Mock responses to simulate AWS IAM for Unit and function tests """
import datetime
from dateutil.tz import tzlocal
def get_mock_config():
config_data = u"""[fn_aws_guardduty]
aws_gd_access_key_id=<AWS_GUARDDUTY_ACCESS_KEY_ID>
aws_gd_secret_access_key=<AWS_GUARDDUTY_SECRET_ACCESS_KEY>
# Default or master region for the integration
aws_gd_master_region=us-west-2
# Filter by GuardDuty region names. Can be a string or regular expression.
# e.g. aws_gd_regions=^(us|eu).* to get Europe and US regions.
aws_gd_regions=.*
# Interval to refresh regions information (in minutes).
aws_gd_regions_interval=60
# Interval to poll Guardduty (in minutes).
aws_gd_polling_interval=0
# Optional - severity threshold (int) to use in criterion to filter findings
# results. (default 7).
# Severity ranges: 7.0 - 8.9 -> High, 4.0 - 6.9 -> Medium, 1.0 - 3.9 -> Low
aws_gd_severity_threshold = 7
# Optional - Lookback interval in minutes to check if findings updated
# since last run. Used in criteria for filtering findings retrieval (default 60).
aws_gd_lookback_interval=60
# Optional settings for access to GuardDuty via a proxy.
#http_proxy=http://proxy:80
#https_proxy=http://proxy:80
"""
return config_data
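# A small illustrative sketch (an assumption, not part of the original test module)
# showing one way the mock app.config text above could be turned into an options
# dict for a test, e.g. with the standard-library configparser. The helper name
# parse_mock_config is hypothetical.
import configparser

def parse_mock_config(config_text):
    # Parse the INI-style text and return the [fn_aws_guardduty] section
    # as a plain dict of string values.
    parser = configparser.ConfigParser()
    parser.read_string(config_text)
    return dict(parser["fn_aws_guardduty"])

# Hypothetical usage:
# opts = parse_mock_config(get_mock_config())
# assert opts["aws_gd_master_region"] == "us-west-2"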
# Mocked AWS GuardDuty (and EC2 describe_regions) raw responses for standalone tests.
def get_cli_raw_responses(op):
response = {
"describe_regions": (
{'Regions': [{'Endpoint': 'ec2.eu-north-1.amazonaws.com', 'RegionName': 'eu-north-1',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.ap-south-1.amazonaws.com', 'RegionName': 'ap-south-1',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.eu-west-3.amazonaws.com', 'RegionName': 'eu-west-3',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.eu-west-2.amazonaws.com', 'RegionName': 'eu-west-2',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.eu-west-1.amazonaws.com', 'RegionName': 'eu-west-1',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.ap-northeast-2.amazonaws.com', 'RegionName': 'ap-northeast-2',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.ap-northeast-1.amazonaws.com', 'RegionName': 'ap-northeast-1',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.sa-east-1.amazonaws.com', 'RegionName': 'sa-east-1',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.ca-central-1.amazonaws.com', 'RegionName': 'ca-central-1',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.ap-southeast-1.amazonaws.com', 'RegionName': 'ap-southeast-1',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.ap-southeast-2.amazonaws.com', 'RegionName': 'ap-southeast-2',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.eu-central-1.amazonaws.com', 'RegionName': 'eu-central-1',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.us-east-1.amazonaws.com', 'RegionName': 'us-east-1',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.us-east-2.amazonaws.com', 'RegionName': 'us-east-2',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.us-west-1.amazonaws.com', 'RegionName': 'us-west-1',
'OptInStatus': 'opt-in-not-required'},
{'Endpoint': 'ec2.us-west-2.amazonaws.com', 'RegionName': 'us-west-2',
'OptInStatus': 'opt-in-not-required'}],
'ResponseMetadata': {'RequestId': '3f790a11-b512-4ce5-88b5-3990ffb17ddd', 'HTTPStatusCode': 200,
'HTTPHeaders': {'x-amzn-requestid': '3f790a11-b512-4ce5-88b5-3990ffb17ddd',
'cache-control': 'no-cache, no-store',
'strict-transport-security': 'max-age=31536000; includeSubDomains',
'content-type': 'text/xml;charset=UTF-8', 'content-length': '3655',
'vary': 'accept-encoding', 'date': 'Tue, 05 Jan 2021 12:23:16 GMT',
'server': 'AmazonEC2'}, 'RetryAttempts': 0}}
),
"list_detectors":(
{'ResponseMetadata': {'RequestId': '80ff249e-b880-420e-a6b9-0dea2621538c', 'HTTPStatusCode': 200,
'HTTPHeaders': {'date': 'Tue, 05 Jan 2021 12:38:42 GMT',
'content-type': 'application/json', 'content-length': '52',
'connection': 'keep-alive',
'x-amzn-requestid': '80ff249e-b880-420e-a6b9-0dea2621538c',
'x-amz-apigw-id': 'YrOY1ELbvHcFkgw=',
'x-amzn-trace-id': 'Root=1-5ff45dd2-121d57220a7661b070273cf3;Sampled=0'},
'RetryAttempts': 0}, 'DetectorIds': ['f2baedb0ac74f8f42fc929e15f56da6a']}
),
"list_findings": (
{'ResponseMetadata': {'RequestId': '5ac15376-acfc-4cfc-92df-99a800abcef4', 'HTTPStatusCode': 200,
'HTTPHeaders': {'date': 'Tue, 05 Jan 2021 12:39:39 GMT',
'content-type': 'application/json', 'content-length': '1827',
'connection': 'keep-alive',
'x-amzn-requestid': '5ac15376-acfc-4cfc-92df-99a800abcef4',
'x-amz-apigw-id': 'YrOhyEPQPHcFXXA=',
'x-amzn-trace-id': 'Root=1-5ff45e0b-4fe3b87818161b715dcf6423;Sampled=0'},
'RetryAttempts': 0},
'FindingIds': ['60baffd3f9042e38640f2300d5c5a631', 'b8baffd3f90472ebf26adce5cea33685',
'18baffd3f9039ba840cdf3ad226e36f7', '46baffd3f9047da341ecae8fe62222f1',
'58baffd3f903af9ee55660d3641e94d0', '86baffd3f9043322d9b01e914bf3ed06',
'ecbaffd3f90450f165ef8c16dc6761fa', '32baffd3f90387cc1097fdf9a63469d7',
'36baffd3f903e6ef24171ab379d6f2e5', '3ebaffd3f903e855c376579fadac3ab5',
'44baee3076b87eb8dc8f9cf466ad7be6', '84baffd3f903f16f015f27f8c5538971',
'aebaffd3f90321416f8dc65e7a33432d', 'd6baffd3f9030f1ee7e94aaf9b24dd0f',
'f0baffd3f90304127334a900af6b2ec0', 'f2baffd3f903a8b8f3b790876c898dd7',
'1cbaffd3f903727580993b3463f58d54', '26baffd3f9028d65d75617c853bbbde3',
'4cbaffd3f9032a2afb8f978410910037', '52baffd3f902c25f8bfe50fd3c24e65e',
'9cbaffd3f902cb9f1c0d72b159da54a9', 'a0baffd3f90359ef7dabf6e909ac4de5',
'b0baffd3f9035d4b2bf7f2f0ef9640f8', 'b4baffd3f903725f70a2ca050d4d069e',
'bebaffd3f903497caebd68453183fe6d', 'ccbaffd3f90334ead7c11d7c272e2f62',
'eabaffd3f90321c72661e4a6fd2c3a5f', '26baffd3f90271f68a72fd9b4dd70781',
'46baffd3f902c428bc63ecfa58b8fcca', '96baffd3f9029e01c233b528fa69beaf',
'aabaffd3f902de9df72d792845053730', '5cbaffd3f902047f93b465380729a395',
'c6baffd3f9023c12725d124dccc067f2', 'f8baffd3f9020ae11f6865ecd70e0f3b',
'4ebaffd3f901acd98e67760e8f59e276', '7abaffd3f901b30ed065c0e1b3ecedf6',
'c2baffd3f901f11dd8497af9e33514da', '6abaffd3f900b4dc806783697914ec74',
'c6baffd3f9015cef6d96bf4180d4d628', 'f0baffd3f9016aa774371b376f5c9453',
'f4baffd3f900aac88ce9932ee235c8e5', 'febaffd3f9017e10f9bfefb50f1a5567',
'0abaffd3f9001c147f1d263373f0befd', '14baffd3f900d3d455e790bb312fd897',
'58baffd3f90035ff28700f6ed4bd897f', 'b8baffd3f900972453b724a4147cba6f',
'e4baffd3f900ab39aace4e4e33237282', 'ecbaffd3f90098b4f377f05bf55aa3d1',
'10baffd3f9001e735506ba027e50f6a2', '3ebaffd3f90067948f21e00a686a41c2'],
'NextToken': '1606403892611-3ebaffd3f90067948f21e00a686a41c2'}
),
"get_findings": (
{'ResponseMetadata': {'RequestId': 'bad216aa-23fc-41d5-9c3e-bf0aa6188aa9', 'HTTPStatusCode': 200,
'HTTPHeaders': {'date': 'Tue, 05 Jan 2021 12:32:38 GMT',
'content-type': 'application/json', 'content-length': '4443',
'connection': 'keep-alive',
'x-amzn-requestid': 'bad216aa-23fc-41d5-9c3e-bf0aa6188aa9',
'x-amz-apigw-id': 'YrNf_FTZvHcFbDw=',
'x-amzn-trace-id': 'Root=1-5ff45c66-7ce841230ba9c7ca49bc09dd;Sampled=0'},
'RetryAttempts': 0}, 'Findings': [{'AccountId': '834299573936',
'Arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
'CreatedAt': '2020-11-25T13:46:37.960Z',
'Description': 'An API was used to access a bucket from an IP address on a custom threat list.',
'Id': '60baffd3f9042e38640f2300d5c5a631',
'Partition': 'aws', 'Region': 'us-west-2',
'Resource': {'AccessKeyDetails': {
'AccessKeyId': 'GeneratedFindingAccessKeyId',
'PrincipalId': 'GeneratedFindingPrincipalId',
'UserName': 'GeneratedFindingUserName',
'UserType': 'IAMUser'}, 'S3BucketDetails': [
{'Arn': 'arn:aws:s3:::bucketName',
'Name': 'bucketName', 'Type': 'Destination',
'CreatedAt': datetime.datetime(2017, 12, 18,
15, 58, 11,
551000,
tzinfo=tzlocal()),
'Owner': {'Id': 'CanonicalId of Owner'},
'Tags': [{'Key': 'foo', 'Value': 'bar'}],
'DefaultServerSideEncryption': {
'EncryptionType': 'SSEAlgorithm',
'KmsMasterKeyArn': 'arn:aws:kms:region:123456789012:key/key-id'},
'PublicAccess': {'PermissionConfiguration': {
'BucketLevelPermissions': {
'AccessControlList': {
'AllowsPublicReadAccess': False,
'AllowsPublicWriteAccess': False},
'BucketPolicy': {
'AllowsPublicReadAccess': False,
'AllowsPublicWriteAccess': False},
'BlockPublicAccess': {
'IgnorePublicAcls': False,
'RestrictPublicBuckets': False,
'BlockPublicAcls': False,
'BlockPublicPolicy': False}},
'AccountLevelPermissions': {
'BlockPublicAccess': {
'IgnorePublicAcls': False,
'RestrictPublicBuckets': False,
'BlockPublicAcls': False,
'BlockPublicPolicy': False}}},
'EffectivePermission': 'NOT_PUBLIC'}}],
'InstanceDetails': {
'AvailabilityZone': 'GeneratedFindingInstaceAvailabilityZone',
'IamInstanceProfile': {
'Arn': 'arn:aws:iam::834299573936:example/instance/profile',
'Id': 'GeneratedFindingInstanceProfileId'},
'ImageDescription': 'GeneratedFindingInstaceImageDescription',
'ImageId': 'ami-99999999',
'InstanceId': 'i-99999999',
'InstanceState': 'running',
'InstanceType': 'm3.xlarge',
'OutpostArn': 'arn:aws:outposts:us-west-2:123456789000:outpost/op-0fbc006e9abbc73c3',
'LaunchTime': '2016-08-02T02:05:06Z',
'NetworkInterfaces': [
{'Ipv6Addresses': [],
'NetworkInterfaceId': 'eni-bfcffe88',
'PrivateDnsName': 'GeneratedFindingPrivateDnsName',
'PrivateIpAddress': '10.0.0.1',
'PrivateIpAddresses': [{
'PrivateDnsName': 'GeneratedFindingPrivateName',
'PrivateIpAddress': '10.0.0.1'}],
'PublicDnsName': 'GeneratedFindingPublicDNSName',
'PublicIp': '198.51.100.0',
'SecurityGroups': [{
'GroupId': 'GeneratedFindingSecurityId',
'GroupName': 'GeneratedFindingSecurityGroupName'}],
'SubnetId': 'GeneratedFindingSubnetId',
'VpcId': 'GeneratedFindingVPCId'}],
'ProductCodes': [{}], 'Tags': [{
'Key': 'GeneratedFindingInstaceTag1',
'Value': 'GeneratedFindingInstaceValue1'},
{
'Key': 'GeneratedFindingInstaceTag2',
'Value': 'GeneratedFindingInstaceTagValue2'},
{
'Key': 'GeneratedFindingInstaceTag3',
'Value': 'GeneratedFindingInstaceTagValue3'},
{
'Key': 'GeneratedFindingInstaceTag4',
'Value': 'GeneratedFindingInstaceTagValue4'},
{
'Key': 'GeneratedFindingInstaceTag5',
'Value': 'GeneratedFindingInstaceTagValue5'},
{
'Key': 'GeneratedFindingInstaceTag6',
'Value': 'GeneratedFindingInstaceTagValue6'},
{
'Key': 'GeneratedFindingInstaceTag7',
'Value': 'GeneratedFindingInstaceTagValue7'},
{
'Key': 'GeneratedFindingInstaceTag8',
'Value': 'GeneratedFindingInstaceTagValue8'},
{
'Key': 'GeneratedFindingInstaceTag9',
'Value': 'GeneratedFindingInstaceTagValue9'}]},
'ResourceType': 'S3Bucket'},
'SchemaVersion': '2.0', 'Service': {
'Action': {'ActionType': 'AWS_API_CALL',
'AwsApiCallAction': {'Api': 'GeneratedFindingAPIName', 'CallerType': 'Remote IP',
'RemoteIpDetails': {
'City': {'CityName': 'GeneratedFindingCityName'},
'Country': {'CountryName': 'GeneratedFindingCountryName'},
'GeoLocation': {'Lat': 0, 'Lon': 0},
'IpAddressV4': '198.51.100.0', 'Organization': {'Asn': '-1',
'AsnOrg': 'GeneratedFindingASNOrg',
'Isp': 'GeneratedFindingISP',
'Org': 'GeneratedFindingORG'}},
'ServiceName': 'GeneratedFindingAPIServiceName'}},
'Archived': False, 'Count': 4, 'DetectorId': 'f2baedb0ac74f8f42fc929e15f56da6a',
'EventFirstSeen': '2020-11-25T13:46:37.960Z', 'EventLastSeen': '2020-11-26T15:18:12.620Z',
'ResourceRole': 'TARGET', 'ServiceName': 'guardduty'}, 'Severity': 2,
'Title': 'API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'Type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
'UpdatedAt': '2020-11-26T15:18:12.620Z'}]}
),
"archive_findings": (
{'ResponseMetadata': {'RetryAttempts': 0, 'HTTPStatusCode': 200,
'RequestId': 'af84ee9f-7267-478e-ade9-4ec1ce50b66c',
'HTTPHeaders': {'x-amzn-requestid': 'af84ee9f-7267-478e-ade9-4ec1ce50b66c',
'content-length': '0', 'x-amz-apigw-id': 'Z6P_6ET5iYcFlXg=',
'x-amzn-trace-id': 'Root=1-6013f9ff-7991c29e4458ff242c99d482;Sampled=0',
'connection': 'keep-alive', 'date': 'Fri, 29 Jan 2021 12:05:19 GMT',
'content-type': 'application/json'}}}
)
}
return response[op]
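# Illustrative only: a minimal sketch of how the canned GuardDuty responses above
# could back a fake boto3 client in a test. The names `make_fake_guardduty_client`
# and `lookup` are hypothetical placeholders, not the integration's actual test harness.
try:
    from unittest.mock import MagicMock  # Python 3
except ImportError:
    from mock import MagicMock  # Python 2 fallback (assumes the `mock` package is installed)

def make_fake_guardduty_client(lookup):
    """Return a MagicMock whose GuardDuty methods replay canned responses.

    ``lookup`` is any callable mapping an operation name (for example
    "list_detectors") to the corresponding mocked response dictionary above.
    """
    client = MagicMock()
    client.list_detectors.return_value = lookup("list_detectors")
    client.list_findings.return_value = lookup("list_findings")
    client.get_findings.return_value = lookup("get_findings")
    client.archive_findings.return_value = lookup("archive_findings")
    return client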
## Mock results - expected payloads, artifacts and data tables built from the mocked GuardDuty findings.
def get_mocked_results(type):
response = {
"finding_payload_with_artifacts": (
{
'name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'description': {'format': 'text',
'content': 'An API was used to access a bucket from an IP address on a custom threat list.'},
'discovered_date': '2020-11-25T13:46:37.960Z', 'severity_code': 'Low',
'properties': {'aws_guardduty_finding_id': '60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
'aws_guardduty_finding_updated_at': '2020-11-26T15:18:12.620Z',
'aws_guardduty_region': 'us-west-2', 'aws_guardduty_resource_type': 'S3Bucket',
'aws_guardduty_count': '4',
'aws_guardduty_detector_id': 'f2baedb0ac74f8f42fc929e15f56da6a'},
'artifacts': [],
'comments': [{'text': {'format': 'text',
'content': "AWS GuardDuty finding Payload:\n{ 'AccountId': '834299573936',\n 'Arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',\n 'CreatedAt': '2020-11-25T13:46:37.960Z',\n 'Description': 'An API was used to access a bucket from an IP address on a '\n 'custom threat list.',\n 'Id': '60baffd3f9042e38640f2300d5c5a631',\n 'Partition': 'aws',\n 'Region': 'us-west-2',\n 'Resource': { 'AccessKeyDetails': { 'AccessKeyId': 'GeneratedFindingAccessKeyId',\n 'PrincipalId': 'GeneratedFindingPrincipalId',\n 'UserName': 'GeneratedFindingUserName',\n 'UserType': 'IAMUser'},\n 'InstanceDetails': { 'AvailabilityZone': 'GeneratedFindingInstaceAvailabilityZone',\n 'IamInstanceProfile': { 'Arn': 'arn:aws:iam::834299573936:example/instance/profile',\n 'Id': 'GeneratedFindingInstanceProfileId'},\n 'ImageDescription': 'GeneratedFindingInstaceImageDescription',\n 'ImageId': 'ami-99999999',\n 'InstanceId': 'i-99999999',\n 'InstanceState': 'running',\n 'InstanceType': 'm3.xlarge',\n 'LaunchTime': '2016-08-02T02:05:06Z',\n 'NetworkInterfaces': [ { 'Ipv6Addresses': [ ],\n 'NetworkInterfaceId': 'eni-bfcffe88',\n 'PrivateDnsName': 'GeneratedFindingPrivateDnsName',\n 'PrivateIpAddress': '10.0.0.1',\n 'PrivateIpAddresses': [ { 'PrivateDnsName': 'GeneratedFindingPrivateName',\n 'PrivateIpAddress': '10.0.0.1'}],\n 'PublicDnsName': 'GeneratedFindingPublicDNSName',\n 'PublicIp': '198.51.100.0',\n 'SecurityGroups': [ { 'GroupId': 'GeneratedFindingSecurityId',\n 'GroupName': 'GeneratedFindingSecurityGroupName'}],\n 'SubnetId': 'GeneratedFindingSubnetId',\n 'VpcId': 'GeneratedFindingVPCId'}],\n 'OutpostArn': 'arn:aws:outposts:us-west-2:123456789000:outpost/op-0fbc006e9abbc73c3',\n 'ProductCodes': [{}],\n 'Tags': [ { 'Key': 'GeneratedFindingInstaceTag1',\n 'Value': 'GeneratedFindingInstaceValue1'},\n { 'Key': 'GeneratedFindingInstaceTag2',\n 'Value': 'GeneratedFindingInstaceTagValue2'},\n { 'Key': 'GeneratedFindingInstaceTag3',\n 'Value': 'GeneratedFindingInstaceTagValue3'},\n { 'Key': 'GeneratedFindingInstaceTag4',\n 'Value': 'GeneratedFindingInstaceTagValue4'},\n { 'Key': 'GeneratedFindingInstaceTag5',\n 'Value': 'GeneratedFindingInstaceTagValue5'},\n { 'Key': 'GeneratedFindingInstaceTag6',\n 'Value': 'GeneratedFindingInstaceTagValue6'},\n { 'Key': 'GeneratedFindingInstaceTag7',\n 'Value': 'GeneratedFindingInstaceTagValue7'},\n { 'Key': 'GeneratedFindingInstaceTag8',\n 'Value': 'GeneratedFindingInstaceTagValue8'},\n { 'Key': 'GeneratedFindingInstaceTag9',\n 'Value': 'GeneratedFindingInstaceTagValue9'}]},\n 'ResourceType': 'S3Bucket',\n 'S3BucketDetails': [ { 'Arn': 'arn:aws:s3:::bucketName',\n 'CreatedAt': '2017-12-18 '\n '15:58:11',\n 'DefaultServerSideEncryption': { 'EncryptionType': 'SSEAlgorithm',\n 'KmsMasterKeyArn': 'arn:aws:kms:region:123456789012:key/key-id'},\n 'Name': 'bucketName',\n 'Owner': { 'Id': 'CanonicalId '\n 'of Owner'},\n 'PublicAccess': { 'EffectivePermission': 'NOT_PUBLIC',\n 'PermissionConfiguration': { 'AccountLevelPermissions': { 'BlockPublicAccess': { 'BlockPublicAcls': False,\n 'BlockPublicPolicy': False,\n 'IgnorePublicAcls': False,\n 'RestrictPublicBuckets': False}},\n 'BucketLevelPermissions': { 'AccessControlList': { 'AllowsPublicReadAccess': False,\n 'AllowsPublicWriteAccess': False},\n 'BlockPublicAccess': { 'BlockPublicAcls': False,\n 'BlockPublicPolicy': False,\n 'IgnorePublicAcls': False,\n 'RestrictPublicBuckets': False},\n 'BucketPolicy': { 'AllowsPublicReadAccess': False,\n 'AllowsPublicWriteAccess': 
False}}}},\n 'Tags': [ { 'Key': 'foo',\n 'Value': 'bar'}],\n 'Type': 'Destination'}]},\n 'SchemaVersion': '2.0',\n 'Service': { 'Action': { 'ActionType': 'AWS_API_CALL',\n 'AwsApiCallAction': { 'Api': 'GeneratedFindingAPIName',\n 'CallerType': 'Remote '\n 'IP',\n 'RemoteIpDetails': { 'City': { 'CityName': 'GeneratedFindingCityName'},\n 'Country': { 'CountryName': 'GeneratedFindingCountryName'},\n 'GeoLocation': { 'Lat': 0,\n 'Lon': 0},\n 'IpAddressV4': '198.51.100.0',\n 'Organization': { 'Asn': '-1',\n 'AsnOrg': 'GeneratedFindingASNOrg',\n 'Isp': 'GeneratedFindingISP',\n 'Org': 'GeneratedFindingORG'}},\n 'ServiceName': 'GeneratedFindingAPIServiceName'}},\n 'Archived': False,\n 'Count': 4,\n 'DetectorId': 'f2baedb0ac74f8f42fc929e15f56da6a',\n 'EventFirstSeen': '2020-11-25T13:46:37.960Z',\n 'EventLastSeen': '2020-11-26T15:18:12.620Z',\n 'ResourceRole': 'TARGET',\n 'ServiceName': 'guardduty'},\n 'Severity': 2,\n 'Title': 'API GeneratedFindingAPIName was invoked from an IP address on a '\n 'custom threat list.',\n 'Type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',\n 'UpdatedAt': '2020-11-26T15:18:12.620Z'}"}}]} ),
"finding_payload_no_artifacts":(
{
'name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'description': {'format': 'text',
'content': 'An API was used to access a bucket from an IP address on a custom threat list.'},
'discovered_date': '2020-11-25T13:46:37.960Z', 'severity_code': 'Low',
'properties': {'aws_guardduty_finding_id': '60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
'aws_guardduty_finding_updated_at': '2020-11-26T15:18:12.620Z',
'aws_guardduty_region': 'us-west-2', 'aws_guardduty_resource_type': 'S3Bucket',
'aws_guardduty_count': '4',
'aws_guardduty_detector_id': 'f2baedb0ac74f8f42fc929e15f56da6a'}, 'artifacts': [
{'type': {'name': 'aws_iam_access_key_id'}, 'description': {'format': 'text',
'content': "'AWS IAM Access Key ID' extracted from GuardDuty from finding property 'AccessKeyId' at path '['Resource', 'AccessKeyDetails']'."},
'value': 'GeneratedFindingAccessKeyId'}, {'type': {'name': 'aws_iam_user_name'},
'description': {'format': 'text',
'content': "'AWS IAM User Name' extracted from GuardDuty from finding property 'UserName' at path '['Resource', 'AccessKeyDetails']'."},
'value': 'GeneratedFindingUserName'},
{'type': {'name': 'IP Address'}, 'description': {'format': 'text',
'content': "'IP Address' extracted from GuardDuty from finding property 'IpAddressV4' at path '['Service', 'Action', 'AwsApiCallAction', 'RemoteIpDetails']'."},
'value': '198.51.100.0'}, {'type': {'name': 'IP Address'}, 'description': {'format': 'text',
'content': "'IP Address' extracted from GuardDuty from finding property 'PrivateIpAddress' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': '10.0.0.1'}, {'type': {'name': 'DNS Name'},
'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PublicDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': 'GeneratedFindingPublicDNSName'},
{'type': {'name': 'DNS Name'}, 'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': 'GeneratedFindingPrivateDnsName'}, {'type': {'name': 'DNS Name'},
'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0, 'PrivateIpAddresses', 0]'."},
'value': 'GeneratedFindingPrivateName'}], 'comments': [{
'text': {
'format': 'text',
'content': "AWS GuardDuty finding Payload:\n{ 'AccountId': '834299573936',\n 'Arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',\n 'CreatedAt': '2020-11-25T13:46:37.960Z',\n 'Description': 'An API was used to access a bucket from an IP address on a '\n 'custom threat list.',\n 'Id': '60baffd3f9042e38640f2300d5c5a631',\n 'Partition': 'aws',\n 'Region': 'us-west-2',\n 'Resource': { 'AccessKeyDetails': { 'AccessKeyId': 'GeneratedFindingAccessKeyId',\n 'PrincipalId': 'GeneratedFindingPrincipalId',\n 'UserName': 'GeneratedFindingUserName',\n 'UserType': 'IAMUser'},\n 'InstanceDetails': { 'AvailabilityZone': 'GeneratedFindingInstaceAvailabilityZone',\n 'IamInstanceProfile': { 'Arn': 'arn:aws:iam::834299573936:example/instance/profile',\n 'Id': 'GeneratedFindingInstanceProfileId'},\n 'ImageDescription': 'GeneratedFindingInstaceImageDescription',\n 'ImageId': 'ami-99999999',\n 'InstanceId': 'i-99999999',\n 'InstanceState': 'running',\n 'InstanceType': 'm3.xlarge',\n 'LaunchTime': '2016-08-02T02:05:06Z',\n 'NetworkInterfaces': [ { 'Ipv6Addresses': [ ],\n 'NetworkInterfaceId': 'eni-bfcffe88',\n 'PrivateDnsName': 'GeneratedFindingPrivateDnsName',\n 'PrivateIpAddress': '10.0.0.1',\n 'PrivateIpAddresses': [ { 'PrivateDnsName': 'GeneratedFindingPrivateName',\n 'PrivateIpAddress': '10.0.0.1'}],\n 'PublicDnsName': 'GeneratedFindingPublicDNSName',\n 'PublicIp': '198.51.100.0',\n 'SecurityGroups': [ { 'GroupId': 'GeneratedFindingSecurityId',\n 'GroupName': 'GeneratedFindingSecurityGroupName'}],\n 'SubnetId': 'GeneratedFindingSubnetId',\n 'VpcId': 'GeneratedFindingVPCId'}],\n 'OutpostArn': 'arn:aws:outposts:us-west-2:123456789000:outpost/op-0fbc006e9abbc73c3',\n 'ProductCodes': [{}],\n 'Tags': [ { 'Key': 'GeneratedFindingInstaceTag1',\n 'Value': 'GeneratedFindingInstaceValue1'},\n { 'Key': 'GeneratedFindingInstaceTag2',\n 'Value': 'GeneratedFindingInstaceTagValue2'},\n { 'Key': 'GeneratedFindingInstaceTag3',\n 'Value': 'GeneratedFindingInstaceTagValue3'},\n { 'Key': 'GeneratedFindingInstaceTag4',\n 'Value': 'GeneratedFindingInstaceTagValue4'},\n { 'Key': 'GeneratedFindingInstaceTag5',\n 'Value': 'GeneratedFindingInstaceTagValue5'},\n { 'Key': 'GeneratedFindingInstaceTag6',\n 'Value': 'GeneratedFindingInstaceTagValue6'},\n { 'Key': 'GeneratedFindingInstaceTag7',\n 'Value': 'GeneratedFindingInstaceTagValue7'},\n { 'Key': 'GeneratedFindingInstaceTag8',\n 'Value': 'GeneratedFindingInstaceTagValue8'},\n { 'Key': 'GeneratedFindingInstaceTag9',\n 'Value': 'GeneratedFindingInstaceTagValue9'}]},\n 'ResourceType': 'S3Bucket',\n 'S3BucketDetails': [ { 'Arn': 'arn:aws:s3:::bucketName',\n 'CreatedAt': '2017-12-18 '\n '15:58:11',\n 'DefaultServerSideEncryption': { 'EncryptionType': 'SSEAlgorithm',\n 'KmsMasterKeyArn': 'arn:aws:kms:region:123456789012:key/key-id'},\n 'Name': 'bucketName',\n 'Owner': { 'Id': 'CanonicalId '\n 'of Owner'},\n 'PublicAccess': { 'EffectivePermission': 'NOT_PUBLIC',\n 'PermissionConfiguration': { 'AccountLevelPermissions': { 'BlockPublicAccess': { 'BlockPublicAcls': False,\n 'BlockPublicPolicy': False,\n 'IgnorePublicAcls': False,\n 'RestrictPublicBuckets': False}},\n 'BucketLevelPermissions': { 'AccessControlList': { 'AllowsPublicReadAccess': False,\n 'AllowsPublicWriteAccess': False},\n 'BlockPublicAccess': { 'BlockPublicAcls': False,\n 'BlockPublicPolicy': False,\n 'IgnorePublicAcls': False,\n 'RestrictPublicBuckets': False},\n 'BucketPolicy': { 'AllowsPublicReadAccess': False,\n 'AllowsPublicWriteAccess': 
False}}}},\n 'Tags': [ { 'Key': 'foo',\n 'Value': 'bar'}],\n 'Type': 'Destination'}]},\n 'SchemaVersion': '2.0',\n 'Service': { 'Action': { 'ActionType': 'AWS_API_CALL',\n 'AwsApiCallAction': { 'Api': 'GeneratedFindingAPIName',\n 'CallerType': 'Remote '\n 'IP',\n 'RemoteIpDetails': { 'City': { 'CityName': 'GeneratedFindingCityName'},\n 'Country': { 'CountryName': 'GeneratedFindingCountryName'},\n 'GeoLocation': { 'Lat': 0,\n 'Lon': 0},\n 'IpAddressV4': '198.51.100.0',\n 'Organization': { 'Asn': '-1',\n 'AsnOrg': 'GeneratedFindingASNOrg',\n 'Isp': 'GeneratedFindingISP',\n 'Org': 'GeneratedFindingORG'}},\n 'ServiceName': 'GeneratedFindingAPIServiceName'}},\n 'Archived': False,\n 'Count': 4,\n 'DetectorId': 'f2baedb0ac74f8f42fc929e15f56da6a',\n 'EventFirstSeen': '2020-11-25T13:46:37.960Z',\n 'EventLastSeen': '2020-11-26T15:18:12.620Z',\n 'ResourceRole': 'TARGET',\n 'ServiceName': 'guardduty'},\n 'Severity': 2,\n 'Title': 'API GeneratedFindingAPIName was invoked from an IP address on a '\n 'custom threat list.',\n 'Type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',\n 'UpdatedAt': '2020-11-26T15:18:12.620Z'}"}}]}
),
"finding_payload_with_artifacts_with_refresh": (
{
'name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'description': {'format': 'text',
'content': 'An API was used to access a bucket from an IP address on a custom threat list.'},
'discovered_date': '2020-11-25T13:46:37.960Z', 'severity_code': 'Low',
'properties': {'aws_guardduty_finding_id': '60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
'aws_guardduty_finding_updated_at': '2020-11-26T15:18:12.620Z',
'aws_guardduty_region': 'us-west-2', 'aws_guardduty_resource_type': 'S3Bucket',
'aws_guardduty_count': '4',
'aws_guardduty_detector_id': 'f2baedb0ac74f8f42fc929e15f56da6a'}, 'artifacts': [
{'type': {'name': 'aws_iam_access_key_id'}, 'description': {'format': 'text',
'content': "'AWS IAM Access Key ID' extracted from GuardDuty from finding property 'AccessKeyId' at path '['Resource', 'AccessKeyDetails']'."},
'value': 'GeneratedFindingAccessKeyId'}, {'type': {'name': 'aws_iam_user_name'},
'description': {'format': 'text',
'content': "'AWS IAM User Name' extracted from GuardDuty from finding property 'UserName' at path '['Resource', 'AccessKeyDetails']'."},
'value': 'GeneratedFindingUserName'},
{'type': {'name': 'IP Address'}, 'description': {'format': 'text',
'content': "'IP Address' extracted from GuardDuty from finding property 'IpAddressV4' at path '['Service', 'Action', 'AwsApiCallAction', 'RemoteIpDetails']'."},
'value': '198.51.100.0'}, {'type': {'name': 'IP Address'}, 'description': {'format': 'text',
'content': "'IP Address' extracted from GuardDuty from finding property 'PrivateIpAddress' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': '10.0.0.1'}, {'type': {'name': 'DNS Name'},
'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PublicDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': 'GeneratedFindingPublicDNSName'},
{'type': {'name': 'DNS Name'}, 'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': 'GeneratedFindingPrivateDnsName'}, {'type': {'name': 'DNS Name'},
'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0, 'PrivateIpAddresses', 0]'."},
'value': 'GeneratedFindingPrivateName'}], 'comments': [{
'text': {
'format': 'text',
'content': "AWS GuardDuty finding Payload for refresh:\n{ 'AccountId': '834299573936',\n 'Arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',\n 'CreatedAt': '2020-11-25T13:46:37.960Z',\n 'Description': 'An API was used to access a bucket from an IP address on a '\n 'custom threat list.',\n 'Id': '60baffd3f9042e38640f2300d5c5a631',\n 'Partition': 'aws',\n 'Region': 'us-west-2',\n 'Resource': { 'AccessKeyDetails': { 'AccessKeyId': 'GeneratedFindingAccessKeyId',\n 'PrincipalId': 'GeneratedFindingPrincipalId',\n 'UserName': 'GeneratedFindingUserName',\n 'UserType': 'IAMUser'},\n 'InstanceDetails': { 'AvailabilityZone': 'GeneratedFindingInstaceAvailabilityZone',\n 'IamInstanceProfile': { 'Arn': 'arn:aws:iam::834299573936:example/instance/profile',\n 'Id': 'GeneratedFindingInstanceProfileId'},\n 'ImageDescription': 'GeneratedFindingInstaceImageDescription',\n 'ImageId': 'ami-99999999',\n 'InstanceId': 'i-99999999',\n 'InstanceState': 'running',\n 'InstanceType': 'm3.xlarge',\n 'LaunchTime': '2016-08-02T02:05:06Z',\n 'NetworkInterfaces': [ { 'Ipv6Addresses': [ ],\n 'NetworkInterfaceId': 'eni-bfcffe88',\n 'PrivateDnsName': 'GeneratedFindingPrivateDnsName',\n 'PrivateIpAddress': '10.0.0.1',\n 'PrivateIpAddresses': [ { 'PrivateDnsName': 'GeneratedFindingPrivateName',\n 'PrivateIpAddress': '10.0.0.1'}],\n 'PublicDnsName': 'GeneratedFindingPublicDNSName',\n 'PublicIp': '198.51.100.0',\n 'SecurityGroups': [ { 'GroupId': 'GeneratedFindingSecurityId',\n 'GroupName': 'GeneratedFindingSecurityGroupName'}],\n 'SubnetId': 'GeneratedFindingSubnetId',\n 'VpcId': 'GeneratedFindingVPCId'}],\n 'OutpostArn': 'arn:aws:outposts:us-west-2:123456789000:outpost/op-0fbc006e9abbc73c3',\n 'ProductCodes': [{}],\n 'Tags': [ { 'Key': 'GeneratedFindingInstaceTag1',\n 'Value': 'GeneratedFindingInstaceValue1'},\n { 'Key': 'GeneratedFindingInstaceTag2',\n 'Value': 'GeneratedFindingInstaceTagValue2'},\n { 'Key': 'GeneratedFindingInstaceTag3',\n 'Value': 'GeneratedFindingInstaceTagValue3'},\n { 'Key': 'GeneratedFindingInstaceTag4',\n 'Value': 'GeneratedFindingInstaceTagValue4'},\n { 'Key': 'GeneratedFindingInstaceTag5',\n 'Value': 'GeneratedFindingInstaceTagValue5'},\n { 'Key': 'GeneratedFindingInstaceTag6',\n 'Value': 'GeneratedFindingInstaceTagValue6'},\n { 'Key': 'GeneratedFindingInstaceTag7',\n 'Value': 'GeneratedFindingInstaceTagValue7'},\n { 'Key': 'GeneratedFindingInstaceTag8',\n 'Value': 'GeneratedFindingInstaceTagValue8'},\n { 'Key': 'GeneratedFindingInstaceTag9',\n 'Value': 'GeneratedFindingInstaceTagValue9'}]},\n 'ResourceType': 'S3Bucket',\n 'S3BucketDetails': [ { 'Arn': 'arn:aws:s3:::bucketName',\n 'CreatedAt': '2017-12-18 '\n '15:58:11',\n 'DefaultServerSideEncryption': { 'EncryptionType': 'SSEAlgorithm',\n 'KmsMasterKeyArn': 'arn:aws:kms:region:123456789012:key/key-id'},\n 'Name': 'bucketName',\n 'Owner': { 'Id': 'CanonicalId '\n 'of Owner'},\n 'PublicAccess': { 'EffectivePermission': 'NOT_PUBLIC',\n 'PermissionConfiguration': { 'AccountLevelPermissions': { 'BlockPublicAccess': { 'BlockPublicAcls': False,\n 'BlockPublicPolicy': False,\n 'IgnorePublicAcls': False,\n 'RestrictPublicBuckets': False}},\n 'BucketLevelPermissions': { 'AccessControlList': { 'AllowsPublicReadAccess': False,\n 'AllowsPublicWriteAccess': False},\n 'BlockPublicAccess': { 'BlockPublicAcls': False,\n 'BlockPublicPolicy': False,\n 'IgnorePublicAcls': False,\n 'RestrictPublicBuckets': False},\n 'BucketPolicy': { 'AllowsPublicReadAccess': False,\n 
'AllowsPublicWriteAccess': False}}}},\n 'Tags': [ { 'Key': 'foo',\n 'Value': 'bar'}],\n 'Type': 'Destination'}]},\n 'SchemaVersion': '2.0',\n 'Service': { 'Action': { 'ActionType': 'AWS_API_CALL',\n 'AwsApiCallAction': { 'Api': 'GeneratedFindingAPIName',\n 'CallerType': 'Remote '\n 'IP',\n 'RemoteIpDetails': { 'City': { 'CityName': 'GeneratedFindingCityName'},\n 'Country': { 'CountryName': 'GeneratedFindingCountryName'},\n 'GeoLocation': { 'Lat': 0,\n 'Lon': 0},\n 'IpAddressV4': '198.51.100.0',\n 'Organization': { 'Asn': '-1',\n 'AsnOrg': 'GeneratedFindingASNOrg',\n 'Isp': 'GeneratedFindingISP',\n 'Org': 'GeneratedFindingORG'}},\n 'ServiceName': 'GeneratedFindingAPIServiceName'}},\n 'Archived': False,\n 'Count': 4,\n 'DetectorId': 'f2baedb0ac74f8f42fc929e15f56da6a',\n 'EventFirstSeen': '2020-11-25T13:46:37.960Z',\n 'EventLastSeen': '2020-11-26T15:18:12.620Z',\n 'ResourceRole': 'TARGET',\n 'ServiceName': 'guardduty'},\n 'Severity': 2,\n 'Title': 'API GeneratedFindingAPIName was invoked from an IP address on a '\n 'custom threat list.',\n 'Type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',\n 'UpdatedAt': '2020-11-26T15:18:12.620Z'}"}}]} ),
"finding_payload_data_tables":(
{"gd_finding_overview": [{"cells": {"count": {"value": "4"},
"severity": {"value": "2"},
"query_execution_date": {"value": "2021-01-22 15:45:26"},
"resource_id": {"value": "bucketName"},
"created_at": {"value": "2020-11-25T13:46:37.960Z"},
"updated_at": {"value": "2020-11-26T15:18:12.620Z"},
"region": {"value": "us-west-2"},
"account_id": {"value": "834299573936"}}}
],
"gd_instance_details": [{"cells": {"public_ip": {"value": "198.51.100.0"},
"private_dns_name": {"value": "GeneratedFindingPrivateDnsName"},
"query_execution_date": {"value": "2021-01-22 15:45:26"},
"instance_id": {"value": "i-99999999"},
"instance_state": {"value": "running"},
"public_dns_name": {"value": "GeneratedFindingPublicDNSName"},
"type": {"value": "m3.xlarge"}, "private_ip": {"value": "10.0.0.1"}}}
],
"gd_access_key_details": [{"cells": {"principal_id": {"value": "GeneratedFindingPrincipalId"},
"user_name": {"value": "GeneratedFindingUserName"},
"user_type": {"value": "IAMUser"},
"query_execution_date": {"value": "2021-01-22 15:45:26"},
"access_key_id": {"value": "GeneratedFindingAccessKeyId"}}}
],
"gd_resource_affected": [{"cells": {"instance_id": {"value": "bucketName"},
"instance_type": {"value": "Destination"},
"resource_role": {"value": "TARGET"},
"query_execution_date": {"value": "2021-01-22 15:45:26"},
"resource_type": {"value": "S3Bucket"}}}
],
"gd_action_details": [{"cells": {"action_type": {"value": "AWS_API_CALL"},
"query_execution_date": {"value": "2021-01-22 15:45:26"},
"actor_caller_type": {"value": "Remote IP"},
"service_name": {"value": "GeneratedFindingAPIServiceName"},
"isp": {"value": "GeneratedFindingISP"},
"action_api": {"value": "GeneratedFindingAPIName"},
"country_name": {"value": "GeneratedFindingCountryName"},
"event_first_seen": {"value": "2020-11-25T13:46:37.960Z"},
"event_last_seen": {"value": "2020-11-26T15:18:12.620Z"},
"asn_org": {"value": "GeneratedFindingASNOrg"},
"city_name": {"value": "GeneratedFindingCityName"},
"org": {"value": "GeneratedFindingORG"},
"asn": {"value": "-1"},
"remote_ip": {"value": "198.51.100.0"}}}
],
"gd_s3_bucket_details": [{"cells": {"bucket_type": {"value": "Destination"},
"query_execution_date": {"value": "2021-01-22 15:45:26"},
"encryption_type": {"value": "SSEAlgorithm"},
"bucket_name": {"value": "bucketName"},
"kms_master_key_arn": {
"value": "arn:aws:kms:region:123456789012:key/key-id"},
"bucket_arn": {"value": "arn:aws:s3:::bucketName"},
"bucket_owner": {"value": "CanonicalId of Owner"},
"effective_permissions": {"value": "NOT_PUBLIC"}}}
]
}
),
"replace_datetime": (
{'description': {
'content': 'An API was used to access a bucket from an IP address on a custom threat list.',
'format': 'text'},
'discovered_date': '2020-11-25T13:46:37.960Z',
'name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'properties': {'aws_guardduty_archived': 'False',
'aws_guardduty_count': '4',
'aws_guardduty_detector_id': 'f2baedb0ac74f8f42fc929e15f56da6a',
'aws_guardduty_finding_arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_id': '60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
'aws_guardduty_finding_updated_at': '2020-11-26T15:18:12.620Z',
'aws_guardduty_region': 'us-west-2',
'aws_guardduty_resource_type': 'S3Bucket',
'aws_guardduty_severity': '2'},
'severity_code': 'Low'}
),
"refresh_finding_no_artifacts": (
{
'name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'description': {'format': 'text',
'content': 'An API was used to access a bucket from an IP address on a custom threat list.'},
'discovered_date': '2020-11-25T13:46:37.960Z', 'severity_code': 'Low',
'properties': {'aws_guardduty_finding_id': '60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
'aws_guardduty_finding_updated_at': '2020-11-26T15:18:12.620Z',
'aws_guardduty_region': 'us-west-2', 'aws_guardduty_severity': '2',
'aws_guardduty_resource_type': 'S3Bucket',
'aws_guardduty_detector_id': 'f2baedb0ac74f8f42fc929e15f56da6a',
'aws_guardduty_count': '4', 'aws_guardduty_archived': 'False'}, 'artifacts': [
{'type': {'name': 'aws_iam_access_key_id'}, 'description': {'format': 'text',
'content': "'AWS IAM Access Key ID' extracted from GuardDuty from finding property 'AccessKeyId' at path '['Resource', 'AccessKeyDetails']'."},
'value': 'GeneratedFindingAccessKeyId'}, {'type': {'name': 'aws_iam_user_name'},
'description': {'format': 'text',
'content': "'AWS IAM User Name' extracted from GuardDuty from finding property 'UserName' at path '['Resource', 'AccessKeyDetails']'."},
'value': 'GeneratedFindingUserName'},
{'type': {'name': 'aws_s3_bucket_name'}, 'description': {'format': 'text',
'content': "'AWS S3 Bucket Name' extracted from GuardDuty from finding property 'Name' at path '['Resource', 'S3BucketDetails', 0]'."},
'value': 'bucketName'}, {'type': {'name': 'IP Address'}, 'description': {'format': 'text',
'content': "'IP Address' extracted from GuardDuty from finding property 'IpAddressV4' at path '['Service', 'Action', 'AwsApiCallAction', 'RemoteIpDetails']'."},
'value': '198.51.100.0'}, {'type': {'name': 'IP Address'},
'description': {'format': 'text',
'content': "'IP Address' extracted from GuardDuty from finding property 'PrivateIpAddress' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': '10.0.0.1'},
{'type': {'name': 'DNS Name'}, 'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PublicDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': 'GeneratedFindingPublicDNSName'}, {'type': {'name': 'DNS Name'},
'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': 'GeneratedFindingPrivateDnsName'},
{'type': {'name': 'DNS Name'}, 'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0, 'PrivateIpAddresses', 0]'."},
'value': 'GeneratedFindingPrivateName'}], 'comments': []}
),
"refresh_finding_no_artifacts_py2": (
{u'discovered_date': u'2020-11-25T13:46:37.960Z', u'description': {
u'content': u'An API was used to access a bucket from an IP address on a custom threat list.',
u'format': u'text'}, u'artifacts': [{u'type': {u'name': u'DNS Name'}, u'description': {
u'content': u"'DNS Name' extracted from GuardDuty from finding property 'PublicDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'.",
u'format': u'text'}, u'value': u'GeneratedFindingPublicDNSName'},
{u'type': {u'name': u'aws_iam_user_name'}, u'description': {
u'content': u"'AWS IAM User Name' extracted from GuardDuty from finding property 'UserName' at path '['Resource', 'AccessKeyDetails']'.",
u'format': u'text'}, u'value': u'GeneratedFindingUserName'},
{u'type': {u'name': u'aws_s3_bucket_name'}, u'description': {
u'content': u"'AWS S3 Bucket Name' extracted from GuardDuty from finding property 'Name' at path '['Resource', 'S3BucketDetails', 0]'.",
u'format': u'text'}, u'value': u'bucketName'},
{u'type': {u'name': u'aws_iam_access_key_id'}, u'description': {
u'content': u"'AWS IAM Access Key ID' extracted from GuardDuty from finding property 'AccessKeyId' at path '['Resource', 'AccessKeyDetails']'.",
u'format': u'text'}, u'value': u'GeneratedFindingAccessKeyId'},
{u'type': {u'name': u'DNS Name'}, u'description': {
u'content': u"'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0, 'PrivateIpAddresses', 0]'.",
u'format': u'text'}, u'value': u'GeneratedFindingPrivateName'},
{u'type': {u'name': u'IP Address'}, u'description': {
u'content': u"'IP Address' extracted from GuardDuty from finding property 'IpAddressV4' at path '['Service', 'Action', 'AwsApiCallAction', 'RemoteIpDetails']'.",
u'format': u'text'}, u'value': u'198.51.100.0'},
{u'type': {u'name': u'IP Address'}, u'description': {
u'content': u"'IP Address' extracted from GuardDuty from finding property 'PrivateIpAddress' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0, 'PrivateIpAddresses', 0]'.",
u'format': u'text'}, u'value': u'10.0.0.1'},
{u'type': {u'name': u'DNS Name'}, u'description': {
u'content': u"'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'.",
u'format': u'text'},
u'value': u'GeneratedFindingPrivateDnsName'}],
u'severity_code': u'Low', u'comments': [], u'properties': {u'aws_guardduty_resource_type': u'S3Bucket',
u'aws_guardduty_finding_updated_at': u'2020-11-26T15:18:12.620Z',
u'aws_guardduty_count': '4',
u'aws_guardduty_finding_arn': u'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
u'aws_guardduty_detector_id': u'f2baedb0ac74f8f42fc929e15f56da6a',
u'aws_guardduty_finding_type': u'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
u'aws_guardduty_finding_id': u'60baffd3f9042e38640f2300d5c5a631',
u'aws_guardduty_archived': u'False',
u'aws_guardduty_region': u'us-west-2',
u'aws_guardduty_severity': '2'},
u'name': u'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.'}
),
"refresh_finding_with_artifacts": (
[]
),
"refresh_finding_to_json": (
{'timestamp': '2021-01-22 15:45:26',
'region': 'us-west-2',
'finding': {'AccountId': '834299573936',
'Arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
'CreatedAt': '2020-11-25T13:46:37.960Z',
'Description': 'An API was used to access a bucket from an IP address on a custom threat list.',
'Id': '60baffd3f9042e38640f2300d5c5a631', 'Partition': 'aws', 'Region': 'us-west-2',
'Resource': {'AccessKeyDetails': {'AccessKeyId': 'GeneratedFindingAccessKeyId',
'PrincipalId': 'GeneratedFindingPrincipalId',
'UserName': 'GeneratedFindingUserName',
'UserType': 'IAMUser'}, 'S3BucketDetails': [
{'Arn': 'arn:aws:s3:::bucketName', 'Name': 'bucketName', 'Type': 'Destination',
'CreatedAt': '2017-12-18 15:58:11', 'Owner': {'Id': 'CanonicalId of Owner'},
'Tags': [{'Key': 'foo', 'Value': 'bar'}],
'DefaultServerSideEncryption': {'EncryptionType': 'SSEAlgorithm',
'KmsMasterKeyArn': 'arn:aws:kms:region:123456789012:key/key-id'},
'PublicAccess': {'PermissionConfiguration': {'BucketLevelPermissions': {
'AccessControlList': {'AllowsPublicReadAccess': False,
'AllowsPublicWriteAccess': False},
'BucketPolicy': {'AllowsPublicReadAccess': False, 'AllowsPublicWriteAccess': False},
'BlockPublicAccess': {'IgnorePublicAcls': False, 'RestrictPublicBuckets': False,
'BlockPublicAcls': False, 'BlockPublicPolicy': False}},
'AccountLevelPermissions': {
'BlockPublicAccess': {
'IgnorePublicAcls': False,
'RestrictPublicBuckets': False,
'BlockPublicAcls': False,
'BlockPublicPolicy': False}}},
'EffectivePermission': 'NOT_PUBLIC'}}],
'InstanceDetails': {'AvailabilityZone': 'GeneratedFindingInstaceAvailabilityZone',
'IamInstanceProfile': {
'Arn': 'arn:aws:iam::834299573936:example/instance/profile',
'Id': 'GeneratedFindingInstanceProfileId'},
'ImageDescription': 'GeneratedFindingInstaceImageDescription',
'ImageId': 'ami-99999999', 'InstanceId': 'i-99999999',
'InstanceState': 'running', 'InstanceType': 'm3.xlarge',
'OutpostArn': 'arn:aws:outposts:us-west-2:123456789000:outpost/op-0fbc006e9abbc73c3',
'LaunchTime': '2016-08-02T02:05:06Z', 'NetworkInterfaces': [
{'Ipv6Addresses': [], 'NetworkInterfaceId': 'eni-bfcffe88',
'PrivateDnsName': 'GeneratedFindingPrivateDnsName',
'PrivateIpAddress': '10.0.0.1', 'PrivateIpAddresses': [
{'PrivateDnsName': 'GeneratedFindingPrivateName',
'PrivateIpAddress': '10.0.0.1'}],
'PublicDnsName': 'GeneratedFindingPublicDNSName',
'PublicIp': '198.51.100.0', 'SecurityGroups': [
{'GroupId': 'GeneratedFindingSecurityId',
'GroupName': 'GeneratedFindingSecurityGroupName'}],
'SubnetId': 'GeneratedFindingSubnetId',
'VpcId': 'GeneratedFindingVPCId'}], 'ProductCodes': [{}], 'Tags': [
{'Key': 'GeneratedFindingInstaceTag1',
'Value': 'GeneratedFindingInstaceValue1'},
{'Key': 'GeneratedFindingInstaceTag2',
'Value': 'GeneratedFindingInstaceTagValue2'},
{'Key': 'GeneratedFindingInstaceTag3',
'Value': 'GeneratedFindingInstaceTagValue3'},
{'Key': 'GeneratedFindingInstaceTag4',
'Value': 'GeneratedFindingInstaceTagValue4'},
{'Key': 'GeneratedFindingInstaceTag5',
'Value': 'GeneratedFindingInstaceTagValue5'},
{'Key': 'GeneratedFindingInstaceTag6',
'Value': 'GeneratedFindingInstaceTagValue6'},
{'Key': 'GeneratedFindingInstaceTag7',
'Value': 'GeneratedFindingInstaceTagValue7'},
{'Key': 'GeneratedFindingInstaceTag8',
'Value': 'GeneratedFindingInstaceTagValue8'},
{'Key': 'GeneratedFindingInstaceTag9',
'Value': 'GeneratedFindingInstaceTagValue9'}]},
'ResourceType': 'S3Bucket'}, 'SchemaVersion': '2.0', 'Service': {
'Action': {'ActionType': 'AWS_API_CALL',
'AwsApiCallAction': {'Api': 'GeneratedFindingAPIName', 'CallerType': 'Remote IP',
'RemoteIpDetails': {
'City': {'CityName': 'GeneratedFindingCityName'},
'Country': {'CountryName': 'GeneratedFindingCountryName'},
'GeoLocation': {'Lat': 0, 'Lon': 0},
'IpAddressV4': '198.51.100.0', 'Organization': {'Asn': '-1',
'AsnOrg': 'GeneratedFindingASNOrg',
'Isp': 'GeneratedFindingISP',
'Org': 'GeneratedFindingORG'}},
'ServiceName': 'GeneratedFindingAPIServiceName'}},
'Archived': False, 'Count': 4, 'DetectorId': 'f2baedb0ac74f8f42fc929e15f56da6a',
'EventFirstSeen': '2020-11-25T13:46:37.960Z', 'EventLastSeen': '2020-11-26T15:18:12.620Z',
'ResourceRole': 'TARGET', 'ServiceName': 'guardduty'}, 'Severity': 2,
'Title': 'API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'Type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
'UpdatedAt': '2020-11-26T15:18:12.620Z'}, 'payload': {
'name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'description': {'format': 'text',
'content': 'An API was used to access a bucket from an IP address on a custom threat list.'},
'discovered_date': '2020-11-25T13:46:37.960Z', 'severity_code': 'Low',
'properties': {'aws_guardduty_finding_id': '60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
'aws_guardduty_finding_updated_at': '2020-11-26T15:18:12.620Z',
'aws_guardduty_region': 'us-west-2', 'aws_guardduty_resource_type': 'S3Bucket',
'aws_guardduty_count': '4',
'aws_guardduty_detector_id': 'f2baedb0ac74f8f42fc929e15f56da6a'}, 'artifacts': [
{'type': {'name': 'aws_iam_access_key_id'}, 'description': {'format': 'text',
'content': "'AWS IAM Access Key ID' extracted from GuardDuty from finding property 'AccessKeyId' at path '['Resource', 'AccessKeyDetails']'."},
'value': 'GeneratedFindingAccessKeyId'}, {'type': {'name': 'aws_iam_user_name'},
'description': {'format': 'text',
'content': "'AWS IAM User Name' extracted from GuardDuty from finding property 'UserName' at path '['Resource', 'AccessKeyDetails']'."},
'value': 'GeneratedFindingUserName'},
{'type': {'name': 'IP Address'}, 'description': {'format': 'text',
'content': "'IP Address' extracted from GuardDuty from finding property 'IpAddressV4' at path '['Service', 'Action', 'AwsApiCallAction', 'RemoteIpDetails']'."},
'value': '198.51.100.0'}, {'type': {'name': 'IP Address'}, 'description': {'format': 'text',
'content': "'IP Address' extracted from GuardDuty from finding property 'PrivateIpAddress' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': '10.0.0.1'}, {'type': {'name': 'DNS Name'},
'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PublicDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': 'GeneratedFindingPublicDNSName'},
{'type': {'name': 'DNS Name'}, 'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': 'GeneratedFindingPrivateDnsName'}, {'type': {'name': 'DNS Name'},
'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0, 'PrivateIpAddresses', 0]'."},
'value': 'GeneratedFindingPrivateName'}],
'comments': [{'text': {'format': 'text',
'content': "AWS GuardDuty finding Payload:\n{ 'AccountId': '834299573936',\n 'Arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',\n 'CreatedAt': '2020-11-25T13:46:37.960Z',\n 'Description': 'An API was used to access a bucket from an IP address on a '\n 'custom threat list.',\n 'Id': '60baffd3f9042e38640f2300d5c5a631',\n 'Partition': 'aws',\n 'Region': 'us-west-2',\n 'Resource': { 'AccessKeyDetails': { 'AccessKeyId': 'GeneratedFindingAccessKeyId',\n 'PrincipalId': 'GeneratedFindingPrincipalId',\n 'UserName': 'GeneratedFindingUserName',\n 'UserType': 'IAMUser'},\n 'InstanceDetails': { 'AvailabilityZone': 'GeneratedFindingInstaceAvailabilityZone',\n 'IamInstanceProfile': { 'Arn': 'arn:aws:iam::834299573936:example/instance/profile',\n 'Id': 'GeneratedFindingInstanceProfileId'},\n 'ImageDescription': 'GeneratedFindingInstaceImageDescription',\n 'ImageId': 'ami-99999999',\n 'InstanceId': 'i-99999999',\n 'InstanceState': 'running',\n 'InstanceType': 'm3.xlarge',\n 'LaunchTime': '2016-08-02T02:05:06Z',\n 'NetworkInterfaces': [ { 'Ipv6Addresses': [ ],\n 'NetworkInterfaceId': 'eni-bfcffe88',\n 'PrivateDnsName': 'GeneratedFindingPrivateDnsName',\n 'PrivateIpAddress': '10.0.0.1',\n 'PrivateIpAddresses': [ { 'PrivateDnsName': 'GeneratedFindingPrivateName',\n 'PrivateIpAddress': '10.0.0.1'}],\n 'PublicDnsName': 'GeneratedFindingPublicDNSName',\n 'PublicIp': '198.51.100.0',\n 'SecurityGroups': [ { 'GroupId': 'GeneratedFindingSecurityId',\n 'GroupName': 'GeneratedFindingSecurityGroupName'}],\n 'SubnetId': 'GeneratedFindingSubnetId',\n 'VpcId': 'GeneratedFindingVPCId'}],\n 'OutpostArn': 'arn:aws:outposts:us-west-2:123456789000:outpost/op-0fbc006e9abbc73c3',\n 'ProductCodes': [{}],\n 'Tags': [ { 'Key': 'GeneratedFindingInstaceTag1',\n 'Value': 'GeneratedFindingInstaceValue1'},\n { 'Key': 'GeneratedFindingInstaceTag2',\n 'Value': 'GeneratedFindingInstaceTagValue2'},\n { 'Key': 'GeneratedFindingInstaceTag3',\n 'Value': 'GeneratedFindingInstaceTagValue3'},\n { 'Key': 'GeneratedFindingInstaceTag4',\n 'Value': 'GeneratedFindingInstaceTagValue4'},\n { 'Key': 'GeneratedFindingInstaceTag5',\n 'Value': 'GeneratedFindingInstaceTagValue5'},\n { 'Key': 'GeneratedFindingInstaceTag6',\n 'Value': 'GeneratedFindingInstaceTagValue6'},\n { 'Key': 'GeneratedFindingInstaceTag7',\n 'Value': 'GeneratedFindingInstaceTagValue7'},\n { 'Key': 'GeneratedFindingInstaceTag8',\n 'Value': 'GeneratedFindingInstaceTagValue8'},\n { 'Key': 'GeneratedFindingInstaceTag9',\n 'Value': 'GeneratedFindingInstaceTagValue9'}]},\n 'ResourceType': 'S3Bucket',\n 'S3BucketDetails': [ { 'Arn': 'arn:aws:s3:::bucketName',\n 'CreatedAt': '2017-12-18 '\n '15:58:11',\n 'DefaultServerSideEncryption': { 'EncryptionType': 'SSEAlgorithm',\n 'KmsMasterKeyArn': 'arn:aws:kms:region:123456789012:key/key-id'},\n 'Name': 'bucketName',\n 'Owner': { 'Id': 'CanonicalId '\n 'of Owner'},\n 'PublicAccess': { 'EffectivePermission': 'NOT_PUBLIC',\n 'PermissionConfiguration': { 'AccountLevelPermissions': { 'BlockPublicAccess': { 'BlockPublicAcls': False,\n 'BlockPublicPolicy': False,\n 'IgnorePublicAcls': False,\n 'RestrictPublicBuckets': False}},\n 'BucketLevelPermissions': { 'AccessControlList': { 'AllowsPublicReadAccess': False,\n 'AllowsPublicWriteAccess': False},\n 'BlockPublicAccess': { 'BlockPublicAcls': False,\n 'BlockPublicPolicy': False,\n 'IgnorePublicAcls': False,\n 'RestrictPublicBuckets': False},\n 'BucketPolicy': { 'AllowsPublicReadAccess': False,\n 'AllowsPublicWriteAccess': 
False}}}},\n 'Tags': [ { 'Key': 'foo',\n 'Value': 'bar'}],\n 'Type': 'Destination'}]},\n 'SchemaVersion': '2.0',\n 'Service': { 'Action': { 'ActionType': 'AWS_API_CALL',\n 'AwsApiCallAction': { 'Api': 'GeneratedFindingAPIName',\n 'CallerType': 'Remote '\n 'IP',\n 'RemoteIpDetails': { 'City': { 'CityName': 'GeneratedFindingCityName'},\n 'Country': { 'CountryName': 'GeneratedFindingCountryName'},\n 'GeoLocation': { 'Lat': 0,\n 'Lon': 0},\n 'IpAddressV4': '198.51.100.0',\n 'Organization': { 'Asn': '-1',\n 'AsnOrg': 'GeneratedFindingASNOrg',\n 'Isp': 'GeneratedFindingISP',\n 'Org': 'GeneratedFindingORG'}},\n 'ServiceName': 'GeneratedFindingAPIServiceName'}},\n 'Archived': False,\n 'Count': 4,\n 'DetectorId': 'f2baedb0ac74f8f42fc929e15f56da6a',\n 'EventFirstSeen': '2020-11-25T13:46:37.960Z',\n 'EventLastSeen': '2020-11-26T15:18:12.620Z',\n 'ResourceRole': 'TARGET',\n 'ServiceName': 'guardduty'},\n 'Severity': 2,\n 'Title': 'API GeneratedFindingAPIName was invoked from an IP address on a '\n 'custom threat list.',\n 'Type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',\n 'UpdatedAt': '2020-11-26T15:18:12.620Z'}"}}]},
'data_tables': {'gd_action_details': [{'cells': {'action_type': {'value': 'AWS_API_CALL'},
'action_api': {'value': 'GeneratedFindingAPIName'},
'event_first_seen': {'value': '2020-11-25T13:46:37.960Z'},
'event_last_seen': {'value': '2020-11-26T15:18:12.620Z'},
'actor_caller_type': {'value': 'Remote IP'},
'city_name': {'value': 'GeneratedFindingCityName'},
'country_name': {'value': 'GeneratedFindingCountryName'},
'asn': {'value': '-1'},
'asn_org': {'value': 'GeneratedFindingASNOrg'},
'isp': {'value': 'GeneratedFindingISP'},
'org': {'value': 'GeneratedFindingORG'},
'action_service_name': {
'value': 'GeneratedFindingAPIServiceName'},
'remote_ip': {'value': '198.51.100.0'}}}],
'gd_resource_affected': [{'cells': {'resource_type': {'value': 'S3Bucket'},
'instance_id': {'value': 'bucketName'},
'instance_type': {'value': 'Destination'},
'instance_state': {'value': 'running'},
'resource_role': {'value': 'TARGET'},
'instance_private_ip': {'value': '10.0.0.1'},
'instance_private_dns': {
'value': 'GeneratedFindingPrivateName'},
'instance_public_ip': {'value': '198.51.100.0'},
'instance_public_dns': {
'value': 'GeneratedFindingPublicDNSName'},
's3bucket_name': {'value': 'bucketName'},
's3bucket_owner': {
'value': 'CanonicalId of Owner'}}}]}}
)
}
return response[type]
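# Illustrative usage sketch: a test could compare the payload the integration builds
# against one of the expected results above. `built_payload` in the example comment is a
# hypothetical placeholder for whatever the code under test produces; only the
# comparison helper is shown here.
def assert_payload_matches(actual, expected):
    """Compare the stable payload fields, ignoring volatile ones such as comments."""
    for key in ("name", "description", "discovered_date", "severity_code", "properties"):
        assert actual[key] == expected[key], "payload mismatch on '{0}'".format(key)

# Example (hypothetical):
#     assert_payload_matches(built_payload, get_mocked_results("finding_payload_no_artifacts"))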
## Mock finding data - raw findings before the datetime-replacement and unicode-conversion helpers are applied.
def get_mocked_finding_data(type):
response = {
"replace_datetime_finding":({"Severity": 7,
"CreatedAt": datetime.datetime(2017, 12, 18, 15, 58, 11, 551000, tzinfo=tzlocal()),
"Dates": [{'TestDate': datetime.datetime(2017, 10, 16, 13, 56, 10, 551000, tzinfo=tzlocal())}],
"OtherDates": {'TestDate2': datetime.datetime(2016, 10, 16, 13, 56, 10, 551000, tzinfo=tzlocal())}
}
),
"convert_unicode_finding": ({u"Severity": 7,
u"CreatedAt": u"2017-12-26T15:18:12.620Z",
u"Dates": [{'TestDate': u"2019-11-26T15:18:12.620Z"}],
u"OtherDates": {u'TestDate2': u"2020-11-26T15:18:12.620Z"}
}
),
}
return response[type]
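# Illustrative only: the "replace_datetime_finding" fixture above carries raw
# datetime objects, which the integration is expected to turn into string
# timestamps before the finding is serialized (compare the '2017-12-18 15:58:11'
# value embedded in the mocked payload comments). The helper below is a hedged
# sketch of that idea, not the integration's actual implementation.
import datetime

def replace_datetimes_sketch(obj):
    """Recursively convert datetime values in a finding to string timestamps."""
    if isinstance(obj, datetime.datetime):
        return obj.strftime("%Y-%m-%d %H:%M:%S")
    if isinstance(obj, dict):
        return dict((key, replace_datetimes_sketch(value)) for key, value in obj.items())
    if isinstance(obj, list):
        return [replace_datetimes_sketch(item) for item in obj]
    return obj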
## Mocked Resilient responses (incident search and creation) for standalone tests.
def get_resilient_responses(op):
response = {
"find_resilient_incident_for_req": (
[{'name': 'AWS GuardDuty: API UpdateLoginProfile was invoked using root credentials.',
'description': '<div>API UpdateLoginProfile was invoked using root credentials from IP address 173.48.174.220.</div>',
'phase_id': 1009, 'inc_training': False, 'id': 2167, 'sequence_code': '5908-73',
'discovered_date': 1605624125336, 'due_date': None, 'create_date': 1610387339603, 'owner_id': 4,
'severity_code': 100, 'plan_status': 'A'}
]
),
"create_incident": (
{'dtm': {}, 'cm': {'unassigneds': [], 'total': 0, 'geo_counts': {}}, 'regulators': {'ids': []},
'hipaa': {'hipaa_adverse': None, 'hipaa_misused': None, 'hipaa_acquired': None,
'hipaa_additional_misuse': None, 'hipaa_breach': None, 'hipaa_adverse_comment': None,
'hipaa_misused_comment': None, 'hipaa_acquired_comment': None,
'hipaa_additional_misuse_comment': None, 'hipaa_breach_comment': None}, 'tasks': None,
'artifacts': [{'id': 1008, 'type': 2, 'value': 'GeneratedFindingPrivateDnsName',
'description': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'.",
'attachment': None, 'parent_id': None,
'creator': {'id': 4, 'fname': 'Resilient', 'lname': 'Sysadmin',
'display_name': 'Resilient Sysadmin', 'status': 'A', 'email': 'a@a.com',
'locked': False, 'password_changed': False, 'is_external': False},
'inc_id': 2239,
'inc_name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'inc_owner': 4, 'hits': [], 'created': 1610391037106, 'last_modified_time': 1610391037108,
'last_modified_by': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'}, 'pending_sources': [],
'perms': {'read': True, 'write': True, 'delete': True}, 'properties': None,
'whois': {'pending': False, 'invalid': True}, 'actions': [],
'hash': '5c846bd15c8a1acdf13163d01e7cf74a5e7babea8f52563aa1f2a5aba6716447',
'relating': None, 'creator_principal': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'},
'ip': {'source': None, 'destination': None}},
{'id': 1009, 'type': 2, 'value': 'GeneratedFindingPrivateName',
'description': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0, 'PrivateIpAddresses', 0]'.",
'attachment': None, 'parent_id': None,
'creator': {'id': 4, 'fname': 'Resilient', 'lname': 'Sysadmin',
'display_name': 'Resilient Sysadmin', 'status': 'A', 'email': 'a@a.com',
'locked': False, 'password_changed': False, 'is_external': False},
'inc_id': 2239,
'inc_name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'inc_owner': 4, 'hits': [], 'created': 1610391037123, 'last_modified_time': 1610391037125,
'last_modified_by': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'}, 'pending_sources': [],
'perms': {'read': True, 'write': True, 'delete': True}, 'properties': None,
'whois': {'pending': False, 'invalid': True}, 'actions': [],
'hash': 'c666da9ec698918d2dcd7639f73600bf9df58740d4200e344714e6adcc5c4605',
'relating': None, 'creator_principal': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'},
'ip': {'source': None, 'destination': None}},
{'id': 1004, 'type': 1082, 'value': 'GeneratedFindingUserName',
'description': "'AWS IAM User Name' extracted from GuardDuty from finding property 'UserName' at path '['Resource', 'AccessKeyDetails']'.",
'attachment': None, 'parent_id': None,
'creator': {'id': 4, 'fname': 'Resilient', 'lname': 'Sysadmin',
'display_name': 'Resilient Sysadmin', 'status': 'A', 'email': 'a@a.com',
'locked': False, 'password_changed': False, 'is_external': False},
'inc_id': 2239,
'inc_name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'inc_owner': 4, 'hits': [], 'created': 1610391037036, 'last_modified_time': 1610391037039,
'last_modified_by': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'}, 'pending_sources': [],
'perms': {'read': True, 'write': True, 'delete': True}, 'properties': None, 'actions': [],
'hash': 'e18bc81b2e031729d3b74245f7e3d42efdf9a8fffc2d484aa424b2f2073e8bfe',
'relating': None, 'creator_principal': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'},
'ip': {'source': None, 'destination': None}},
{'id': 1005, 'type': 1, 'value': '198.51.100.0',
'description': "'IP Address' extracted from GuardDuty from finding property 'IpAddressV4' at path '['Service', 'Action', 'AwsApiCallAction', 'RemoteIpDetails']'.",
'attachment': None, 'parent_id': None,
'creator': {'id': 4, 'fname': 'Resilient', 'lname': 'Sysadmin',
'display_name': 'Resilient Sysadmin', 'status': 'A', 'email': 'a@a.com',
'locked': False, 'password_changed': False, 'is_external': False},
'inc_id': 2239,
'inc_name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'inc_owner': 4, 'hits': [], 'created': 1610391037053, 'last_modified_time': 1610391037055,
'last_modified_by': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'}, 'pending_sources': [],
'perms': {'read': True, 'write': True, 'delete': True}, 'properties': None, 'actions': [],
'hash': '74d22f6d57644f8a2e67577dec95a74996f8b115f4ea20c4e5bb32b2981111a7',
'relating': None, 'creator_principal': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'},
'ip': {'source': None, 'destination': None}},
{'id': 1003, 'type': 1080, 'value': 'GeneratedFindingAccessKeyId',
'description': "'AWS IAM Access Key ID' extracted from GuardDuty from finding property 'AccessKeyId' at path '['Resource', 'AccessKeyDetails']'.",
'attachment': None, 'parent_id': None,
'creator': {'id': 4, 'fname': 'Resilient', 'lname': 'Sysadmin',
'display_name': 'Resilient Sysadmin', 'status': 'A', 'email': 'a@a.com',
'locked': False, 'password_changed': False, 'is_external': False},
'inc_id': 2239,
'inc_name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'inc_owner': 4, 'hits': [], 'created': 1610391037016, 'last_modified_time': 1610391037020,
'last_modified_by': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'}, 'pending_sources': [],
'perms': {'read': True, 'write': True, 'delete': True}, 'properties': None, 'actions': [],
'hash': '83376857d0fb2499b26c6a6152d92d6dd5bef5380e54b6cd0a3c91161f6f43e9',
'relating': None, 'creator_principal': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'},
'ip': {'source': None, 'destination': None}}, {'id': 1006, 'type': 1, 'value': '10.0.0.1',
'description': "'IP Address' extracted from GuardDuty from finding property 'PrivateIpAddress' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'.",
'attachment': None, 'parent_id': None,
'creator': {'id': 4, 'fname': 'Resilient',
'lname': 'Sysadmin',
'display_name': 'Resilient Sysadmin',
'status': 'A',
'email': 'a@a.com',
'locked': False,
'password_changed': False,
'is_external': False},
'inc_id': 2239,
'inc_name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'inc_owner': 4, 'hits': [],
'created': 1610391037070,
'last_modified_time': 1610391037072,
'last_modified_by': {'id': 4, 'type': 'user',
'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'},
'pending_sources': [],
'perms': {'read': True, 'write': True,
'delete': True},
'properties': None, 'actions': [],
'hash': 'a4ebc6e0512c26fd6ae1df8ad942c5008a7fdfa7e1339168c8edd94ae488e77d',
'relating': None,
'creator_principal': {'id': 4,
'type': 'user',
'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'},
'ip': {'source': None, 'destination': None}},
{'id': 1007, 'type': 2, 'value': 'GeneratedFindingPublicDNSName',
'description': "'DNS Name' extracted from GuardDuty from finding property 'PublicDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'.",
'attachment': None, 'parent_id': None,
'creator': {'id': 4, 'fname': 'Resilient', 'lname': 'Sysadmin',
'display_name': 'Resilient Sysadmin', 'status': 'A', 'email': 'a@a.com',
'locked': False, 'password_changed': False, 'is_external': False},
'inc_id': 2239,
'inc_name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'inc_owner': 4, 'hits': [], 'created': 1610391037087, 'last_modified_time': 1610391037089,
'last_modified_by': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'}, 'pending_sources': [],
'perms': {'read': True, 'write': True, 'delete': True}, 'properties': None,
'whois': {'pending': False, 'invalid': True}, 'actions': [],
'hash': '3a733bc5b809b1920418bf12995676ef55c1ef066103874ed15ef5d8b0c46ee8',
'relating': None, 'creator_principal': {'id': 4, 'type': 'user', 'name': 'a@a.com',
'display_name': 'Resilient Sysadmin'},
'ip': {'source': None, 'destination': None}}],
'name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'description': '<div>An API was used to access a bucket from an IP address on a custom threat list.</div>',
'phase_id': 1009, 'inc_training': False, 'vers': 2, 'addr': None, 'city': None,
'creator': {'id': 4, 'fname': 'Resilient', 'lname': 'Sysadmin', 'display_name': 'Resilient Sysadmin',
'status': 'A', 'email': 'a@a.com', 'locked': False, 'password_changed': False,
'is_external': False},
'creator_principal': {'id': 4, 'type': 'user', 'name': 'a@a.com', 'display_name': 'Resilient Sysadmin'},
'exposure_type_id': 1, 'incident_type_ids': [], 'reporter': None, 'state': None, 'country': None,
'zip': None, 'workspace': 2, 'exposure': 0, 'org_handle': 202, 'members': [], 'negative_pr_likely': None,
'perms': {'read': True, 'write': True, 'comment': True, 'assign': True, 'close': True,
'change_members': True, 'attach_file': True, 'read_attachments': True,
'delete_attachments': True, 'create_milestones': True, 'list_milestones': True,
'create_artifacts': True, 'list_artifacts': True, 'delete': True, 'change_workspace': True},
'confirmed': False, 'task_changes': {'added': [], 'removed': []},
'assessment': '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>\n<assessment>\n <rollups/>\n <optional>There are 1 required and 0 optional tasks from 1 regulators.</optional>\n</assessment>\n',
'data_compromised': None, 'draft': False,
'properties': {'aws_guardduty_finding_id': '60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_updated_at': '2020-11-26T15:18:12.620Z', 'aws_guardduty_count': '4',
'internal_customizations_field': None,
'aws_guardduty_finding_type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
'aws_guardduty_resource_type': 'S3Bucket',
'aws_guardduty_detector_id': 'f2baedb0ac74f8f42fc929e15f56da6a',
'aws_guardduty_region': 'us-west-2'}, 'resolution_id': None, 'resolution_summary': None,
'pii': {'data_compromised': None, 'determined_date': 1606311997960, 'harmstatus_id': 2,
'data_encrypted': None, 'data_contained': None, 'impact_likely': None, 'ny_impact_likely': None,
'or_impact_likely': None, 'wa_impact_likely': None, 'dc_impact_likely': None,
'data_source_ids': [], 'data_format': None,
'assessment': '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>\n<assessment>\n <rollups/>\n <optional>There are 1 required and 0 optional tasks from 1 regulators.</optional>\n</assessment>\n',
'exposure': 0, 'gdpr_harm_risk': None, 'gdpr_lawful_data_processing_categories': [],
'alberta_health_risk_assessment': None},
'gdpr': {'gdpr_breach_circumstances': [], 'gdpr_breach_type': None, 'gdpr_personal_data': None,
'gdpr_identification': None, 'gdpr_consequences': None, 'gdpr_final_assessment': None,
'gdpr_breach_type_comment': None, 'gdpr_personal_data_comment': None,
'gdpr_identification_comment': None, 'gdpr_consequences_comment': None,
'gdpr_final_assessment_comment': None, 'gdpr_subsequent_notification': None},
'regulator_risk': {}, 'inc_last_modified_date': 1610391037140, 'comments': [
{'type': 'incident', 'id': 253, 'parent_id': None, 'user_id': 4, 'user_fname': 'Resilient',
'user_lname': 'Sysadmin',
'text': '<div>AWS GuardDuty finding Payload:<br/>{ 'AccountId': '834299573936',<br/> 'Arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',<br/> 'CreatedAt': '2020-11-25T13:46:37.960Z',<br/> 'Description': 'An API was used to access a bucket from an IP address on a '<br/> 'custom threat list.',<br/> 'Id': '60baffd3f9042e38640f2300d5c5a631',<br/> 'Partition': 'aws',<br/> 'Region': 'us-west-2',<br/> 'Resource': { 'AccessKeyDetails': { 'AccessKeyId': 'GeneratedFindingAccessKeyId',<br/> 'PrincipalId': 'GeneratedFindingPrincipalId',<br/> 'UserName': 'GeneratedFindingUserName',<br/> 'UserType': 'IAMUser'},<br/> 'InstanceDetails': { 'AvailabilityZone': 'GeneratedFindingInstaceAvailabilityZone',<br/> 'IamInstanceProfile': { 'Arn': 'arn:aws:iam::834299573936:example/instance/profile',<br/> 'Id': 'GeneratedFindingInstanceProfileId'},<br/> 'ImageDescription': 'GeneratedFindingInstaceImageDescription',<br/> 'ImageId': 'ami-99999999',<br/> 'InstanceId': 'i-99999999',<br/> 'InstanceState': 'running',<br/> 'InstanceType': 'm3.xlarge',<br/> 'LaunchTime': '2016-08-02T02:05:06Z',<br/> 'NetworkInterfaces': [ { 'Ipv6Addresses': [ ],<br/> 'NetworkInterfaceId': 'eni-bfcffe88',<br/> 'PrivateDnsName': 'GeneratedFindingPrivateDnsName',<br/> 'PrivateIpAddress': '10.0.0.1',<br/> 'PrivateIpAddresses': [ { 'PrivateDnsName': 'GeneratedFindingPrivateName',<br/> 'PrivateIpAddress': '10.0.0.1'}],<br/> 'PublicDnsName': 'GeneratedFindingPublicDNSName',<br/> 'PublicIp': '198.51.100.0',<br/> 'SecurityGroups': [ { 'GroupId': 'GeneratedFindingSecurityId',<br/> 'GroupName': 'GeneratedFindingSecurityGroupName'}],<br/> 'SubnetId': 'GeneratedFindingSubnetId',<br/> 'VpcId': 'GeneratedFindingVPCId'}],<br/> 'OutpostArn': 'arn:aws:outposts:us-west-2:123456789000:outpost/op-0fbc006e9abbc73c3',<br/> 'ProductCodes': [{}],<br/> 'Tags': [ { 'Key': 'GeneratedFindingInstaceTag1',<br/> 'Value': 'GeneratedFindingInstaceValue1'},<br/> { 'Key': 'GeneratedFindingInstaceTag2',<br/> 'Value': 'GeneratedFindingInstaceTagValue2'},<br/> { 'Key': 'GeneratedFindingInstaceTag3',<br/> 'Value': 'GeneratedFindingInstaceTagValue3'},<br/> { 'Key': 'GeneratedFindingInstaceTag4',<br/> 'Value': 'GeneratedFindingInstaceTagValue4'},<br/> { 'Key': 'GeneratedFindingInstaceTag5',<br/> 'Value': 'GeneratedFindingInstaceTagValue5'},<br/> { 'Key': 'GeneratedFindingInstaceTag6',<br/> 'Value': 'GeneratedFindingInstaceTagValue6'},<br/> { 'Key': 'GeneratedFindingInstaceTag7',<br/> 'Value': 'GeneratedFindingInstaceTagValue7'},<br/> { 'Key': 'GeneratedFindingInstaceTag8',<br/> 'Value': 'GeneratedFindingInstaceTagValue8'},<br/> { 'Key': 'GeneratedFindingInstaceTag9',<br/> 'Value': 'GeneratedFindingInstaceTagValue9'}]},<br/> 'ResourceType': 'S3Bucket',<br/> 'S3BucketDetails': [ { 'Arn': 'arn:aws:s3:::bucketName',<br/> 'CreatedAt': datetime.datetime(2017, 12, 18, 15, 58, 11, 551000, tzinfo=tzlocal()),<br/> 'DefaultServerSideEncryption': { 'EncryptionType': 'SSEAlgorithm',<br/> 'KmsMasterKeyArn': 'arn:aws:kms:region:123456789012:key/key-id'},<br/> 'Name': 'bucketName',<br/> 'Owner': { 'Id': 'CanonicalId '<br/> 'of Owner'},<br/> 'PublicAccess': { 'EffectivePermission': 'NOT_PUBLIC',<br/> 'PermissionConfiguration': { 'AccountLevelPermissions': { 'BlockPublicAccess': { 'BlockPublicAcls': False,<br/> 'BlockPublicPolicy': False,<br/> 'IgnorePublicAcls': False,<br/> 'RestrictPublicBuckets': False}},<br/> 'BucketLevelPermissions': { 'AccessControlList': { 'AllowsPublicReadAccess': False,<br/> 
'AllowsPublicWriteAccess': False},<br/> 'BlockPublicAccess': { 'BlockPublicAcls': False,<br/> 'BlockPublicPolicy': False,<br/> 'IgnorePublicAcls': False,<br/> 'RestrictPublicBuckets': False},<br/> 'BucketPolicy': { 'AllowsPublicReadAccess': False,<br/> 'AllowsPublicWriteAccess': False}}}},<br/> 'Tags': [ { 'Key': 'foo',<br/> 'Value': 'bar'}],<br/> 'Type': 'Destination'}]},<br/> 'SchemaVersion': '2.0',<br/> 'Service': { 'Action': { 'ActionType': 'AWS_API_CALL',<br/> 'AwsApiCallAction': { 'Api': 'GeneratedFindingAPIName',<br/> 'CallerType': 'Remote '<br/> 'IP',<br/> 'RemoteIpDetails': { 'City': { 'CityName': 'GeneratedFindingCityName'},<br/> 'Country': { 'CountryName': 'GeneratedFindingCountryName'},<br/> 'GeoLocation': { 'Lat': 0,<br/> 'Lon': 0},<br/> 'IpAddressV4': '198.51.100.0',<br/> 'Organization': { 'Asn': '-1',<br/> 'AsnOrg': 'GeneratedFindingASNOrg',<br/> 'Isp': 'GeneratedFindingISP',<br/> 'Org': 'GeneratedFindingORG'}},<br/> 'ServiceName': 'GeneratedFindingAPIServiceName'}},<br/> 'Archived': False,<br/> 'Count': 4,<br/> 'DetectorId': 'f2baedb0ac74f8f42fc929e15f56da6a',<br/> 'EventFirstSeen': '2020-11-25T13:46:37.960Z',<br/> 'EventLastSeen': '2020-11-26T15:18:12.620Z',<br/> 'ResourceRole': 'TARGET',<br/> 'ServiceName': 'guardduty'},<br/> 'Severity': 2,<br/> 'Title': 'API GeneratedFindingAPIName was invoked from an IP address on a '<br/> 'custom threat list.',<br/> 'Type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',<br/> 'UpdatedAt': '2020-11-26T15:18:12.620Z'}</div>',
'create_date': 1610391037127, 'modify_date': 1610391037127, 'children': [], 'mentioned_users': [],
'is_deleted': False, 'modify_user': {'id': 4, 'first_name': 'Resilient', 'last_name': 'Sysadmin'},
'actions': [], 'inc_id': 2239,
'inc_name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'task_id': None, 'task_name': None, 'task_custom': None, 'task_members': None, 'task_at_id': None,
'inc_owner': 4, 'user_name': 'Resilient Sysadmin',
'modify_principal': {'id': 4, 'type': 'user', 'name': 'a@a.com', 'display_name': 'Resilient Sysadmin'},
'comment_perms': {'update': True, 'delete': True}}], 'actions': [],
'timer_field_summarized_incident_data': [], 'admin_id': None, 'creator_id': 4, 'crimestatus_id': 1,
'employee_involved': None, 'end_date': None, 'exposure_dept_id': None, 'exposure_individual_name': None,
'exposure_vendor_id': None, 'jurisdiction_name': None, 'jurisdiction_reg_id': None, 'start_date': None,
'inc_start': None, 'org_id': 202, 'is_scenario': False, 'hard_liability': 0, 'nist_attack_vectors': [],
'id': 2239, 'sequence_code': None, 'discovered_date': 1606311997960, 'due_date': None,
'create_date': 1610391036688, 'owner_id': 4, 'severity_code': 100, 'plan_status': 'A'}
),
"find_resilient_artifacts_for_incident_with_artifacts": ({
'GeneratedFindingPrivateName': 'DNS Name',
'10.0.0.1': 'IP Address',
'GeneratedFindingPublicDNSName': 'DNS Name',
'GeneratedFindingAccessKeyId': 'AWS IAM Access Key ID',
'198.51.100.0': 'IP Address',
'GeneratedFindingUserName': 'AWS IAM User Name',
'GeneratedFindingPrivateDnsName': 'DNS Name'
}),
"find_resilient_artifacts_for_incident_no_artifacts": (
{}
),
"page_incidents": (
{'name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.', 'description': '<div>An API was used to access a bucket from an IP address on a custom threat list.</div>', 'phase_id': 1009, 'inc_training': False, 'vers': 2, 'addr': None, 'city': None, 'creator': {'id': 4, 'fname': 'Resilient', 'lname': 'Sysadmin', 'display_name': 'Resilient Sysadmin', 'status': 'A', 'email': 'a@a.com', 'locked': False, 'password_changed': False, 'is_external': False}, 'creator_principal': {'id': 4, 'type': 'user', 'name': 'a@a.com', 'display_name': 'Resilient Sysadmin'}, 'exposure_type_id': 1, 'incident_type_ids': [], 'reporter': None, 'state': None, 'country': None, 'zip': None, 'workspace': 2, 'exposure': 0, 'org_handle': 202, 'members': [], 'negative_pr_likely': None, 'perms': {'read': True, 'write': True, 'comment': True, 'assign': True, 'close': True, 'change_members': True, 'attach_file': True, 'read_attachments': True, 'delete_attachments': True, 'create_milestones': True, 'list_milestones': True, 'create_artifacts': True, 'list_artifacts': True, 'delete': True, 'change_workspace': True}, 'confirmed': False, 'task_changes': {'added': [], 'removed': []}, 'assessment': '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>\n<assessment>\n <rollups/>\n <optional>There are 1 required and 0 optional tasks from 1 regulators.</optional>\n</assessment>\n', 'data_compromised': None, 'draft': False, 'properties': {'aws_guardduty_finding_id': '60baffd3f9042e38640f2300d5c5a631', 'aws_guardduty_finding_arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631', 'aws_guardduty_finding_updated_at': '2020-11-26T15:18:12.620Z', 'aws_guardduty_count': '4', 'internal_customizations_field': None, 'aws_guardduty_archived': 'False', 'aws_guardduty_finding_type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom', 'aws_guardduty_resource_type': 'S3Bucket', 'aws_guardduty_severity': '2', 'aws_guardduty_detector_id': 'f2baedb0ac74f8f42fc929e15f56da6a', 'aws_guardduty_region': 'us-west-2'}, 'resolution_id': None, 'resolution_summary': None, 'pii': {'data_compromised': None, 'determined_date': 1606311997960, 'harmstatus_id': 2, 'data_encrypted': None, 'data_contained': None, 'impact_likely': None, 'ny_impact_likely': None, 'or_impact_likely': None, 'wa_impact_likely': None, 'dc_impact_likely': None, 'data_source_ids': [], 'data_format': None, 'assessment': '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>\n<assessment>\n <rollups/>\n <optional>There are 1 required and 0 optional tasks from 1 regulators.</optional>\n</assessment>\n', 'exposure': 0, 'gdpr_harm_risk': None, 'gdpr_lawful_data_processing_categories': [], 'alberta_health_risk_assessment': None}, 'gdpr': {'gdpr_breach_circumstances': [], 'gdpr_breach_type': None, 'gdpr_personal_data': None, 'gdpr_identification': None, 'gdpr_consequences': None, 'gdpr_final_assessment': None, 'gdpr_breach_type_comment': None, 'gdpr_personal_data_comment': None, 'gdpr_identification_comment': None, 'gdpr_consequences_comment': None, 'gdpr_final_assessment_comment': None, 'gdpr_subsequent_notification': None}, 'regulator_risk': {}, 'inc_last_modified_date': 1612532328297, 'admin_id': None, 'creator_id': 4, 'crimestatus_id': 1, 'employee_involved': None, 'end_date': None, 'exposure_dept_id': None, 'exposure_individual_name': None, 'exposure_vendor_id': None, 'jurisdiction_name': None, 'jurisdiction_reg_id': None, 'start_date': None, 'inc_start': None, 'org_id': 
202, 'is_scenario': False, 'hard_liability': 0, 'nist_attack_vectors': [], 'id': 2100, 'sequence_code': '5908-6', 'discovered_date': 1606311997960, 'due_date': None, 'create_date': 1612532324833, 'owner_id': 4, 'severity_code': 100, 'plan_status': 'A'}
)
}
return response[op]
# Mocked function parameters.
def get_function_params(op):
response = {
"data": (
{
'name': 'AWS GuardDuty: API GeneratedFindingAPIName was invoked from an IP address on a custom threat list.',
'description': {'format': 'text',
'content': 'An API was used to access a bucket from an IP address on a custom threat list.'},
'discovered_date': '2020-11-25T13:46:37.960Z', 'severity_code': 'Low',
'properties': {'aws_guardduty_finding_id': '60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',
'aws_guardduty_finding_type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',
'aws_guardduty_finding_updated_at': '2020-11-26T15:18:12.620Z',
'aws_guardduty_region': 'us-west-2', 'aws_guardduty_resource_type': 'S3Bucket',
'aws_guardduty_count': '4',
'aws_guardduty_detector_id': 'f2baedb0ac74f8f42fc929e15f56da6a'},
'artifacts': [
{'type': {'name': 'aws_iam_access_key_id'}, 'description': {'format': 'text',
'content': "'AWS IAM Access Key ID' extracted from GuardDuty from finding property 'AccessKeyId' at path '['Resource', 'AccessKeyDetails']'."},
'value': 'GeneratedFindingAccessKeyId'}, {'type': {'name': 'aws_iam_user_name'},
'description': {'format': 'text',
'content': "'AWS IAM User Name' extracted from GuardDuty from finding property 'UserName' at path '['Resource', 'AccessKeyDetails']'."},
'value': 'GeneratedFindingUserName'},
{'type': {'name': 'IP Address'}, 'description': {'format': 'text',
'content': "'IP Address' extracted from GuardDuty from finding property 'IpAddressV4' at path '['Service', 'Action', 'AwsApiCallAction', 'RemoteIpDetails']'."},
'value': '198.51.100.0'}, {'type': {'name': 'IP Address'}, 'description': {'format': 'text',
'content': "'IP Address' extracted from GuardDuty from finding property 'PrivateIpAddress' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': '10.0.0.1'}, {'type': {'name': 'DNS Name'},
'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PublicDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': 'GeneratedFindingPublicDNSName'},
{'type': {'name': 'DNS Name'}, 'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0]'."},
'value': 'GeneratedFindingPrivateDnsName'}, {'type': {'name': 'DNS Name'},
'description': {'format': 'text',
'content': "'DNS Name' extracted from GuardDuty from finding property 'PrivateDnsName' at path '['Resource', 'InstanceDetails', 'NetworkInterfaces', 0, 'PrivateIpAddresses', 0]'."},
'value': 'GeneratedFindingPrivateName'}], 'comments': [{
'text': {
'format': 'text',
'content': "AWS GuardDuty finding Payload:\n{ 'AccountId': '834299573936',\n 'Arn': 'arn:aws:guardduty:us-west-2:834299573936:detector/f2baedb0ac74f8f42fc929e15f56da6a/finding/60baffd3f9042e38640f2300d5c5a631',\n 'CreatedAt': '2020-11-25T13:46:37.960Z',\n 'Description': 'An API was used to access a bucket from an IP address on a '\n 'custom threat list.',\n 'Id': '60baffd3f9042e38640f2300d5c5a631',\n 'Partition': 'aws',\n 'Region': 'us-west-2',\n 'Resource': { 'AccessKeyDetails': { 'AccessKeyId': 'GeneratedFindingAccessKeyId',\n 'PrincipalId': 'GeneratedFindingPrincipalId',\n 'UserName': 'GeneratedFindingUserName',\n 'UserType': 'IAMUser'},\n 'InstanceDetails': { 'AvailabilityZone': 'GeneratedFindingInstaceAvailabilityZone',\n 'IamInstanceProfile': { 'Arn': 'arn:aws:iam::834299573936:example/instance/profile',\n 'Id': 'GeneratedFindingInstanceProfileId'},\n 'ImageDescription': 'GeneratedFindingInstaceImageDescription',\n 'ImageId': 'ami-99999999',\n 'InstanceId': 'i-99999999',\n 'InstanceState': 'running',\n 'InstanceType': 'm3.xlarge',\n 'LaunchTime': '2016-08-02T02:05:06Z',\n 'NetworkInterfaces': [ { 'Ipv6Addresses': [ ],\n 'NetworkInterfaceId': 'eni-bfcffe88',\n 'PrivateDnsName': 'GeneratedFindingPrivateDnsName',\n 'PrivateIpAddress': '10.0.0.1',\n 'PrivateIpAddresses': [ { 'PrivateDnsName': 'GeneratedFindingPrivateName',\n 'PrivateIpAddress': '10.0.0.1'}],\n 'PublicDnsName': 'GeneratedFindingPublicDNSName',\n 'PublicIp': '198.51.100.0',\n 'SecurityGroups': [ { 'GroupId': 'GeneratedFindingSecurityId',\n 'GroupName': 'GeneratedFindingSecurityGroupName'}],\n 'SubnetId': 'GeneratedFindingSubnetId',\n 'VpcId': 'GeneratedFindingVPCId'}],\n 'OutpostArn': 'arn:aws:outposts:us-west-2:123456789000:outpost/op-0fbc006e9abbc73c3',\n 'ProductCodes': [{}],\n 'Tags': [ { 'Key': 'GeneratedFindingInstaceTag1',\n 'Value': 'GeneratedFindingInstaceValue1'},\n { 'Key': 'GeneratedFindingInstaceTag2',\n 'Value': 'GeneratedFindingInstaceTagValue2'},\n { 'Key': 'GeneratedFindingInstaceTag3',\n 'Value': 'GeneratedFindingInstaceTagValue3'},\n { 'Key': 'GeneratedFindingInstaceTag4',\n 'Value': 'GeneratedFindingInstaceTagValue4'},\n { 'Key': 'GeneratedFindingInstaceTag5',\n 'Value': 'GeneratedFindingInstaceTagValue5'},\n { 'Key': 'GeneratedFindingInstaceTag6',\n 'Value': 'GeneratedFindingInstaceTagValue6'},\n { 'Key': 'GeneratedFindingInstaceTag7',\n 'Value': 'GeneratedFindingInstaceTagValue7'},\n { 'Key': 'GeneratedFindingInstaceTag8',\n 'Value': 'GeneratedFindingInstaceTagValue8'},\n { 'Key': 'GeneratedFindingInstaceTag9',\n 'Value': 'GeneratedFindingInstaceTagValue9'}]},\n 'ResourceType': 'S3Bucket',\n 'S3BucketDetails': [ { 'Arn': 'arn:aws:s3:::bucketName',\n 'CreatedAt': datetime.datetime(2017, 12, 18, 15, 58, 11, 551000, tzinfo=tzlocal()),\n 'DefaultServerSideEncryption': { 'EncryptionType': 'SSEAlgorithm',\n 'KmsMasterKeyArn': 'arn:aws:kms:region:123456789012:key/key-id'},\n 'Name': 'bucketName',\n 'Owner': { 'Id': 'CanonicalId '\n 'of Owner'},\n 'PublicAccess': { 'EffectivePermission': 'NOT_PUBLIC',\n 'PermissionConfiguration': { 'AccountLevelPermissions': { 'BlockPublicAccess': { 'BlockPublicAcls': False,\n 'BlockPublicPolicy': False,\n 'IgnorePublicAcls': False,\n 'RestrictPublicBuckets': False}},\n 'BucketLevelPermissions': { 'AccessControlList': { 'AllowsPublicReadAccess': False,\n 'AllowsPublicWriteAccess': False},\n 'BlockPublicAccess': { 'BlockPublicAcls': False,\n 'BlockPublicPolicy': False,\n 'IgnorePublicAcls': False,\n 'RestrictPublicBuckets': False},\n 'BucketPolicy': { 'AllowsPublicReadAccess': 
False,\n 'AllowsPublicWriteAccess': False}}}},\n 'Tags': [ { 'Key': 'foo',\n 'Value': 'bar'}],\n 'Type': 'Destination'}]},\n 'SchemaVersion': '2.0',\n 'Service': { 'Action': { 'ActionType': 'AWS_API_CALL',\n 'AwsApiCallAction': { 'Api': 'GeneratedFindingAPIName',\n 'CallerType': 'Remote '\n 'IP',\n 'RemoteIpDetails': { 'City': { 'CityName': 'GeneratedFindingCityName'},\n 'Country': { 'CountryName': 'GeneratedFindingCountryName'},\n 'GeoLocation': { 'Lat': 0,\n 'Lon': 0},\n 'IpAddressV4': '198.51.100.0',\n 'Organization': { 'Asn': '-1',\n 'AsnOrg': 'GeneratedFindingASNOrg',\n 'Isp': 'GeneratedFindingISP',\n 'Org': 'GeneratedFindingORG'}},\n 'ServiceName': 'GeneratedFindingAPIServiceName'}},\n 'Archived': False,\n 'Count': 4,\n 'DetectorId': 'f2baedb0ac74f8f42fc929e15f56da6a',\n 'EventFirstSeen': '2020-11-25T13:46:37.960Z',\n 'EventLastSeen': '2020-11-26T15:18:12.620Z',\n 'ResourceRole': 'TARGET',\n 'ServiceName': 'guardduty'},\n 'Severity': 2,\n 'Title': 'API GeneratedFindingAPIName was invoked from an IP address on a '\n 'custom threat list.',\n 'Type': 'UnauthorizedAccess:S3/MaliciousIPCaller.Custom',\n 'UpdatedAt': '2020-11-26T15:18:12.620Z'}"}}]}
),
"tables": (
{'gd_action_details': [{'cells': {'action_type': {'value': 'AWS_API_CALL'},
'action_api': {'value': 'GeneratedFindingAPIName'},
'event_first_seen': {'value': '2020-11-25T13:46:37.960Z'},
'event_last_seen': {'value': '2020-11-26T15:18:12.620Z'},
'actor_caller_type': {'value': 'Remote IP'},
'city_name': {'value': 'GeneratedFindingCityName'},
'country_name': {'value': 'GeneratedFindingCountryName'},
'asn': {'value': '-1'}, 'asn_org': {'value': 'GeneratedFindingASNOrg'},
'isp': {'value': 'GeneratedFindingISP'},
'org': {'value': 'GeneratedFindingORG'},
'action_service_name': {'value': 'GeneratedFindingAPIServiceName'},
'remote_ip': {'value': '198.51.100.0'}}}],
'gd_resource_affected': [{'cells': {'resource_type': {'value': 'S3Bucket'},
'instance_id': {'value': 'bucketName'},
'instance_type': {'value': 'Destination'},
'instance_state': {'value': 'running'},
'resource_role': {'value': 'TARGET'},
'instance_private_ip': {'value': '10.0.0.1'},
'instance_private_dns': {'value': 'GeneratedFindingPrivateName'},
'instance_public_ip': {'value': '198.51.100.0'},
'instance_public_dns': {'value': 'GeneratedFindingPublicDNSName'},
's3bucket_name': {'value': 'bucketName'},
's3bucket_owner': {'value': 'CanonicalId of Owner'}
}
}
]
}
)
}
return response[op]
# Mocked GuardDuty AWS client response for standalone tests.
def mocked_aws(*args, **kwargs):
class MockResponse:
"""Class will be used by the mock to replace gd client in circuits tests"""
def __init__(self, *args, **kwargs):
self.exceptions = {}
def get_findings(self):
return get_cli_raw_responses("get_findings")
def can_paginate(self, op):
if op in ["list_findings", "list_detectors"]:
return True
else:
return False
def archive_findings(self):
return get_cli_raw_responses("archive_findings")
return MockResponse(args, **kwargs)
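# Illustrative usage sketch (not part of the original test data; it uses only names
# defined in this module): the factory stands in for the real GuardDuty client object.
#   gd = mocked_aws()
#   gd.can_paginate("list_findings")   # -> True
#   gd.get_findings()                  # -> get_cli_raw_responses("get_findings")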
# Mocked paginate class for standalone tests.
class Paginate(object):
def __init__(self, op):
#op = op + "_paginated"
self.responses = [get_cli_raw_responses(op)]
self.index = 0
def __iter__(self):
for response in self.responses:
yield response
# Mocked paginator response for standalone tests.
def mocked_client_paginator(*args, **kwargs):
class MockPaginate:
"""Class will be used by the mock to replace paginator in circuits tests"""
def __init__(self, *args, **kwargs):
self.op = args[0][0]
def paginate(self):
return Paginate(self.op)
return MockPaginate(args, **kwargs)
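# Illustrative usage sketch (hypothetical call, grounded only in the mock above):
#   paginator = mocked_client_paginator("list_findings")
#   pages = list(paginator.paginate())   # -> [get_cli_raw_responses("list_findings")]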
def mocked_gd_client(*args, **kwargs):
class MockResponse:
"""Class will be used by the mock to replace amp_client in circuits tests"""
def __init__(self, *args, **kwargs):
pass
def __contains__(self, key):
return True if key in self.__dict__.keys() else False
def get(self, op=None, paginate=False, **kwargs):
if op == "describe_regions":
return get_cli_raw_responses("describe_regions")["Regions"]
if op == "list_detectors":
return get_cli_raw_responses("list_detectors")["DetectorIds"]
if op == "list_findings":
return get_cli_raw_responses("list_findings")["FindingIds"]
if op == "get_findings":
if "60baffd3f9042e38640f2300d5c5a630" in kwargs["FindingIds"]:
return []
else:
return get_cli_raw_responses("get_findings")["Findings"]
def post(self, op, **kwargs):
if op == "archive_findings":
if "32b7017d2019dfe922abc4e07c3fdfff" in kwargs["DetectorId"]:
return {"status": "error",
"msg": "An error occurred (BadRequestException) when calling the ArchiveFindings operation: "
"The request is rejected because the input detectorId is not owned by the current account."}
else:
return{"status": "ok"}
return MockResponse(*args, **kwargs)
def mocked_ResSvc(*args, **kwargs):
class MockResponse:
"""Class will be used by the mock to replace AWS GuardDuty resilient_service in circuits tests"""
def __init__(self, *args, **kwargs):
pass
def __contains__(self, key):
return True if key in self.__dict__.keys() else False
def find_resilient_incident_for_req(self, finding, f_fields):
return get_resilient_responses("find_resilient_incident_for_req")
def create_incident(self, finding, f_fields):
return get_resilient_responses("create_incident")
def add_datatables(self, i_payload):
return
def find_resilient_artifacts_for_incident(self, incident_id):
if incident_id == 1:
return get_resilient_responses("find_resilient_artifacts_for_incident_with_artifacts")
else:
return get_resilient_responses("find_resilient_artifacts_for_incident_no_artifacts")
def add_comment(self, incident_id, note):
return
def page_incidents(self, region=None, f_fields=None):
yield get_resilient_responses("page_incidents")
return MockResponse(*args, **kwargs)
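# Illustrative usage sketch (hypothetical call, grounded only in the mock above):
#   svc = mocked_ResSvc()
#   svc.find_resilient_artifacts_for_incident(1)   # -> canned artifact map
#   svc.find_resilient_artifacts_for_incident(2)   # -> {}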
def get_mock_config():
config_data = u"""[fn_aws_guardduty]
aws_gd_access_key_id=AKAABBCCDDEEFFGGHH12
aws_gd_secret_access_key=pplXXEEK/aAbBcCdDeEfFgGhHiH1234567+sssss
aws_gd_master_region="us-west-2"
aws_gd_regions=".*"
aws_gd_regions_interval=2
aws_gd_polling_interval=1
aws_gd_severity_threshold = 7
aws_gd_lookback_interval=60
#http_proxy=http://proxy:80
#https_proxy=http://proxy:443
"""
return config_data | 171.200183 | 28,683 | 0.430384 | 13,533 | 187,293 | 5.872164 | 0.060075 | 0.286103 | 0.4136 | 0.533347 | 0.848669 | 0.83158 | 0.817084 | 0.80601 | 0.799039 | 0.784467 | 0 | 0.07244 | 0.436076 | 187,293 | 1,094 | 28,684 | 171.200183 | 0.679963 | 0.004656 | 0 | 0.485632 | 1 | 0.073755 | 0.683517 | 0.184792 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030651 | false | 0.010536 | 0.001916 | 0.008621 | 0.066092 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 13 |
a99efd60df99643fe219081f04bc6e8c744566a6 | 4,763 | py | Python | gen_sh.py | sanixa/gan-leaks-custom | f2efd8c8f4d267dd728bf00c8936d6f04a63736e | ["MIT"] | null | null | null | gen_sh.py | sanixa/gan-leaks-custom | f2efd8c8f4d267dd728bf00c8936d6f04a63736e | ["MIT"] | null | null | null | gen_sh.py | sanixa/gan-leaks-custom | f2efd8c8f4d267dd728bf00c8936d6f04a63736e | ["MIT"] | null | null | null | atk_type = "fbb"
tp = "cifar"
#------------------------------fbb----------------------------------------------------#
# model = "GS_WGAN"
# model = "CGAN"
# if(model =="GS_WGAN"):
# model_dir = "gswgan"
# elif(model =="CGAN"):
# model_dir = "dpcgan"
# fl = open(f"{atk_type}_{model_dir}_{tp}_iters.sh","w")
# iters = [1000,5000,10000,20000]
# eps = [1,10,100,1000,"inf"]
# # eps = [10]
# for it in iters:
# for e in eps:
# s = f"python attack_models/{atk_type}.py -name {model_dir}_eps{e}_{it}_{tp} -posdir ./data/{model}/{tp}/eps{e}/diff_iter/{it}/train/ -negdir ./data/{model}/{tp}/eps{e}/diff_iter/{it}/test -gdir ./data/{model}/{tp}/eps{e}/diff_iter/{it} --data_num 64"
# fl.write(s+"\n")
# s2 = f"python attack_models/tools/eval_roc.py --attack_type {atk_type} -ldir ./attack_models/results/{atk_type}/{model_dir}_eps{e}_{it}_{tp}"
# fl.write(s2+"\n")
# fl.close()
# for e in eps:
# fl = open(f"{atk_type}_{model_dir}_acc_eps{e}_{tp}.sh","w")
# if(e==1):
# accs = [10,20,30,40]
# elif(e==10):
# accs = [10,20,30,40,50,60]
# elif(e==100 ):
# accs = [10,20,30,40,50,60,70]
# elif(e==1000):
# accs = [10,20,30,40,50,60,70,80]
# elif(e=="inf"):
# accs =[10,20,30,40,50,60,70,80]
# for acc in accs:
# s = f"python attack_models/{atk_type}.py -name {model_dir}_eps{e}_{acc}_{tp} -posdir ./data/{model}/{tp}/eps{e}/diff_acc/{acc}/train/ -negdir ./data/{model}/{tp}/eps{e}/diff_acc/{acc}/test -gdir ./data/{model}/{tp}/eps{e}/diff_acc/{acc} --data_num 64"
# fl.write(s+"\n")
# s2 = f"python attack_models/tools/eval_roc.py --attack_type {atk_type} -ldir ./attack_models/results/{atk_type}/{model_dir}_eps{e}_{acc}_{tp}"
# fl.write(s2+"\n")
# fl.close()
#--------------------------------------above fbb-----------------------------------------#
#-------------------------------------wb-------------------------------------------------#
atk_type = "wb"
tp = "mnist"
model = "CGAN"
if(model =="GS_WGAN"):
model_dir = "gswgan"
elif(model =="CGAN"):
model_dir = "dpcgan"
# # python wb_gswgan.py -name gswgan_mnist_1000 -posdir ../data/GS_WGAN/eps1/diff_iter/1000/train/ -negdir ../data/GS_WGAN/eps1/diff_iter/1000/test -gdir ./data/GS_WGAN/eps{e}/diff_iter/1000 --data_num 1000
fl = open(f"{atk_type}_{model_dir}_iters.sh","w")
fl2 = open(f"{atk_type}_{model_dir}_iters_print.sh","w")
iters = [1000,5000,10000,20000]
eps = [1,10,100,1000,"inf"]
f = open(f"convert_{model_dir}_iters_mnist.sh","w")
for it in iters:
for e in eps:
s2 = f"python convert.py -src ./checkpoint/{tp}/{model}/eps{e}/diff_iter/DPCGAN_MNIST_eps{e}_iteration{it}.ckpt -dest ./checkpoint/{tp}/{model}/eps{e}/diff_iter/DPCGAN_MNIST_eps{e}_iteration{it}_.ckpt"
f.write(s2+"\n")
s = f"CUDA_VISIBLE_DEVICES=5 python attack_models/{atk_type}_dpcgan.py -name {model}_eps{e}_{it} -model_name DPCGAN_MNIST_eps{e}_iteration{it}_.ckpt -posdir ./data/{model}/{tp}/eps{e}/diff_iter/{it}/train/ -negdir ./data/{model}/{tp}/eps{e}/diff_iter/{it}/test -gdir ./checkpoint/{tp}/{model}/eps{e}/diff_iter/ --data_num 64"
fl.write(s+"\n")
s3 = f"python attack_models/tools/eval_roc.py --attack_type {atk_type} -ldir ./attack_models/results/{atk_type}/{model}_eps{e}_{it}"
fl2.write(s3+"\n")
fl.close()
f.close()
fl2.close()
for e in eps:
fl = open(f"convert_{atk_type}_{model_dir}_acc_eps{e}_{tp}.sh","w")
fl2 = open(f"{atk_type}_{model_dir}_acc_eps{e}_{tp}.sh","w")
if(e==1):
accs = [10,20,30,40]
elif(e==10):
accs = [10,20,30,40,50]
elif(e==100 ):
accs = [10,20,30,40,50,60,70]
elif(e==1000):
accs = [10,20,30,40,50,60,70,80]
elif(e=="inf"):
accs =[10,20,30,40,50,60,70,80]
for acc in accs:
s2 = f"python convert.py -src ./checkpoint/{tp}/{model}/eps{e}/diff_acc/DPCGAN_MNIST_eps{e}_acc{acc}.ckpt -dest ./checkpoint/{tp}/{model}/eps{e}/diff_iter/DPCGAN_MNIST_eps{e}_iteration{it}_.ckpt"
fl.write(s2+"\n")
s = f"python attack_models/{atk_type}_dpcgan.py -name {model_dir}_eps{e}_{acc}_{tp} -posdir ./data/{model}/{tp}/eps{e}/diff_acc/{acc}/train/ -negdir ./data/{model}/{tp}/eps{e}/diff_acc/{acc}/test -gdir ./data/{model}/{tp}/eps{e}/diff_acc/{acc} --data_num 64"
fl2.write(s+"\n")
s2 = f"python attack_models/tools/eval_roc.py --attack_type {atk_type} -ldir ./attack_models/results/{atk_type}/{model_dir}_eps{e}_{acc}_{tp}"
fl2.write(s2+"\n")
fl.close()
fl2.close()
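# Note (assumption, not stated in the script): the generated .sh files, e.g.
# wb_dpcgan_iters.sh, are intended to be run afterwards with bash to execute
# the attack and ROC-evaluation commands they contain.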
#--------------------------------------above wb-----------------------------------------# | 50.136842 | 334 | 0.560361 | 752 | 4,763 | 3.337766 | 0.111702 | 0.05259 | 0.054183 | 0.061355 | 0.868924 | 0.855777 | 0.854183 | 0.767331 | 0.7251 | 0.684462 | 0 | 0.064938 | 0.165862 | 4,763 | 95 | 335 | 50.136842 | 0.566826 | 0.453706 | 0 | 0.170213 | 0 | 0.12766 | 0.594397 | 0.455542 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.021277 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5704b0ae095ae501456ce9bffefd82dd1bea3250 | 23,116 | py | Python | bobstack/tests/abstractTransportConnectionTestCase.py | bobjects/BobStack | c177b286075044832f44baf9ace201780c8b4320 | ["Apache-2.0"] | null | null | null | bobstack/tests/abstractTransportConnectionTestCase.py | bobjects/BobStack | c177b286075044832f44baf9ace201780c8b4320 | ["Apache-2.0"] | null | null | null | bobstack/tests/abstractTransportConnectionTestCase.py | bobjects/BobStack | c177b286075044832f44baf9ace201780c8b4320 | ["Apache-2.0"] | null | null | null | from unittest import TestCase
from ..sipmessaging import SIPMessageFactory
from ..sipmessaging import SIPURI
from ..siptransport import SimulatedSIPTransport
from ..sipentity import SIPStatelessProxy
from ..siptransport import SimulatedNetwork
class AbstractTransportConnectionTestCase(TestCase):
# We will be refactoring most of the test behavior down to here, and override
# it to test different transports as we implement them.
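# For example (hypothetical sketch, not part of this file), a concrete subclass
# could override the bind properties defined below for a specific transport:
#   class UDPTransportConnectionTestCase(AbstractTransportConnectionTestCase):
#       @property
#       def bindPort1(self):
#           return 5080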
@property
def bindAddress1(self):
return '127.0.0.80'
@property
def bindPort1(self):
return 5060
@property
def bindAddress2(self):
return '127.0.0.81'
@property
def bindPort2(self):
return 5062
@property
def bindAddress3(self):
return '127.0.0.82'
@property
def bindPort3(self):
return 5063
@property
def sampleRequest(self):
message_string = ('INVITE sip:3122221000@example.com:5061;user=phone;transport=TLS SIP/2.0\r\n'
'From: <sip:200.25.3.150:5061>;tag=0ee8d3e272e31c9195299efc500\r\n'
'To: <sip:example.com:5061>\r\n'
'Call-ID: 0ee8d3e272e31c9195299efc500\r\n'
'CSeq: 6711 SIPMETHODTOREPLACE\r\n'
'Max-Forwards: 70\r\n'
'Via: SIP/2.0/TLS 200.25.3.150;branch=z9hG4bK0ee8d3e272e31ca195299efc500\r\n'
'Via: SIP/2.0/TLS 200.25.3.250;branch=z9hG4bKfdkajhdiruyalkghjladksjf\r\n'
'Via: SIP/2.0/TLS 200.25.3.255;branch=z9hG4bKduyroiuryaludhgviukfhlasf\r\n'
'User-Agent: Example User Agent\r\n'
'Contact: <sip:invalid@200.25.3.150:5061;transport=tls>\r\n'
'Route: <sip:200.25.3.230:5061;transport=tls;lr>\r\n'
'Accept: application/sdp,application/isup,application/dtmf,application/dtmf-relay,multipart/mixed\r\n'
'Accept-Encoding: x-nortel-short\r\n'
'Accept-Language: en-us,fr-fr\r\n'
'Allow: ACK,BYE,CANCEL,INFO,INVITE,OPTIONS,REGISTER,SUBSCRIBE,UPDATE\r\n'
'Authorization: Digest username="3122221000",realm="SomeRealm",nonce="1111790769596",uri="sip:3122211004@example.com",response="9bf77d8238664fe08dafd4d2abb6f1cb",algorithm=MD5\r\n'
'Call-Info: <https://lsc14pa.example.com:443/pa/direct/pictureServlet?user=3126805100@example.com>;Purpose=icon\r\n'
'Content-Disposition: session;handling=required\r\n'
'Content-Type: application/sdp\r\n'
'Date: Sat, 01 Feb 2014 22:07:34 GMT\r\n'
'Record-Route: <sip:200.25.3.230:5061;transport=tls;lr>\r\n'
'Require: sdp-anat\r\n'
'Retry-After: 30\r\n'
'Server: Blargomatic 2.0\r\n'
'Session-Expires: 1200\r\n'
'Supported: 100rel,histinfo,join,replaces,sdp-anat,timer\r\n'
'Timestamp: 1392061773\r\n'
'WWW-Authenticate: Digest algorithm=MD5,nonce="1111790769596",realm="SomeRealm"\r\n'
'Warning: 370 200.21.3.10 "Insufficient Bandwidth"\r\n'
'X-RTP-Stat: PR=0;ER=0;PL=0;RB=0/0;DE=PCMU;EN=PCMU;JI=0;DL=0,0;IP=10.1.0.33:16384,132.52.127.200:20048\r\n'
'x-channel: ds/ds1-3/12;IP=132.52.127.16\r\n'
'Referred-By: <sip:6006665100@example.com;user=phone> ; CorrelationID="0508817f84e7ce64745ef9753e2fbff4664321a4@200.23.3.240"\r\n'
'Refer-To: <sip:6006665499;rfrid=28661859@example.com;user=phone?x-nt-resource-priority=YNBvf.2j00qao>\r\n'
'Subject: Need more boxes\r\n'
'Referred-By: <sip:5556785103@example.com;user=phone> ; CorrelationID="348058f0947acec8745efd367e33542c5cb01436@192.168.0.3"\r\n'
'Refer-To: <sip:5556645204@example.com:5064;user=phone;transport=udp>\r\n'
'Allow-Events: dialog,message-summary\r\n'
'Event: refer;id=10498\r\n'
'Content-Encoding: gzip\r\n'
'RAck: 1 1 INVITE\r\n'
'P-Charge: <sip:6425555555@10.10.10.10>;npi=ISDN;noa=2\r\n'
'Reply-To: Bob <sip:bob@biloxi.com>\r\n'
'Unsupported: foo\r\n'
'P-Asserted-Identity: "500 - SIP Test" <sip:500@192.168.0.3>\r\n'
'P-Preferred-Identity: "User 5103" <sip:3126705103@192.168.0.3:5060>\r\n'
'Remote-Party-ID: "1234567890" <sip:1234567890@192.168.1.195>;party=calling;privacy=off;screen=no\r\n'
'Alert-Info: <cid:internal@example.com>;alert-type=internal\r\n'
'History-Info: "555122221002" <sip:555122221002@example.com>;index=1.1\r\n'
'P-Called-Party-Id: <sip:2135881@example.com;user=phone>\r\n'
'P-RTP-Stat: PS=0,OS=0,PR=5429,OR=955504,PL=0,JI=0,LA=0,DU=108\r\n'
'Privacy: id\r\n'
'Proxy-Authenticate: Digest realm="1.1.1.1", nonce="8dd33eb2-e3c4-11e5-a55b-83b175043a03", algorithm=MD5, qop="auth"\r\n'
'Proxy-Authorization: Digest username="100",realm="209.105.255.124",nonce="7bebcf02-e01d-11e5-931d-83b175043a03",uri="sip:90011@209.105.255.124",response="63faaa2604cae36e9b38f2d5cd0abba4",cnonce="4b41f53e6f00c05",nc=00000001,qop="auth",algorithm=MD5\r\n'
'Proxy-Require: foo\r\n'
'Reason: Q.850; cause=16; reason=Terminated\r\n'
'Record-Session-Expires: 1200;refresher=uac\r\n'
'Replaces: 19cd9bf094ff5f0c1745ef975c1cf65d34beb908f@192.168.0.3;to-tag=29bd570-f0a1ec8-13c5-50029-aa872-7d78286-aa872;from-tag=7ca31b4791\r\n'
'Subscription-State: active;reason=deactivated;expires=50\r\n'
'Min-Expires: 1800\r\n'
'Content-Length: 0')
return SIPMessageFactory().next_for_string(message_string)
@property
def sampleRequest2(self):
message_string = ('BYE sip:3122221000@example.com:5061;user=phone;transport=TLS SIP/2.0\r\n'
'From: <sip:200.25.3.150:5061>;tag=0ee8d3e272e31c9195299efc500\r\n'
'To: <sip:example.com:5061>\r\n'
'Call-ID: 0ee8d3e272e31c9195299efc500\r\n'
'CSeq: 6711 SIPMETHODTOREPLACE\r\n'
'Max-Forwards: 70\r\n'
'Via: SIP/2.0/TLS 200.25.3.150;branch=z9hG4bK0ee8d3e272e31ca195299efc500\r\n'
'Via: SIP/2.0/TLS 200.25.3.250;branch=z9hG4bKfdkajhdiruyalkghjladksjf\r\n'
'Via: SIP/2.0/TLS 200.25.3.255;branch=z9hG4bKduyroiuryaludhgviukfhlasf\r\n'
'User-Agent: Example User Agent\r\n'
'Contact: <sip:invalid@200.25.3.150:5061;transport=tls>\r\n'
'Route: <sip:200.25.3.230:5061;transport=tls;lr>\r\n'
'Accept: application/sdp,application/isup,application/dtmf,application/dtmf-relay,multipart/mixed\r\n'
'Accept-Encoding: x-nortel-short\r\n'
'Accept-Language: en-us,fr-fr\r\n'
'Allow: ACK,BYE,CANCEL,INFO,INVITE,OPTIONS,REGISTER,SUBSCRIBE,UPDATE\r\n'
'Authorization: Digest username="3122221000",realm="SomeRealm",nonce="1111790769596",uri="sip:3122211004@example.com",response="9bf77d8238664fe08dafd4d2abb6f1cb",algorithm=MD5\r\n'
'Call-Info: <https://lsc14pa.example.com:443/pa/direct/pictureServlet?user=3126805100@example.com>;Purpose=icon\r\n'
'Content-Disposition: session;handling=required\r\n'
'Content-Type: application/sdp\r\n'
'Date: Sat, 01 Feb 2014 22:07:34 GMT\r\n'
'Record-Route: <sip:200.25.3.230:5061;transport=tls;lr>\r\n'
'Require: sdp-anat\r\n'
'Retry-After: 30\r\n'
'Server: Blargomatic 2.0\r\n'
'Session-Expires: 1200\r\n'
'Supported: 100rel,histinfo,join,replaces,sdp-anat,timer\r\n'
'Timestamp: 1392061773\r\n'
'WWW-Authenticate: Digest algorithm=MD5,nonce="1111790769596",realm="SomeRealm"\r\n'
'Warning: 370 200.21.3.10 "Insufficient Bandwidth"\r\n'
'X-RTP-Stat: PR=0;ER=0;PL=0;RB=0/0;DE=PCMU;EN=PCMU;JI=0;DL=0,0;IP=10.1.0.33:16384,132.52.127.200:20048\r\n'
'x-channel: ds/ds1-3/12;IP=132.52.127.16\r\n'
'Referred-By: <sip:6006665100@example.com;user=phone> ; CorrelationID="0508817f84e7ce64745ef9753e2fbff4664321a4@200.23.3.240"\r\n'
'Refer-To: <sip:6006665499;rfrid=28661859@example.com;user=phone?x-nt-resource-priority=YNBvf.2j00qao>\r\n'
'Subject: Need more boxes\r\n'
'Referred-By: <sip:5556785103@example.com;user=phone> ; CorrelationID="348058f0947acec8745efd367e33542c5cb01436@192.168.0.3"\r\n'
'Refer-To: <sip:5556645204@example.com:5064;user=phone;transport=udp>\r\n'
'Allow-Events: dialog,message-summary\r\n'
'Event: refer;id=10498\r\n'
'Content-Encoding: gzip\r\n'
'RAck: 1 1 INVITE\r\n'
'P-Charge: <sip:6425555555@10.10.10.10>;npi=ISDN;noa=2\r\n'
'Reply-To: Bob <sip:bob@biloxi.com>\r\n'
'Unsupported: foo\r\n'
'P-Asserted-Identity: "500 - SIP Test" <sip:500@192.168.0.3>\r\n'
'P-Preferred-Identity: "User 5103" <sip:3126705103@192.168.0.3:5060>\r\n'
'Remote-Party-ID: "1234567890" <sip:1234567890@192.168.1.195>;party=calling;privacy=off;screen=no\r\n'
'Alert-Info: <cid:internal@example.com>;alert-type=internal\r\n'
'History-Info: "555122221002" <sip:555122221002@example.com>;index=1.1\r\n'
'P-Called-Party-Id: <sip:2135881@example.com;user=phone>\r\n'
'P-RTP-Stat: PS=0,OS=0,PR=5429,OR=955504,PL=0,JI=0,LA=0,DU=108\r\n'
'Privacy: id\r\n'
'Proxy-Authenticate: Digest realm="1.1.1.1", nonce="8dd33eb2-e3c4-11e5-a55b-83b175043a03", algorithm=MD5, qop="auth"\r\n'
'Proxy-Authorization: Digest username="100",realm="209.105.255.124",nonce="7bebcf02-e01d-11e5-931d-83b175043a03",uri="sip:90011@209.105.255.124",response="63faaa2604cae36e9b38f2d5cd0abba4",cnonce="4b41f53e6f00c05",nc=00000001,qop="auth",algorithm=MD5\r\n'
'Proxy-Require: foo\r\n'
'Reason: Q.850; cause=16; reason=Terminated\r\n'
'Record-Session-Expires: 1200;refresher=uac\r\n'
'Replaces: 19cd9bf094ff5f0c1745ef975c1cf65d34beb908f@192.168.0.3;to-tag=29bd570-f0a1ec8-13c5-50029-aa872-7d78286-aa872;from-tag=7ca31b4791\r\n'
'Subscription-State: active;reason=deactivated;expires=50\r\n'
'Min-Expires: 1800\r\n'
'Content-Length: 0')
return SIPMessageFactory().next_for_string(message_string)
@property
def sampleResponse(self):
message_string = ('SIP/2.0 200 OK\r\n'
'From: <sip:200.25.3.150:5061>;tag=0ee8d3e272e31c9195299efc500\r\n'
'To: <sip:example.com:5061>\r\n'
'Call-ID: 0ee8d3e272e31c9195299efc500\r\n'
'CSeq: 6711 SIPMETHODTOREPLACE\r\n'
'Max-Forwards: 70\r\n'
'Via: SIP/2.0/TLS 200.25.3.150;branch=z9hG4bK0ee8d3e272e31ca195299efc500\r\n'
'Via: SIP/2.0/TLS 200.25.3.250;branch=z9hG4bKfdkajhdiruyalkghjladksjf\r\n'
'Via: SIP/2.0/TLS 200.25.3.255;branch=z9hG4bKduyroiuryaludhgviukfhlasf\r\n'
'User-Agent: Example User Agent\r\n'
'Contact: <sip:invalid@200.25.3.150:5061;transport=tls>\r\n'
'Route: <sip:200.25.3.230:5061;transport=tls;lr>\r\n'
'Accept: application/sdp,application/isup,application/dtmf,application/dtmf-relay,multipart/mixed\r\n'
'Accept-Encoding: x-nortel-short\r\n'
'Accept-Language: en-us,fr-fr\r\n'
'Allow: ACK,BYE,CANCEL,INFO,INVITE,OPTIONS,REGISTER,SUBSCRIBE,UPDATE\r\n'
'Authorization: Digest username="3122221000",realm="SomeRealm",nonce="1111790769596",uri="sip:3122211004@example.com",response="9bf77d8238664fe08dafd4d2abb6f1cb",algorithm=MD5\r\n'
'Call-Info: <https://lsc14pa.example.com:443/pa/direct/pictureServlet?user=3126805100@example.com>;Purpose=icon\r\n'
'Content-Disposition: session;handling=required\r\n'
'Content-Type: application/sdp\r\n'
'Date: Sat, 01 Feb 2014 22:07:34 GMT\r\n'
'Record-Route: <sip:200.25.3.230:5061;transport=tls;lr>\r\n'
'Require: sdp-anat\r\n'
'Retry-After: 30\r\n'
'Server: Blargomatic 2.0\r\n'
'Session-Expires: 1200\r\n'
'Supported: 100rel,histinfo,join,replaces,sdp-anat,timer\r\n'
'Timestamp: 1392061773\r\n'
'WWW-Authenticate: Digest algorithm=MD5,nonce="1111790769596",realm="SomeRealm"\r\n'
'Warning: 370 200.21.3.10 "Insufficient Bandwidth"\r\n'
'X-RTP-Stat: PR=0;ER=0;PL=0;RB=0/0;DE=PCMU;EN=PCMU;JI=0;DL=0,0;IP=10.1.0.33:16384,132.52.127.200:20048\r\n'
'x-channel: ds/ds1-3/12;IP=132.52.127.16\r\n'
'Referred-By: <sip:6006665100@example.com;user=phone> ; CorrelationID="0508817f84e7ce64745ef9753e2fbff4664321a4@200.23.3.240"\r\n'
'Refer-To: <sip:6006665499;rfrid=28661859@example.com;user=phone?x-nt-resource-priority=YNBvf.2j00qao>\r\n'
'Subject: Need more boxes\r\n'
'Referred-By: <sip:5556785103@example.com;user=phone> ; CorrelationID="348058f0947acec8745efd367e33542c5cb01436@192.168.0.3"\r\n'
'Refer-To: <sip:5556645204@example.com:5064;user=phone;transport=udp>\r\n'
'Allow-Events: dialog,message-summary\r\n'
'Event: refer;id=10498\r\n'
'Content-Encoding: gzip\r\n'
'RAck: 1 1 INVITE\r\n'
'P-Charge: <sip:6425555555@10.10.10.10>;npi=ISDN;noa=2\r\n'
'Reply-To: Bob <sip:bob@biloxi.com>\r\n'
'Unsupported: foo\r\n'
'P-Asserted-Identity: "500 - SIP Test" <sip:500@192.168.0.3>\r\n'
'P-Preferred-Identity: "User 5103" <sip:3126705103@192.168.0.3:5060>\r\n'
'Remote-Party-ID: "1234567890" <sip:1234567890@192.168.1.195>;party=calling;privacy=off;screen=no\r\n'
'Alert-Info: <cid:internal@example.com>;alert-type=internal\r\n'
'History-Info: "555122221002" <sip:555122221002@example.com>;index=1.1\r\n'
'P-Called-Party-Id: <sip:2135881@example.com;user=phone>\r\n'
'P-RTP-Stat: PS=0,OS=0,PR=5429,OR=955504,PL=0,JI=0,LA=0,DU=108\r\n'
'Privacy: id\r\n'
'Proxy-Authenticate: Digest realm="1.1.1.1", nonce="8dd33eb2-e3c4-11e5-a55b-83b175043a03", algorithm=MD5, qop="auth"\r\n'
'Proxy-Authorization: Digest username="100",realm="209.105.255.124",nonce="7bebcf02-e01d-11e5-931d-83b175043a03",uri="sip:90011@209.105.255.124",response="63faaa2604cae36e9b38f2d5cd0abba4",cnonce="4b41f53e6f00c05",nc=00000001,qop="auth",algorithm=MD5\r\n'
'Proxy-Require: foo\r\n'
'Reason: Q.850; cause=16; reason=Terminated\r\n'
'Record-Session-Expires: 1200;refresher=uac\r\n'
'Replaces: 19cd9bf094ff5f0c1745ef975c1cf65d34beb908f@192.168.0.3;to-tag=29bd570-f0a1ec8-13c5-50029-aa872-7d78286-aa872;from-tag=7ca31b4791\r\n'
'Subscription-State: active;reason=deactivated;expires=50\r\n'
'Min-Expires: 1800\r\n'
'Content-Length: 0')
return SIPMessageFactory().next_for_string(message_string)
@property
def sampleResponse2(self):
message_string = ('SIP/2.0 180 Ringing\r\n'
'From: <sip:200.25.3.150:5061>;tag=0ee8d3e272e31c9195299efc500\r\n'
'To: <sip:example.com:5061>\r\n'
'Call-ID: 0ee8d3e272e31c9195299efc500\r\n'
'CSeq: 6711 SIPMETHODTOREPLACE\r\n'
'Max-Forwards: 70\r\n'
'Via: SIP/2.0/TLS 200.25.3.150;branch=z9hG4bK0ee8d3e272e31ca195299efc500\r\n'
'Via: SIP/2.0/TLS 200.25.3.250;branch=z9hG4bKfdkajhdiruyalkghjladksjf\r\n'
'Via: SIP/2.0/TLS 200.25.3.255;branch=z9hG4bKduyroiuryaludhgviukfhlasf\r\n'
'User-Agent: Example User Agent\r\n'
'Contact: <sip:invalid@200.25.3.150:5061;transport=tls>\r\n'
'Route: <sip:200.25.3.230:5061;transport=tls;lr>\r\n'
'Accept: application/sdp,application/isup,application/dtmf,application/dtmf-relay,multipart/mixed\r\n'
'Accept-Encoding: x-nortel-short\r\n'
'Accept-Language: en-us,fr-fr\r\n'
'Allow: ACK,BYE,CANCEL,INFO,INVITE,OPTIONS,REGISTER,SUBSCRIBE,UPDATE\r\n'
'Authorization: Digest username="3122221000",realm="SomeRealm",nonce="1111790769596",uri="sip:3122211004@example.com",response="9bf77d8238664fe08dafd4d2abb6f1cb",algorithm=MD5\r\n'
'Call-Info: <https://lsc14pa.example.com:443/pa/direct/pictureServlet?user=3126805100@example.com>;Purpose=icon\r\n'
'Content-Disposition: session;handling=required\r\n'
'Content-Type: application/sdp\r\n'
'Date: Sat, 01 Feb 2014 22:07:34 GMT\r\n'
'Record-Route: <sip:200.25.3.230:5061;transport=tls;lr>\r\n'
'Require: sdp-anat\r\n'
'Retry-After: 30\r\n'
'Server: Blargomatic 2.0\r\n'
'Session-Expires: 1200\r\n'
'Supported: 100rel,histinfo,join,replaces,sdp-anat,timer\r\n'
'Timestamp: 1392061773\r\n'
'WWW-Authenticate: Digest algorithm=MD5,nonce="1111790769596",realm="SomeRealm"\r\n'
'Warning: 370 200.21.3.10 "Insufficient Bandwidth"\r\n'
'X-RTP-Stat: PR=0;ER=0;PL=0;RB=0/0;DE=PCMU;EN=PCMU;JI=0;DL=0,0;IP=10.1.0.33:16384,132.52.127.200:20048\r\n'
'x-channel: ds/ds1-3/12;IP=132.52.127.16\r\n'
'Referred-By: <sip:6006665100@example.com;user=phone> ; CorrelationID="0508817f84e7ce64745ef9753e2fbff4664321a4@200.23.3.240"\r\n'
'Refer-To: <sip:6006665499;rfrid=28661859@example.com;user=phone?x-nt-resource-priority=YNBvf.2j00qao>\r\n'
'Subject: Need more boxes\r\n'
'Referred-By: <sip:5556785103@example.com;user=phone> ; CorrelationID="348058f0947acec8745efd367e33542c5cb01436@192.168.0.3"\r\n'
'Refer-To: <sip:5556645204@example.com:5064;user=phone;transport=udp>\r\n'
'Allow-Events: dialog,message-summary\r\n'
'Event: refer;id=10498\r\n'
'Content-Encoding: gzip\r\n'
'RAck: 1 1 INVITE\r\n'
'P-Charge: <sip:6425555555@10.10.10.10>;npi=ISDN;noa=2\r\n'
'Reply-To: Bob <sip:bob@biloxi.com>\r\n'
'Unsupported: foo\r\n'
'P-Asserted-Identity: "500 - SIP Test" <sip:500@192.168.0.3>\r\n'
'P-Preferred-Identity: "User 5103" <sip:3126705103@192.168.0.3:5060>\r\n'
'Remote-Party-ID: "1234567890" <sip:1234567890@192.168.1.195>;party=calling;privacy=off;screen=no\r\n'
'Alert-Info: <cid:internal@example.com>;alert-type=internal\r\n'
'History-Info: "555122221002" <sip:555122221002@example.com>;index=1.1\r\n'
'P-Called-Party-Id: <sip:2135881@example.com;user=phone>\r\n'
'P-RTP-Stat: PS=0,OS=0,PR=5429,OR=955504,PL=0,JI=0,LA=0,DU=108\r\n'
'Privacy: id\r\n'
'Proxy-Authenticate: Digest realm="1.1.1.1", nonce="8dd33eb2-e3c4-11e5-a55b-83b175043a03", algorithm=MD5, qop="auth"\r\n'
'Proxy-Authorization: Digest username="100",realm="209.105.255.124",nonce="7bebcf02-e01d-11e5-931d-83b175043a03",uri="sip:90011@209.105.255.124",response="63faaa2604cae36e9b38f2d5cd0abba4",cnonce="4b41f53e6f00c05",nc=00000001,qop="auth",algorithm=MD5\r\n'
'Proxy-Require: foo\r\n'
'Reason: Q.850; cause=16; reason=Terminated\r\n'
'Record-Session-Expires: 1200;refresher=uac\r\n'
'Replaces: 19cd9bf094ff5f0c1745ef975c1cf65d34beb908f@192.168.0.3;to-tag=29bd570-f0a1ec8-13c5-50029-aa872-7d78286-aa872;from-tag=7ca31b4791\r\n'
'Subscription-State: active;reason=deactivated;expires=50\r\n'
'Min-Expires: 1800\r\n'
'Content-Length: 0')
return SIPMessageFactory().next_for_string(message_string)
| 77.83165 | 281 | 0.546418 | 2,834 | 23,116 | 4.451306 | 0.117149 | 0.03805 | 0.013317 | 0.024098 | 0.947444 | 0.943876 | 0.940388 | 0.940388 | 0.940388 | 0.940388 | 0 | 0.20564 | 0.312727 | 23,116 | 296 | 282 | 78.094595 | 0.588406 | 0.005451 | 0 | 0.903915 | 0 | 0.405694 | 0.643289 | 0.411921 | 0 | 0 | 0 | 0 | 0.014235 | 1 | 0.035587 | false | 0 | 0.021352 | 0.021352 | 0.096085 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
57239e5447b4797ce437d8eb8a8a8150fe696609 | 117 | py | Python | medimodule/Abdomen/__init__.py | cuchoco/MI2RLNet | 4ef84e641705df9b10e627c701eb0c9ed924a82a | ["Apache-2.0"] | 9 | 2021-02-25T23:10:17.000Z | 2022-02-14T11:48:11.000Z | medimodule/Abdomen/__init__.py | cuchoco/MI2RLNet | 4ef84e641705df9b10e627c701eb0c9ed924a82a | ["Apache-2.0"] | null | null | null | medimodule/Abdomen/__init__.py | cuchoco/MI2RLNet | 4ef84e641705df9b10e627c701eb0c9ed924a82a | ["Apache-2.0"] | 7 | 2021-02-22T12:20:24.000Z | 2022-03-07T04:56:53.000Z | from .module import LiverSegmentation
from .module import KidneyTumorSegmentation
from .module import PolypDetection
| 29.25 | 43 | 0.871795 | 12 | 117 | 8.5 | 0.5 | 0.294118 | 0.470588 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102564 | 117 | 3 | 44 | 39 | 0.971429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
93d393075268e58af5d88c945b70abc95a02f550 | 511 | py | Python | deprecated/cloudmesh/file.py | cloudmesh/cloudmesh-nist | 85766956c3c5b2395fcdddd2034cde574b1cbd73 | ["Apache-2.0"] | 1 | 2018-10-30T18:14:15.000Z | 2018-10-30T18:14:15.000Z | deprecated/cloudmesh/file.py | cloudmesh/cloudmesh-nist | 85766956c3c5b2395fcdddd2034cde574b1cbd73 | ["Apache-2.0"] | 28 | 2019-05-20T16:16:56.000Z | 2022-03-26T07:07:59.000Z | deprecated/cloudmesh/file.py | cloudmesh-community/nist | 85766956c3c5b2395fcdddd2034cde574b1cbd73 | ["Apache-2.0"] | 20 | 2018-07-02T09:38:22.000Z | 2019-03-06T10:55:59.000Z | def list():
'''Returns a list of files in the file store
'''
return 'TO BE IMPLEMENTED'
def add():
'''Uploads a file to the list of files in the file store
'''
return 'TO BE IMPLEMENTED'
def find_by_name(name):
'''Returns the named file in the file store
Params:
name - The name of the file
'''
return 'TO BE IMPLEMENTED'
def delete_by_name(name):
'''Deletes the named file in the file store
Params:
name - The name of the file
'''
return 'TO BE IMPLEMENTED'
| 13.810811 | 58 | 0.645793 | 81 | 511 | 4.024691 | 0.283951 | 0.128834 | 0.110429 | 0.171779 | 0.757669 | 0.748466 | 0.748466 | 0.748466 | 0.748466 | 0.748466 | 0 | 0 | 0.260274 | 511 | 36 | 59 | 14.194444 | 0.862434 | 0.508806 | 0 | 0.5 | 0 | 0 | 0.326923 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f5518f0a851fe70f458a6c4158a505f8889885ef | 112 | py | Python | winter/converters/__init__.py | EvgenySmekalin/winter | 24b6a02f958478547a4a120324823743a1f7e1a1 | ["MIT"] | 1 | 2020-03-28T14:54:28.000Z | 2020-03-28T14:54:28.000Z | winter/converters/__init__.py | EvgenySmekalin/winter | 24b6a02f958478547a4a120324823743a1f7e1a1 | ["MIT"] | null | null | null | winter/converters/__init__.py | EvgenySmekalin/winter | 24b6a02f958478547a4a120324823743a1f7e1a1 | ["MIT"] | null | null | null | from .convert_exception import ConvertException
from .converter import convert
from .converter import converter
| 28 | 47 | 0.866071 | 13 | 112 | 7.384615 | 0.461538 | 0.270833 | 0.395833 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107143 | 112 | 3 | 48 | 37.333333 | 0.96 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
f568bd104d934e3c323007a4313c8ef4d2d3c688 | 10,706 | py | Python | test/ui_tests.py | mathandy/faust_python | 8e8177cae52dd905b338e3063d8598178b4100ac | ["MIT"] | 25 | 2015-02-16T10:58:46.000Z | 2022-02-16T14:27:10.000Z | test/ui_tests.py | mathandy/faust_python | 8e8177cae52dd905b338e3063d8598178b4100ac | ["MIT"] | 2 | 2020-06-13T21:35:44.000Z | 2021-11-24T08:15:14.000Z | test/ui_tests.py | mathandy/faust_python | 8e8177cae52dd905b338e3063d8598178b4100ac | ["MIT"] | 4 | 2018-11-01T11:56:32.000Z | 2021-11-01T23:26:03.000Z | import os
import unittest
import cffi
from . helpers import init_ffi, empty
from FAUSTPy import PythonUI
#################################
# test PythonUI
#################################
def tearDownModule():
cffi.verifier.cleanup_tmpdir(
tmpdir=os.sep.join([os.path.dirname(__file__), "__pycache__"])
)
class test_faustui(unittest.TestCase):
def setUp(self):
self.obj = empty()
self.ffi, self.C = init_ffi()
# grab the C object from the PythonUI instance
self.ui = PythonUI(self.ffi, self.obj)
def test_attributes(self):
"Verify presence of various attributes."
self.assertTrue(hasattr(self.ui, "ui"))
def test_declare_group(self):
"Test declaration of group meta-data."
c_ui = self.ui.ui
c_ui.declare(c_ui.uiInterface, self.ffi.NULL, b"key1", b"val1")
c_ui.openTabBox(c_ui.uiInterface, b"box1")
c_ui.declare(c_ui.uiInterface, self.ffi.NULL, b"key1", b"val1")
c_ui.declare(c_ui.uiInterface, self.ffi.NULL, b"key2", b"val2")
c_ui.openTabBox(c_ui.uiInterface, b"box2")
c_ui.closeBox(c_ui.uiInterface)
c_ui.declare(c_ui.uiInterface, self.ffi.NULL, b"key2", b"val2")
c_ui.declare(c_ui.uiInterface, self.ffi.NULL, b"key3", b"val3")
c_ui.openTabBox(c_ui.uiInterface, b"box3")
c_ui.closeBox(c_ui.uiInterface)
c_ui.closeBox(c_ui.uiInterface)
self.assertTrue(hasattr(self.obj.b_box1, "metadata"))
self.assertTrue(hasattr(self.obj.b_box1.b_box2, "metadata"))
self.assertTrue(hasattr(self.obj.b_box1.b_box3, "metadata"))
self.assertDictEqual(self.obj.b_box1.metadata,
{b"key1": b"val1"})
self.assertDictEqual(self.obj.b_box1.b_box2.metadata,
{b"key1": b"val1", b"key2": b"val2"})
self.assertDictEqual(self.obj.b_box1.b_box3.metadata,
{b"key2": b"val2", b"key3": b"val3"})
def test_declare_parameter(self):
"Test declaration of parameter meta-data."
c_ui = self.ui.ui
param1 = self.ffi.new("FAUSTFLOAT*", 0.0)
param2 = self.ffi.new("FAUSTFLOAT*", 0.5)
param3 = self.ffi.new("FAUSTFLOAT*", 1.0)
c_ui.declare(c_ui.uiInterface, param1, b"key1", b"val1")
c_ui.addVerticalSlider(c_ui.uiInterface, b"slider1", param1, 0.0, 0.0,
2.0, 0.1)
c_ui.declare(c_ui.uiInterface, param2, b"key1", b"val1")
c_ui.declare(c_ui.uiInterface, param2, b"key2", b"val2")
c_ui.addHorizontalSlider(c_ui.uiInterface, b"slider2", param2, 0.0,
0.0, 2.0, 0.1)
c_ui.declare(c_ui.uiInterface, param3, b"key2", b"val2")
c_ui.declare(c_ui.uiInterface, param3, b"key3", b"val3")
c_ui.addNumEntry(c_ui.uiInterface, b"numentry", param3, 0.0, 0.0, 2.0,
0.1)
# closeBox triggers assignment of parameter meta-data
c_ui.closeBox(c_ui.uiInterface)
self.assertTrue(hasattr(self.obj.p_slider1, "metadata"))
self.assertTrue(hasattr(self.obj.p_slider2, "metadata"))
self.assertTrue(hasattr(self.obj.p_numentry, "metadata"))
self.assertDictEqual(self.obj.p_slider1.metadata,
{b"key1": b"val1"})
self.assertDictEqual(self.obj.p_slider2.metadata,
{b"key1": b"val1", b"key2": b"val2"})
self.assertDictEqual(self.obj.p_numentry.metadata,
{b"key2": b"val2", b"key3": b"val3"})
def test_openVerticalBox(self):
"Test the openVerticalBox C callback."
c_ui = self.ui.ui
c_ui.openVerticalBox(c_ui.uiInterface, b"box")
c_ui.closeBox(c_ui.uiInterface)
self.assertTrue(hasattr(self.obj, "b_box"))
self.assertEqual(self.obj.b_box.layout, "vertical")
self.assertEqual(self.obj.b_box.label, b"box")
def test_openHorizontalBox(self):
"Test the openHorizontalBox C callback."
c_ui = self.ui.ui
c_ui.openHorizontalBox(c_ui.uiInterface, b"box")
c_ui.closeBox(c_ui.uiInterface)
self.assertTrue(hasattr(self.obj, "b_box"))
self.assertEqual(self.obj.b_box.layout, "horizontal")
self.assertEqual(self.obj.b_box.label, b"box")
def test_openTabBox(self):
"Test the openTabBox C callback."
c_ui = self.ui.ui
c_ui.openTabBox(c_ui.uiInterface, b"box")
c_ui.closeBox(c_ui.uiInterface)
self.assertTrue(hasattr(self.obj, "b_box"))
self.assertEqual(self.obj.b_box.layout, "tab")
self.assertEqual(self.obj.b_box.label, b"box")
def test_closeBox(self):
"Test the closeBox C callback."
c_ui = self.ui.ui
c_ui.openVerticalBox(c_ui.uiInterface, b"box1")
c_ui.openHorizontalBox(c_ui.uiInterface, b"box2")
c_ui.closeBox(c_ui.uiInterface)
c_ui.openTabBox(c_ui.uiInterface, b"box3")
c_ui.closeBox(c_ui.uiInterface)
c_ui.closeBox(c_ui.uiInterface)
c_ui.openTabBox(c_ui.uiInterface, b"box4")
c_ui.closeBox(c_ui.uiInterface)
self.assertTrue(hasattr(self.obj, "b_box1"))
self.assertTrue(hasattr(self.obj.b_box1, "b_box2"))
self.assertTrue(hasattr(self.obj.b_box1, "b_box3"))
self.assertTrue(hasattr(self.obj, "b_box4"))
def test_addHorizontalSlider(self):
"Test the addHorizontalSlider C callback."
c_ui = self.ui.ui
param = self.ffi.new("FAUSTFLOAT*", 1.0)
self.assertEqual(param[0], 1.0)
c_ui.addHorizontalSlider(c_ui.uiInterface, b"slider", param, 0.0, 0.0,
2.0, 0.1)
self.assertTrue(hasattr(self.obj, "p_slider"))
self.assertEqual(self.obj.p_slider.label, b"slider")
self.assertEqual(self.obj.p_slider.zone, 0.0)
self.assertEqual(self.obj.p_slider.min, 0.0)
self.assertEqual(self.obj.p_slider.max, 2.0)
self.assertAlmostEqual(self.obj.p_slider.step, 0.1, 8)
self.assertEqual(self.obj.p_slider.default, 0.0)
self.assertEqual(self.obj.p_slider.metadata, {})
self.assertEqual(self.obj.p_slider.type, "HorizontalSlider")
self.obj.p_slider.zone = 0.5
self.assertEqual(self.obj.p_slider.zone, param[0])
def test_addVerticalSlider(self):
"Test the addVerticalSlider C callback."
c_ui = self.ui.ui
param = self.ffi.new("FAUSTFLOAT*", 1.0)
self.assertEqual(param[0], 1.0)
c_ui.addVerticalSlider(c_ui.uiInterface, b"slider", param, 0.0, 0.0,
2.0, 0.1)
self.assertTrue(hasattr(self.obj, "p_slider"))
self.assertEqual(self.obj.p_slider.label, b"slider")
self.assertEqual(self.obj.p_slider.zone, 0.0)
self.assertEqual(self.obj.p_slider.min, 0.0)
self.assertEqual(self.obj.p_slider.max, 2.0)
self.assertAlmostEqual(self.obj.p_slider.step, 0.1, 8)
self.assertEqual(self.obj.p_slider.default, 0.0)
self.assertEqual(self.obj.p_slider.metadata, {})
self.assertEqual(self.obj.p_slider.type, "VerticalSlider")
self.obj.p_slider.zone = 0.5
self.assertEqual(self.obj.p_slider.zone, param[0])
def test_addNumEntry(self):
"Test the addNumEntry C callback."
c_ui = self.ui.ui
param = self.ffi.new("FAUSTFLOAT*", 1.0)
self.assertEqual(param[0], 1.0)
c_ui.addNumEntry(c_ui.uiInterface, b"numentry", param, 0.0, 0.0, 2.0,
0.1)
self.assertTrue(hasattr(self.obj, "p_numentry"))
self.assertEqual(self.obj.p_numentry.label, b"numentry")
self.assertEqual(self.obj.p_numentry.zone, 0.0)
self.assertEqual(self.obj.p_numentry.min, 0.0)
self.assertEqual(self.obj.p_numentry.max, 2.0)
self.assertAlmostEqual(self.obj.p_numentry.step, 0.1, 8)
self.assertEqual(self.obj.p_numentry.default, 0.0)
self.assertEqual(self.obj.p_numentry.metadata, {})
self.assertEqual(self.obj.p_numentry.type, "NumEntry")
self.obj.p_numentry.zone = 0.5
self.assertEqual(self.obj.p_numentry.zone, param[0])
def test_addButton(self):
"Test the addButton C callback."
c_ui = self.ui.ui
param = self.ffi.new("FAUSTFLOAT*", 1.0)
c_ui.addButton(c_ui.uiInterface, b"button", param)
self.assertTrue(hasattr(self.obj, "p_button"))
self.assertEqual(self.obj.p_button.label, b"button")
self.assertEqual(self.obj.p_button.zone, 0.0)
self.assertEqual(self.obj.p_button.min, 0.0)
self.assertEqual(self.obj.p_button.max, 1.0)
self.assertEqual(self.obj.p_button.step, 1)
self.assertEqual(self.obj.p_button.default, 0.0)
self.assertEqual(self.obj.p_button.metadata, {})
self.assertEqual(self.obj.p_button.type, "Button")
self.obj.p_button.zone = 1
self.assertEqual(self.obj.p_button.zone, param[0])
def test_addToggleButton(self):
"Test the addToggleButton C callback."
c_ui = self.ui.ui
param = self.ffi.new("FAUSTFLOAT*", 1.0)
c_ui.addToggleButton(c_ui.uiInterface, b"button", param)
self.assertTrue(hasattr(self.obj, "p_button"))
self.assertEqual(self.obj.p_button.label, b"button")
self.assertEqual(self.obj.p_button.zone, 0.0)
self.assertEqual(self.obj.p_button.min, 0.0)
self.assertEqual(self.obj.p_button.max, 1.0)
self.assertEqual(self.obj.p_button.step, 1)
self.assertEqual(self.obj.p_button.default, 0.0)
self.assertEqual(self.obj.p_button.metadata, {})
self.assertEqual(self.obj.p_button.type, "ToggleButton")
self.obj.p_button.zone = 1
self.assertEqual(self.obj.p_button.zone, param[0])
def test_addCheckButton(self):
"Test the addCheckButton C callback."
c_ui = self.ui.ui
param = self.ffi.new("FAUSTFLOAT*", 1.0)
c_ui.addCheckButton(c_ui.uiInterface, b"button", param)
self.assertTrue(hasattr(self.obj, "p_button"))
self.assertEqual(self.obj.p_button.label, b"button")
self.assertEqual(self.obj.p_button.zone, 0.0)
self.assertEqual(self.obj.p_button.min, 0.0)
self.assertEqual(self.obj.p_button.max, 1.0)
self.assertEqual(self.obj.p_button.step, 1)
self.assertEqual(self.obj.p_button.default, 0.0)
self.assertEqual(self.obj.p_button.metadata, {})
self.assertEqual(self.obj.p_button.type, "CheckButton")
self.obj.p_button.zone = 1
self.assertEqual(self.obj.p_button.zone, param[0])
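# Editor's note (an assumption based on the relative imports above): the suite
# is meant to be run as part of the `test` package from the project root, e.g.
#     python -m unittest test.ui_tests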
| 38.37276 | 78 | 0.628339 | 1,526 | 10,706 | 4.26671 | 0.072739 | 0.099985 | 0.088466 | 0.192597 | 0.830287 | 0.812471 | 0.778068 | 0.711104 | 0.664107 | 0.613116 | 0 | 0.027838 | 0.228283 | 10,706 | 278 | 79 | 38.510791 | 0.760228 | 0.054455 | 0 | 0.504808 | 0 | 0 | 0.104503 | 0 | 0 | 0 | 0 | 0 | 0.427885 | 1 | 0.072115 | false | 0 | 0.024038 | 0 | 0.100962 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
191fd4754188b1a366915d5d6b47adbdfbf62591 | 11,102 | py | Python | calplus/tests/unit/v1/compute/test_client.py | nghiadt16/CALplus | 68c108e6abf7eeac4937b870dc7462dd6ee2fcc3 | [
"Apache-2.0"
] | null | null | null | calplus/tests/unit/v1/compute/test_client.py | nghiadt16/CALplus | 68c108e6abf7eeac4937b870dc7462dd6ee2fcc3 | [
"Apache-2.0"
] | 4 | 2017-04-05T16:14:07.000Z | 2018-12-14T14:19:15.000Z | calplus/tests/unit/v1/compute/test_client.py | nghiadt16/CALplus | 68c108e6abf7eeac4937b870dc7462dd6ee2fcc3 | [
"Apache-2.0"
] | 2 | 2017-04-18T16:53:58.000Z | 2018-12-04T05:42:51.000Z | import mock
from keystoneauth1.exceptions.base import ClientException
from calplus.tests import base
from calplus.v1.compute import client
fake_config_driver = {
'os_auth_url': 'http://controller:5000/v2_0',
'os_username': 'test',
'os_password': 'veryhard',
'os_project_name': 'demo',
'os_endpoint_url': 'http://controller:9696',
'os_driver_name': 'default',
'os_project_domain_name': 'default',
'os_user_domain_name': 'default',
'tenant_id': 'fake_tenant_id',
'limit': {
"vcpus": 10,
"instances": 10,
"ram": 5000
}
}
class ClientTest(base.TestCase):
"""docstring for ClientTest"""
def setUp(self):
super(ClientTest, self).setUp()
self.fake_client = client.Client(
'OpenStack', fake_config_driver)
def test_create_successfully(self):
self.mock_object(
self.fake_client.driver, 'create',
mock.Mock(return_value="return object"))
self.fake_client.create('1', '1',
'net_id', 'random', 1)
self.fake_client.driver.create.\
assert_called_once_with('1', '1',
'net_id', 'random', 1)
def test_create_unable_to_create(self):
self.mock_object(
self.fake_client.driver, 'create',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.create, '1', '1',
'net_id', 'random', 1)
self.fake_client.driver.create.\
assert_called_once_with('1', '1',
'net_id', 'random', 1)
def test_show_successfully(self):
self.mock_object(
self.fake_client.driver, 'show',
mock.Mock(return_value="return object"))
self.fake_client.show('fake_id')
self.fake_client.driver.show.\
assert_called_once_with('fake_id')
def test_show_unable_to_show(self):
self.mock_object(
self.fake_client.driver, 'show',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.show, 'fake_id')
self.fake_client.driver.show.\
assert_called_once_with('fake_id')
def test_list_successfully(self):
self.mock_object(
self.fake_client.driver, 'list',
mock.Mock(return_value="list server objects"))
self.fake_client.list()
self.fake_client.driver.list.\
assert_called_once_with()
def test_list_unable_to_list(self):
self.mock_object(
self.fake_client.driver, 'list',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.list)
self.fake_client.driver.list.\
assert_called_once_with()
def test_delete_successfully(self):
self.mock_object(
self.fake_client.driver, 'delete',
mock.Mock(return_value=True))
self.fake_client.delete('fake_id')
self.fake_client.driver.delete.\
assert_called_once_with('fake_id')
def test_delete_unable_to_delete(self):
self.mock_object(
self.fake_client.driver, 'delete',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.delete, 'fake_id')
self.fake_client.driver.delete.\
assert_called_once_with('fake_id')
def test_shutdown_successfully(self):
self.mock_object(
self.fake_client.driver, 'shutdown',
mock.Mock(return_value=True))
self.fake_client.shutdown('fake_id')
self.fake_client.driver.shutdown.\
assert_called_once_with('fake_id')
def test_shutdown_unable_to_shutdown(self):
self.mock_object(
self.fake_client.driver, 'shutdown',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.shutdown, 'fake_id')
self.fake_client.driver.shutdown.\
assert_called_once_with('fake_id')
def test_start_successfully(self):
self.mock_object(
self.fake_client.driver, 'start',
mock.Mock(return_value=True))
self.fake_client.start('fake_id')
self.fake_client.driver.start.\
assert_called_once_with('fake_id')
def test_start_unable_to_start(self):
self.mock_object(
self.fake_client.driver, 'start',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.start, 'fake_id')
self.fake_client.driver.start.\
assert_called_once_with('fake_id')
def test_reboot_successfully(self):
self.mock_object(
self.fake_client.driver, 'reboot',
mock.Mock(return_value=True))
self.fake_client.reboot('fake_id')
self.fake_client.driver.reboot.\
assert_called_once_with('fake_id')
def test_reboot_unable_to_reboot(self):
self.mock_object(
self.fake_client.driver, 'reboot',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.reboot, 'fake_id')
self.fake_client.driver.reboot.\
assert_called_once_with('fake_id')
def test_resize_successfully(self):
self.mock_object(
self.fake_client.driver, 'resize',
mock.Mock(return_value=True))
self.fake_client.resize('fake_id',
'fake_configuration')
self.fake_client.driver.resize.\
assert_called_once_with('fake_id',
'fake_configuration')
def test_resize_unable_to_resize(self):
self.mock_object(
self.fake_client.driver, 'resize',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.resize, 'fake_id',
'fake_configuration')
self.fake_client.driver.resize.\
assert_called_once_with('fake_id',
'fake_configuration')
def test_add_nic_successfully(self):
self.mock_object(
self.fake_client.driver, 'add_nic',
mock.Mock(return_value=True))
self.fake_client.add_nic('fake_id',
'fake_net_id')
self.fake_client.driver.add_nic.\
assert_called_once_with('fake_id',
'fake_net_id')
def test_add_nic_unable_to_add(self):
self.mock_object(
self.fake_client.driver, 'add_nic',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.add_nic, 'fake_id',
'fake_net_id')
self.fake_client.driver.add_nic.\
assert_called_once_with('fake_id',
'fake_net_id')
def test_delete_nic_successfully(self):
self.mock_object(
self.fake_client.driver, 'delete_nic',
mock.Mock(return_value=True))
self.fake_client.delete_nic('fake_id',
'fake_port_id')
self.fake_client.driver.delete_nic.\
assert_called_once_with('fake_id',
'fake_port_id')
def test_delete_nic_unable_to_delete(self):
self.mock_object(
self.fake_client.driver, 'delete_nic',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.delete_nic, 'fake_id',
'fake_port_id')
self.fake_client.driver.delete_nic.\
assert_called_once_with('fake_id',
'fake_port_id')
def test_list_nic_successfully(self):
self.mock_object(
self.fake_client.driver, 'list_nic',
mock.Mock(return_value=True))
self.fake_client.list_nic('fake_id')
self.fake_client.driver.list_nic.\
assert_called_once_with('fake_id')
def test_list_nic_unable_to_list(self):
self.mock_object(
self.fake_client.driver, 'list_nic',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.list_nic, 'fake_id')
self.fake_client.driver.list_nic.\
assert_called_once_with('fake_id')
def test_associate_public_ip_successfully(self):
self.mock_object(
self.fake_client.driver, 'associate_public_ip',
mock.Mock(return_value=True))
self.fake_client.associate_public_ip(
'fake_id', 'fake_public_ip_id')
self.fake_client.driver.associate_public_ip.\
assert_called_once_with(
'fake_id', 'fake_public_ip_id', None)
def test_associate_public_ip_unable_to_associate(self):
self.mock_object(
self.fake_client.driver, 'associate_public_ip',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.associate_public_ip,
'fake_id', 'fake_public_ip_id')
self.fake_client.driver.associate_public_ip.\
assert_called_once_with(
'fake_id', 'fake_public_ip_id', None)
def test_disassociate_public_ip_successfully(self):
self.mock_object(
self.fake_client.driver, 'disassociate_public_ip',
mock.Mock(return_value=True))
self.fake_client.disassociate_public_ip(
'fake_public_ip_id')
self.fake_client.driver.disassociate_public_ip.\
assert_called_once_with('fake_public_ip_id')
def test_disassociate_public_ip_unable_to_disassociate(self):
self.mock_object(
self.fake_client.driver, 'disassociate_public_ip',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.disassociate_public_ip, 'fake_public_ip_id')
self.fake_client.driver.disassociate_public_ip.\
assert_called_once_with('fake_public_ip_id')
def test_list_ip_successfully(self):
self.mock_object(
self.fake_client.driver, 'list_ip',
mock.Mock(return_value=True))
self.fake_client.list_ip('fake_id')
self.fake_client.driver.list_ip.\
assert_called_once_with('fake_id')
def test_list_ip_unable_to_list(self):
self.mock_object(
self.fake_client.driver, 'list_ip',
mock.Mock(side_effect=ClientException))
self.assertRaises(ClientException,
self.fake_client.list_ip, 'fake_id')
self.fake_client.driver.list_ip.\
assert_called_once_with('fake_id')
| 31.72 | 73 | 0.615655 | 1,279 | 11,102 | 4.961689 | 0.068804 | 0.107154 | 0.18752 | 0.176489 | 0.88607 | 0.872518 | 0.872518 | 0.872518 | 0.863063 | 0.823668 | 0 | 0.004027 | 0.284273 | 11,102 | 349 | 74 | 31.810888 | 0.794614 | 0.002162 | 0 | 0.675781 | 0 | 0 | 0.108652 | 0.005961 | 0 | 0 | 0 | 0 | 0.164063 | 1 | 0.113281 | false | 0.003906 | 0.015625 | 0 | 0.132813 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
1987c7257738d223d6a3d61a60f5292827463802 | 82 | py | Python | legacy/btmux-mux20/mux/game/pymods/btwiz.py | murrayma/btmux | 3519fdbfb9d5d27b4ce8e46ee16796961f1a0bfa | [
"ClArtistic",
"Naumen",
"Condor-1.1",
"MS-PL"
] | 1 | 2020-07-09T17:37:42.000Z | 2020-07-09T17:37:42.000Z | legacy/btmux-mux20/mux/game/pymods/btwiz.py | murrayma/btmux | 3519fdbfb9d5d27b4ce8e46ee16796961f1a0bfa | [
"ClArtistic",
"Naumen",
"Condor-1.1",
"MS-PL"
] | null | null | null | legacy/btmux-mux20/mux/game/pymods/btwiz.py | murrayma/btmux | 3519fdbfb9d5d27b4ce8e46ee16796961f1a0bfa | [
"ClArtistic",
"Naumen",
"Condor-1.1",
"MS-PL"
] | 1 | 2020-11-07T00:02:47.000Z | 2020-11-07T00:02:47.000Z | def hook_load():
pass
def hook_save():
pass
def hook_update():
pass
| 9.111111 | 18 | 0.609756 | 12 | 82 | 3.916667 | 0.5 | 0.446809 | 0.468085 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.280488 | 82 | 8 | 19 | 10.25 | 0.79661 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
19984534920ad4e7810e4a6491fa5e6dc8b362eb | 35,833 | py | Python | Programs/fading_fuse.py | ShineTop/PiGlow | 3b87aca3a36a9cc2076ccebdccc5eb7a61855aa7 | [
"MIT"
] | 5 | 2018-03-16T19:09:50.000Z | 2022-02-06T21:37:35.000Z | Programs/fading_fuse.py | Breakfast-for-Pigeons/PiGlow | 3b87aca3a36a9cc2076ccebdccc5eb7a61855aa7 | [
"MIT"
] | 3 | 2018-10-02T16:06:19.000Z | 2020-03-01T19:07:31.000Z | Programs/fading_fuse.py | Breakfast-for-Pigeons/PiGlow | 3b87aca3a36a9cc2076ccebdccc5eb7a61855aa7 | [
"MIT"
] | 1 | 2020-12-19T01:30:06.000Z | 2020-12-19T01:30:06.000Z | #!/usr/bin/env python3
"""
Fading Fuse
Exactly like my "Light the Fuse" program, except that the fuse fades.
This program lights up arm 1, then arms 2 and 3 at the same time; next it
lights up arm 2, followed by arms 1 and 3; finally it lights up arm 3,
followed by arms 1 and 2.
....................
Functions:
- fading_fuse: Controls which fuse to light
- fading_fuse_1: Lights up arm 1, then 2 and 3. LEDs fade out.
- fading_fuse_2: Lights up arm 2, then 1 and 3. LEDs fade out.
- fading_fuse_3: Lights up arm 3, then 1 and 2. LEDs fade out.
- go_faster: Sleep_speed is 0.01. Cycle through the LEDs 10 times
....................
Requirements:
PyGlow.py (many thanks to benleb for this program)
bfp_piglow_modules.py
You will have these files if you downloaded the entire repository.
....................
Author: Paul Ryan
This program was written on a Raspberry Pi using the Geany IDE.
"""
########################################################################
# Import modules #
########################################################################
import logging
from time import sleep
from PyGlow import PyGlow
from bfp_piglow_modules import print_header
from bfp_piglow_modules import check_log_directory
from bfp_piglow_modules import delete_empty_logs
from bfp_piglow_modules import stop
########################################################################
# Initialize #
########################################################################
PYGLOW = PyGlow()
PYGLOW.all(0)
SLEEP_SPEED = 0.10
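# Editor's note, inferred from the LED comments in the functions below (the
# original author does not state it explicitly): the PiGlow's three arms map
# onto PyGlow LED numbers as arm 1 -> LEDs 1-6, arm 2 -> LEDs 7-12 and
# arm 3 -> LEDs 13-18, each running Red, Orange, Yellow, Green, Blue, White,
# which is why those magic numbers appear throughout.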
########################################################################
# Functions #
########################################################################
def fading_fuse_1(sleep_speed):
"""
Lights up arm 1, then 2 and 3. LEDs fade out.
"""
LOGGER.debug("Fading Fuse 1")
# Turn on Arm 1 Red
PYGLOW.led(1, 120)
sleep(sleep_speed)
# Turn on Arm 1 Orange
PYGLOW.led(2, 120)
sleep(sleep_speed)
# Fade Arm 1 Red
PYGLOW.led(1, 110)
sleep(sleep_speed)
# Turn on Arm 1 Yellow
PYGLOW.led(3, 120)
sleep(sleep_speed)
# Fade Arm 1 R, O
PYGLOW.led(1, 100)
sleep(sleep_speed)
PYGLOW.led(2, 110)
sleep(sleep_speed)
# Turn on Arm 1 Green
PYGLOW.led(4, 120)
sleep(sleep_speed)
# Fade Arm 1 R, O, Y
PYGLOW.led(1, 90)
sleep(sleep_speed)
PYGLOW.led(2, 100)
sleep(sleep_speed)
PYGLOW.led(3, 110)
sleep(sleep_speed)
# Turn on Arm 1 Blue
PYGLOW.led(5, 120)
sleep(sleep_speed)
# Fade Arm 1 R, O, Y, G
PYGLOW.led(1, 80)
sleep(sleep_speed)
PYGLOW.led(2, 90)
sleep(sleep_speed)
PYGLOW.led(3, 100)
sleep(sleep_speed)
PYGLOW.led(4, 110)
sleep(sleep_speed)
# Arm 1, White
PYGLOW.led(6, 120)
sleep(sleep_speed)
# Fade Arm 1 R, O, Y, G, B
PYGLOW.led(1, 70)
sleep(sleep_speed)
PYGLOW.led(2, 80)
sleep(sleep_speed)
PYGLOW.led(3, 90)
sleep(sleep_speed)
PYGLOW.led(4, 100)
sleep(sleep_speed)
PYGLOW.led(5, 110)
sleep(sleep_speed)
# Turn on Arm 2 and 3 White
PYGLOW.led(12, 120)
PYGLOW.led(18, 120)
sleep(sleep_speed)
# Fade Arm 1 R, O, Y, G, B, W
PYGLOW.led(1, 60)
sleep(sleep_speed)
PYGLOW.led(2, 70)
sleep(sleep_speed)
PYGLOW.led(3, 80)
sleep(sleep_speed)
PYGLOW.led(4, 90)
sleep(sleep_speed)
PYGLOW.led(5, 100)
sleep(sleep_speed)
PYGLOW.led(6, 110)
sleep(sleep_speed)
# Turn on Arm 2 and 3 Blue
PYGLOW.led(11, 120)
PYGLOW.led(17, 120)
sleep(sleep_speed)
# Fade Arm 1 R, O, Y, G, B, W, Arm 2 and 3 W
PYGLOW.led(1, 50)
sleep(sleep_speed)
PYGLOW.led(2, 60)
sleep(sleep_speed)
PYGLOW.led(3, 70)
sleep(sleep_speed)
PYGLOW.led(4, 80)
sleep(sleep_speed)
PYGLOW.led(5, 90)
sleep(sleep_speed)
PYGLOW.led(6, 100)
sleep(sleep_speed)
PYGLOW.led(12, 110)
PYGLOW.led(18, 110)
sleep(sleep_speed)
# Arm 2 and 3, Green
PYGLOW.led(10, 120)
PYGLOW.led(16, 120)
sleep(sleep_speed)
# Fade Arm 1 R, O, Y, G, B, W, Arm 2 and 3 W, B
PYGLOW.led(1, 40)
sleep(sleep_speed)
PYGLOW.led(2, 50)
sleep(sleep_speed)
PYGLOW.led(3, 60)
sleep(sleep_speed)
PYGLOW.led(4, 70)
sleep(sleep_speed)
PYGLOW.led(5, 80)
sleep(sleep_speed)
PYGLOW.led(6, 90)
sleep(sleep_speed)
PYGLOW.led(12, 100)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(11, 110)
PYGLOW.led(17, 110)
sleep(sleep_speed)
# Turn on Arm 2 and 3 Yellow
PYGLOW.led(9, 120)
PYGLOW.led(15, 120)
sleep(sleep_speed)
# Fade Arm 1 R, O, Y, G, B, W, Arm 2 and 3 W, B, G
PYGLOW.led(1, 30)
sleep(sleep_speed)
PYGLOW.led(2, 40)
sleep(sleep_speed)
PYGLOW.led(3, 50)
sleep(sleep_speed)
PYGLOW.led(4, 60)
sleep(sleep_speed)
PYGLOW.led(5, 70)
sleep(sleep_speed)
PYGLOW.led(6, 80)
sleep(sleep_speed)
PYGLOW.led(12, 90)
PYGLOW.led(18, 90)
sleep(sleep_speed)
PYGLOW.led(11, 100)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(10, 110)
PYGLOW.led(16, 110)
sleep(sleep_speed)
# Turn on Arm 2 and 3 Orange
PYGLOW.led(8, 120)
PYGLOW.led(14, 120)
sleep(sleep_speed)
# Fade Arm 1 R, O, Y, G, B, W, Arm 2 and 3 W, B, G, Y
PYGLOW.led(1, 20)
sleep(sleep_speed)
PYGLOW.led(2, 30)
sleep(sleep_speed)
PYGLOW.led(3, 40)
sleep(sleep_speed)
PYGLOW.led(4, 50)
sleep(sleep_speed)
PYGLOW.led(5, 60)
sleep(sleep_speed)
PYGLOW.led(6, 70)
sleep(sleep_speed)
PYGLOW.led(12, 80)
PYGLOW.led(18, 80)
sleep(sleep_speed)
PYGLOW.led(11, 90)
PYGLOW.led(17, 90)
sleep(sleep_speed)
PYGLOW.led(10, 100)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(9, 110)
PYGLOW.led(15, 110)
sleep(sleep_speed)
# Turn on Arm 2 and 3, Red
PYGLOW.led(7, 120)
PYGLOW.led(13, 120)
sleep(sleep_speed)
# Fade Arm 1 R, O, Y, G, B, W, Arm 2 and 3 W, B, G, Y, O
PYGLOW.led(1, 10)
sleep(sleep_speed)
PYGLOW.led(2, 20)
sleep(sleep_speed)
PYGLOW.led(3, 30)
sleep(sleep_speed)
PYGLOW.led(4, 40)
sleep(sleep_speed)
PYGLOW.led(5, 50)
sleep(sleep_speed)
PYGLOW.led(6, 60)
sleep(sleep_speed)
PYGLOW.led(12, 70)
PYGLOW.led(18, 70)
sleep(sleep_speed)
PYGLOW.led(11, 80)
PYGLOW.led(17, 80)
sleep(sleep_speed)
PYGLOW.led(10, 90)
PYGLOW.led(16, 90)
sleep(sleep_speed)
PYGLOW.led(9, 100)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(8, 110)
PYGLOW.led(14, 110)
sleep(sleep_speed)
# Fade Arm 1 R, O, Y, G, B, W, Arm 2 and 3 W, B, G, Y, O, R
PYGLOW.led(1, 0)
sleep(sleep_speed)
PYGLOW.led(2, 10)
sleep(sleep_speed)
PYGLOW.led(3, 20)
sleep(sleep_speed)
PYGLOW.led(4, 30)
sleep(sleep_speed)
PYGLOW.led(5, 40)
sleep(sleep_speed)
PYGLOW.led(6, 50)
sleep(sleep_speed)
PYGLOW.led(12, 60)
PYGLOW.led(18, 60)
sleep(sleep_speed)
PYGLOW.led(11, 70)
PYGLOW.led(17, 70)
sleep(sleep_speed)
PYGLOW.led(10, 80)
PYGLOW.led(16, 80)
sleep(sleep_speed)
PYGLOW.led(9, 90)
PYGLOW.led(15, 90)
sleep(sleep_speed)
PYGLOW.led(8, 100)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(7, 110)
PYGLOW.led(13, 110)
sleep(sleep_speed)
# Fade Arm 1 O, Y, G, B, W, Arm 2 and 3 W, B, G, Y, O, R
PYGLOW.led(2, 0)
sleep(sleep_speed)
PYGLOW.led(3, 10)
sleep(sleep_speed)
PYGLOW.led(4, 20)
sleep(sleep_speed)
PYGLOW.led(5, 30)
sleep(sleep_speed)
PYGLOW.led(6, 40)
sleep(sleep_speed)
PYGLOW.led(12, 50)
PYGLOW.led(18, 50)
sleep(sleep_speed)
PYGLOW.led(11, 60)
PYGLOW.led(17, 60)
sleep(sleep_speed)
PYGLOW.led(10, 70)
PYGLOW.led(16, 70)
sleep(sleep_speed)
PYGLOW.led(9, 80)
PYGLOW.led(15, 80)
sleep(sleep_speed)
PYGLOW.led(8, 90)
PYGLOW.led(14, 90)
sleep(sleep_speed)
PYGLOW.led(7, 100)
PYGLOW.led(13, 100)
sleep(sleep_speed)
# Fade Arm 1 Y, G, B, W, Arm 2 and 3 W, B, G, Y, O, R
PYGLOW.led(3, 0)
sleep(sleep_speed)
PYGLOW.led(4, 10)
sleep(sleep_speed)
PYGLOW.led(5, 20)
sleep(sleep_speed)
PYGLOW.led(6, 30)
sleep(sleep_speed)
PYGLOW.led(12, 40)
PYGLOW.led(18, 40)
sleep(sleep_speed)
PYGLOW.led(11, 50)
PYGLOW.led(17, 50)
sleep(sleep_speed)
PYGLOW.led(10, 60)
PYGLOW.led(16, 60)
sleep(sleep_speed)
PYGLOW.led(9, 70)
PYGLOW.led(15, 70)
sleep(sleep_speed)
PYGLOW.led(8, 80)
PYGLOW.led(14, 80)
sleep(sleep_speed)
PYGLOW.led(7, 90)
PYGLOW.led(13, 90)
sleep(sleep_speed)
# Fade Arm 1 G, B, W, Arm 2 and 3 W, B, G, Y, O, R
PYGLOW.led(4, 0)
sleep(sleep_speed)
PYGLOW.led(5, 10)
sleep(sleep_speed)
PYGLOW.led(6, 20)
sleep(sleep_speed)
PYGLOW.led(12, 30)
PYGLOW.led(18, 30)
sleep(sleep_speed)
PYGLOW.led(11, 40)
PYGLOW.led(17, 40)
sleep(sleep_speed)
PYGLOW.led(10, 50)
PYGLOW.led(16, 50)
sleep(sleep_speed)
PYGLOW.led(9, 60)
PYGLOW.led(15, 60)
sleep(sleep_speed)
PYGLOW.led(8, 70)
PYGLOW.led(14, 70)
sleep(sleep_speed)
PYGLOW.led(7, 80)
PYGLOW.led(13, 80)
sleep(sleep_speed)
# Fade Arm 1 B, W, Arm 2 and 3 W, B, G, Y, O, R
PYGLOW.led(5, 0)
sleep(sleep_speed)
PYGLOW.led(6, 10)
sleep(sleep_speed)
PYGLOW.led(12, 20)
PYGLOW.led(18, 20)
sleep(sleep_speed)
PYGLOW.led(11, 30)
PYGLOW.led(17, 30)
sleep(sleep_speed)
PYGLOW.led(10, 40)
PYGLOW.led(16, 40)
sleep(sleep_speed)
PYGLOW.led(9, 50)
PYGLOW.led(15, 50)
sleep(sleep_speed)
PYGLOW.led(8, 60)
PYGLOW.led(14, 60)
sleep(sleep_speed)
PYGLOW.led(7, 70)
PYGLOW.led(13, 70)
sleep(sleep_speed)
# Fade Arm 1 W, Arm 2 and 3 W, B, G, Y, O, R
PYGLOW.led(6, 0)
sleep(sleep_speed)
PYGLOW.led(12, 10)
PYGLOW.led(18, 10)
sleep(sleep_speed)
PYGLOW.led(11, 20)
PYGLOW.led(17, 20)
sleep(sleep_speed)
PYGLOW.led(10, 30)
PYGLOW.led(16, 30)
sleep(sleep_speed)
PYGLOW.led(9, 40)
PYGLOW.led(15, 40)
sleep(sleep_speed)
PYGLOW.led(8, 50)
PYGLOW.led(14, 50)
sleep(sleep_speed)
PYGLOW.led(7, 60)
PYGLOW.led(13, 60)
sleep(sleep_speed)
# Fade Arm 2 and 3 W, B, G, Y, O, R
PYGLOW.led(12, 0)
PYGLOW.led(18, 0)
sleep(sleep_speed)
PYGLOW.led(11, 10)
PYGLOW.led(17, 10)
sleep(sleep_speed)
PYGLOW.led(10, 20)
PYGLOW.led(16, 20)
sleep(sleep_speed)
PYGLOW.led(9, 30)
PYGLOW.led(15, 30)
sleep(sleep_speed)
PYGLOW.led(8, 40)
PYGLOW.led(14, 40)
sleep(sleep_speed)
PYGLOW.led(7, 50)
PYGLOW.led(13, 50)
sleep(sleep_speed)
# Fade Arm 2 and 3 B, G, Y, O, R
PYGLOW.led(11, 0)
PYGLOW.led(17, 0)
sleep(sleep_speed)
PYGLOW.led(10, 10)
PYGLOW.led(16, 10)
sleep(sleep_speed)
PYGLOW.led(9, 20)
PYGLOW.led(15, 20)
sleep(sleep_speed)
PYGLOW.led(8, 30)
PYGLOW.led(14, 30)
sleep(sleep_speed)
PYGLOW.led(7, 40)
PYGLOW.led(13, 40)
sleep(sleep_speed)
# Fade Arm 2 and 3 G, Y, O, R
PYGLOW.led(10, 0)
PYGLOW.led(16, 0)
sleep(sleep_speed)
PYGLOW.led(9, 10)
PYGLOW.led(15, 10)
sleep(sleep_speed)
PYGLOW.led(8, 20)
PYGLOW.led(14, 20)
sleep(sleep_speed)
PYGLOW.led(7, 30)
PYGLOW.led(13, 30)
sleep(sleep_speed)
# Fade Arm 2 and 3 Y, O, R
PYGLOW.led(9, 0)
PYGLOW.led(15, 0)
sleep(sleep_speed)
PYGLOW.led(8, 10)
PYGLOW.led(14, 10)
sleep(sleep_speed)
PYGLOW.led(7, 20)
PYGLOW.led(13, 20)
sleep(sleep_speed)
# Fade Arm 2 and 3 O, R
PYGLOW.led(8, 0)
PYGLOW.led(14, 0)
sleep(sleep_speed)
PYGLOW.led(7, 10)
PYGLOW.led(13, 10)
sleep(sleep_speed)
# Fade Arm 2 and 3 R
PYGLOW.led(7, 0)
PYGLOW.led(13, 0)
sleep(sleep_speed)
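# Editor's sketch (not part of the original program): each fading_fuse_*
# function repeats one idea -- light the next LED at brightness 120, then dim
# every LED lit before it by 10 on every step. A simplified, single-sequence
# version of that idea, reusing the PYGLOW object and sleep() imported above,
# could look like this (the fade_along name is an assumption, the original
# program does not define it):
def fade_along(leds, sleep_speed):
    """Light `leds` in order while fading the ones already lit."""
    brightness = {}
    for led in leds:
        PYGLOW.led(led, 120)              # light the next LED fully
        sleep(sleep_speed)
        for lit in brightness:            # dim everything lit before it
            brightness[lit] = max(brightness[lit] - 10, 0)
            PYGLOW.led(lit, brightness[lit])
            sleep(sleep_speed)
        brightness[led] = 120
    while any(brightness.values()):       # let the remaining glow die out
        for lit in brightness:
            brightness[lit] = max(brightness[lit] - 10, 0)
            PYGLOW.led(lit, brightness[lit])
            sleep(sleep_speed)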
def fading_fuse_2(sleep_speed):
"""
Lights up arm 2, then 1 and 3. LEDs fade out.
"""
LOGGER.debug("Fading Fuse 2")
# Turn on Arm 2 Red
PYGLOW.led(7, 120)
sleep(sleep_speed)
# Turn on Arm 2 Orange
PYGLOW.led(8, 120)
sleep(sleep_speed)
# Fade Arm 2 Red
PYGLOW.led(7, 110)
sleep(sleep_speed)
# Turn on Arm 2 Yellow
PYGLOW.led(9, 120)
sleep(sleep_speed)
# Fade Arm 2 R, O
PYGLOW.led(7, 100)
sleep(sleep_speed)
PYGLOW.led(8, 110)
sleep(sleep_speed)
# Turn on Arm 2 Green
PYGLOW.led(10, 120)
sleep(sleep_speed)
# Fade Arm 2 R, O, Y
PYGLOW.led(7, 90)
sleep(sleep_speed)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(9, 110)
sleep(sleep_speed)
# Turn on Arm 2 Blue
PYGLOW.led(11, 120)
sleep(sleep_speed)
# Fade Arm 2 R, O, Y, G
PYGLOW.led(7, 80)
sleep(sleep_speed)
PYGLOW.led(8, 90)
sleep(sleep_speed)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(10, 110)
sleep(sleep_speed)
# Arm 2, White
PYGLOW.led(12, 120)
sleep(sleep_speed)
# Fade Arm 2 R, O, Y, G, B
PYGLOW.led(7, 70)
sleep(sleep_speed)
PYGLOW.led(8, 80)
sleep(sleep_speed)
PYGLOW.led(9, 90)
sleep(sleep_speed)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(11, 110)
sleep(sleep_speed)
# Turn on Arm 1 and 3 White
PYGLOW.led(6, 120)
PYGLOW.led(18, 120)
sleep(sleep_speed)
# Fade Arm 2 R, O, Y, G, B, W
PYGLOW.led(7, 60)
sleep(sleep_speed)
PYGLOW.led(8, 70)
sleep(sleep_speed)
PYGLOW.led(9, 80)
sleep(sleep_speed)
PYGLOW.led(10, 90)
sleep(sleep_speed)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(12, 110)
sleep(sleep_speed)
# Turn on Arm 1 and 3 Blue
PYGLOW.led(5, 120)
PYGLOW.led(17, 120)
sleep(sleep_speed)
# Fade Arm 2 R, O, Y, G, B, W, Arm 1 and 3 W
PYGLOW.led(7, 50)
sleep(sleep_speed)
PYGLOW.led(8, 60)
sleep(sleep_speed)
PYGLOW.led(9, 70)
sleep(sleep_speed)
PYGLOW.led(10, 80)
sleep(sleep_speed)
PYGLOW.led(11, 90)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(6, 110)
PYGLOW.led(18, 110)
sleep(sleep_speed)
# Arm 1 and 3, Green
PYGLOW.led(4, 120)
PYGLOW.led(16, 120)
sleep(sleep_speed)
# Fade Arm 2 R, O, Y, G, B, W, Arm 1 and 3 W, B
PYGLOW.led(7, 40)
sleep(sleep_speed)
PYGLOW.led(8, 50)
sleep(sleep_speed)
PYGLOW.led(9, 60)
sleep(sleep_speed)
PYGLOW.led(10, 70)
sleep(sleep_speed)
PYGLOW.led(11, 80)
sleep(sleep_speed)
PYGLOW.led(12, 90)
sleep(sleep_speed)
PYGLOW.led(6, 100)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(5, 110)
PYGLOW.led(17, 110)
sleep(sleep_speed)
# Turn on Arm 1 and 3 Yellow
PYGLOW.led(3, 120)
PYGLOW.led(15, 120)
sleep(sleep_speed)
# Fade Arm 2 R, O, Y, G, B, W, Arm 1 and 3 W, B, G
PYGLOW.led(7, 30)
sleep(sleep_speed)
PYGLOW.led(8, 40)
sleep(sleep_speed)
PYGLOW.led(9, 50)
sleep(sleep_speed)
PYGLOW.led(10, 60)
sleep(sleep_speed)
PYGLOW.led(11, 70)
sleep(sleep_speed)
PYGLOW.led(12, 80)
sleep(sleep_speed)
PYGLOW.led(6, 90)
PYGLOW.led(18, 90)
sleep(sleep_speed)
PYGLOW.led(5, 100)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(4, 110)
PYGLOW.led(16, 110)
sleep(sleep_speed)
# Turn on Arm 1 and 3 Orange
PYGLOW.led(2, 120)
PYGLOW.led(14, 120)
sleep(sleep_speed)
# Fade Arm 2 R, O, Y, G, B, W, Arm 1 and 3 W, B, G, Y
PYGLOW.led(7, 20)
sleep(sleep_speed)
PYGLOW.led(8, 30)
sleep(sleep_speed)
PYGLOW.led(9, 40)
sleep(sleep_speed)
PYGLOW.led(10, 50)
sleep(sleep_speed)
PYGLOW.led(11, 60)
sleep(sleep_speed)
PYGLOW.led(12, 70)
sleep(sleep_speed)
PYGLOW.led(6, 80)
PYGLOW.led(18, 80)
sleep(sleep_speed)
PYGLOW.led(5, 90)
PYGLOW.led(17, 90)
sleep(sleep_speed)
PYGLOW.led(4, 100)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(3, 110)
PYGLOW.led(15, 110)
sleep(sleep_speed)
# Turn on Arm 1 and 3, Red
PYGLOW.led(1, 120)
PYGLOW.led(13, 120)
sleep(sleep_speed)
# Fade Arm 2 R, O, Y, G, B, W, Arm 1 and 3 W, B, G, Y, O
PYGLOW.led(7, 10)
sleep(sleep_speed)
PYGLOW.led(8, 20)
sleep(sleep_speed)
PYGLOW.led(9, 30)
sleep(sleep_speed)
PYGLOW.led(10, 40)
sleep(sleep_speed)
PYGLOW.led(11, 50)
sleep(sleep_speed)
PYGLOW.led(12, 60)
sleep(sleep_speed)
PYGLOW.led(6, 70)
PYGLOW.led(18, 70)
sleep(sleep_speed)
PYGLOW.led(5, 80)
PYGLOW.led(17, 80)
sleep(sleep_speed)
PYGLOW.led(4, 90)
PYGLOW.led(16, 90)
sleep(sleep_speed)
PYGLOW.led(3, 100)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(2, 110)
PYGLOW.led(14, 110)
sleep(sleep_speed)
# Fade Arm 2 R, O, Y, G, B, W, Arm 1 and 3 W, B, G, Y, O, R
PYGLOW.led(7, 0)
sleep(sleep_speed)
PYGLOW.led(8, 10)
sleep(sleep_speed)
PYGLOW.led(9, 20)
sleep(sleep_speed)
PYGLOW.led(10, 30)
sleep(sleep_speed)
PYGLOW.led(11, 40)
sleep(sleep_speed)
PYGLOW.led(12, 50)
sleep(sleep_speed)
PYGLOW.led(6, 60)
PYGLOW.led(18, 60)
sleep(sleep_speed)
PYGLOW.led(5, 70)
PYGLOW.led(17, 70)
sleep(sleep_speed)
PYGLOW.led(4, 80)
PYGLOW.led(16, 80)
sleep(sleep_speed)
PYGLOW.led(3, 90)
PYGLOW.led(15, 90)
sleep(sleep_speed)
PYGLOW.led(2, 100)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(1, 110)
PYGLOW.led(13, 110)
sleep(sleep_speed)
# Fade Arm 2 O, Y, G, B, W, Arm 1 and 3 W, B, G, Y, O, R
PYGLOW.led(8, 0)
sleep(sleep_speed)
PYGLOW.led(9, 10)
sleep(sleep_speed)
PYGLOW.led(10, 20)
sleep(sleep_speed)
PYGLOW.led(11, 30)
sleep(sleep_speed)
PYGLOW.led(12, 40)
sleep(sleep_speed)
PYGLOW.led(6, 50)
PYGLOW.led(18, 50)
sleep(sleep_speed)
PYGLOW.led(5, 60)
PYGLOW.led(17, 60)
sleep(sleep_speed)
PYGLOW.led(4, 70)
PYGLOW.led(16, 70)
sleep(sleep_speed)
PYGLOW.led(3, 80)
PYGLOW.led(15, 80)
sleep(sleep_speed)
PYGLOW.led(2, 90)
PYGLOW.led(14, 90)
sleep(sleep_speed)
PYGLOW.led(1, 100)
PYGLOW.led(13, 100)
sleep(sleep_speed)
# Fade Arm 2 Y, G, B, W, Arm 1 and 3 W, B, G, Y, O, R
PYGLOW.led(9, 0)
sleep(sleep_speed)
PYGLOW.led(10, 10)
sleep(sleep_speed)
PYGLOW.led(11, 20)
sleep(sleep_speed)
PYGLOW.led(12, 30)
sleep(sleep_speed)
PYGLOW.led(6, 40)
PYGLOW.led(18, 40)
sleep(sleep_speed)
PYGLOW.led(5, 50)
PYGLOW.led(17, 50)
sleep(sleep_speed)
PYGLOW.led(4, 60)
PYGLOW.led(16, 60)
sleep(sleep_speed)
PYGLOW.led(3, 70)
PYGLOW.led(15, 70)
sleep(sleep_speed)
PYGLOW.led(2, 80)
PYGLOW.led(14, 80)
sleep(sleep_speed)
PYGLOW.led(1, 90)
PYGLOW.led(13, 90)
sleep(sleep_speed)
# Fade Arm 2 G, B, W, Arm 1 and 3 W, B, G, Y, O, R
PYGLOW.led(10, 0)
sleep(sleep_speed)
PYGLOW.led(11, 10)
sleep(sleep_speed)
PYGLOW.led(12, 20)
sleep(sleep_speed)
PYGLOW.led(6, 30)
PYGLOW.led(18, 30)
sleep(sleep_speed)
PYGLOW.led(5, 40)
PYGLOW.led(17, 40)
sleep(sleep_speed)
PYGLOW.led(4, 50)
PYGLOW.led(16, 50)
sleep(sleep_speed)
PYGLOW.led(3, 60)
PYGLOW.led(15, 60)
sleep(sleep_speed)
PYGLOW.led(2, 70)
PYGLOW.led(14, 70)
sleep(sleep_speed)
PYGLOW.led(1, 80)
PYGLOW.led(13, 80)
sleep(sleep_speed)
# Fade Arm 2 B, W, Arm 1 and 3 W, B, G, Y, O, R
PYGLOW.led(11, 0)
sleep(sleep_speed)
PYGLOW.led(12, 10)
sleep(sleep_speed)
PYGLOW.led(6, 20)
PYGLOW.led(18, 20)
sleep(sleep_speed)
PYGLOW.led(5, 30)
PYGLOW.led(17, 30)
sleep(sleep_speed)
PYGLOW.led(4, 40)
PYGLOW.led(16, 40)
sleep(sleep_speed)
PYGLOW.led(3, 50)
PYGLOW.led(15, 50)
sleep(sleep_speed)
PYGLOW.led(2, 60)
PYGLOW.led(14, 60)
sleep(sleep_speed)
PYGLOW.led(1, 70)
PYGLOW.led(13, 70)
sleep(sleep_speed)
# Fade Arm 2 W, Arm 1 and 3 W, B, G, Y, O, R
PYGLOW.led(12, 0)
sleep(sleep_speed)
PYGLOW.led(6, 10)
PYGLOW.led(18, 10)
sleep(sleep_speed)
PYGLOW.led(5, 20)
PYGLOW.led(17, 20)
sleep(sleep_speed)
PYGLOW.led(4, 30)
PYGLOW.led(16, 30)
sleep(sleep_speed)
PYGLOW.led(3, 40)
PYGLOW.led(15, 40)
sleep(sleep_speed)
PYGLOW.led(2, 50)
PYGLOW.led(14, 50)
sleep(sleep_speed)
PYGLOW.led(1, 60)
PYGLOW.led(13, 60)
sleep(sleep_speed)
# Fade Arm 1 and 3 W, B, G, Y, O, R
PYGLOW.led(6, 0)
PYGLOW.led(18, 0)
sleep(sleep_speed)
PYGLOW.led(5, 10)
PYGLOW.led(17, 10)
sleep(sleep_speed)
PYGLOW.led(4, 20)
PYGLOW.led(16, 20)
sleep(sleep_speed)
PYGLOW.led(3, 30)
PYGLOW.led(15, 30)
sleep(sleep_speed)
PYGLOW.led(2, 40)
PYGLOW.led(14, 40)
sleep(sleep_speed)
PYGLOW.led(1, 50)
PYGLOW.led(13, 50)
sleep(sleep_speed)
# Fade Arm 1 and 3 B, G, Y, O, R
PYGLOW.led(5, 0)
PYGLOW.led(17, 0)
sleep(sleep_speed)
PYGLOW.led(4, 10)
PYGLOW.led(16, 10)
sleep(sleep_speed)
PYGLOW.led(3, 20)
PYGLOW.led(15, 20)
sleep(sleep_speed)
PYGLOW.led(2, 30)
PYGLOW.led(14, 30)
sleep(sleep_speed)
PYGLOW.led(1, 40)
PYGLOW.led(13, 40)
sleep(sleep_speed)
# Fade Arm 1 and 3 G, Y, O, R
PYGLOW.led(4, 0)
PYGLOW.led(16, 0)
sleep(sleep_speed)
PYGLOW.led(3, 10)
PYGLOW.led(15, 10)
sleep(sleep_speed)
PYGLOW.led(2, 20)
PYGLOW.led(14, 20)
sleep(sleep_speed)
PYGLOW.led(1, 30)
PYGLOW.led(13, 30)
sleep(sleep_speed)
# Fade Arm 1 and 3 Y, O, R
PYGLOW.led(3, 0)
PYGLOW.led(15, 0)
sleep(sleep_speed)
PYGLOW.led(2, 10)
PYGLOW.led(14, 10)
sleep(sleep_speed)
PYGLOW.led(1, 20)
PYGLOW.led(13, 20)
sleep(sleep_speed)
# Fade Arm 1 and 3 O, R
PYGLOW.led(2, 0)
PYGLOW.led(14, 0)
sleep(sleep_speed)
PYGLOW.led(1, 10)
PYGLOW.led(13, 10)
sleep(sleep_speed)
# Fade Arm 1 and 3 R
PYGLOW.led(1, 0)
PYGLOW.led(13, 0)
sleep(sleep_speed)
def fading_fuse_3(sleep_speed):
"""
Lights up arm 3, then 1 and 2. LEDs fade out.
"""
LOGGER.debug("Fading Fuse 3")
# Turn on Arm 3 Red
PYGLOW.led(13, 120)
sleep(sleep_speed)
# Turn on Arm 3 Orange
PYGLOW.led(14, 120)
sleep(sleep_speed)
# Fade Arm 3 Red
PYGLOW.led(13, 110)
sleep(sleep_speed)
# Turn on Arm 3 Yellow
PYGLOW.led(15, 120)
sleep(sleep_speed)
# Fade Arm 3 R, O
PYGLOW.led(13, 100)
sleep(sleep_speed)
PYGLOW.led(14, 110)
sleep(sleep_speed)
# Turn on Arm 3 Green
PYGLOW.led(16, 120)
sleep(sleep_speed)
# Fade Arm 3 R, O, Y
PYGLOW.led(13, 90)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(15, 110)
sleep(sleep_speed)
# Turn on Arm 3 Blue
PYGLOW.led(17, 120)
sleep(sleep_speed)
# Fade Arm 3 R, O, Y, G
PYGLOW.led(13, 80)
sleep(sleep_speed)
PYGLOW.led(14, 90)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(16, 110)
sleep(sleep_speed)
# Arm 3, White
PYGLOW.led(18, 120)
sleep(sleep_speed)
# Fade Arm 3 R, O, Y, G, B
PYGLOW.led(13, 70)
sleep(sleep_speed)
PYGLOW.led(14, 80)
sleep(sleep_speed)
PYGLOW.led(15, 90)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(17, 110)
sleep(sleep_speed)
# Turn on Arm 1 and 2 White
PYGLOW.led(6, 120)
PYGLOW.led(12, 120)
sleep(sleep_speed)
# Fade Arm 3 R, O, Y, G, B, W
PYGLOW.led(13, 60)
sleep(sleep_speed)
PYGLOW.led(14, 70)
sleep(sleep_speed)
PYGLOW.led(15, 80)
sleep(sleep_speed)
PYGLOW.led(16, 90)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(18, 110)
sleep(sleep_speed)
# Turn on Arm 1 and 2 Blue
PYGLOW.led(5, 120)
PYGLOW.led(11, 120)
sleep(sleep_speed)
# Fade Arm 3 R, O, Y, G, B, W, Arm 1 and 2 W
PYGLOW.led(13, 50)
sleep(sleep_speed)
PYGLOW.led(14, 60)
sleep(sleep_speed)
PYGLOW.led(15, 70)
sleep(sleep_speed)
PYGLOW.led(16, 80)
sleep(sleep_speed)
PYGLOW.led(17, 90)
sleep(sleep_speed)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(6, 110)
PYGLOW.led(12, 110)
sleep(sleep_speed)
# Arm 1 and 2, Green
PYGLOW.led(4, 120)
PYGLOW.led(10, 120)
sleep(sleep_speed)
# Fade Arm 3 R, O, Y, G, B, W, Arm 1 and 2 W, B
PYGLOW.led(13, 40)
sleep(sleep_speed)
PYGLOW.led(14, 50)
sleep(sleep_speed)
PYGLOW.led(15, 60)
sleep(sleep_speed)
PYGLOW.led(16, 70)
sleep(sleep_speed)
PYGLOW.led(17, 80)
sleep(sleep_speed)
PYGLOW.led(18, 90)
sleep(sleep_speed)
PYGLOW.led(6, 100)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(5, 110)
PYGLOW.led(11, 110)
sleep(sleep_speed)
# Turn on Arm 1 and 2 Yellow
PYGLOW.led(3, 120)
PYGLOW.led(9, 120)
sleep(sleep_speed)
# Fade Arm 3 R, O, Y, G, B, W, Arm 1 and 2 W, B, G
PYGLOW.led(13, 30)
sleep(sleep_speed)
PYGLOW.led(14, 40)
sleep(sleep_speed)
PYGLOW.led(15, 50)
sleep(sleep_speed)
PYGLOW.led(16, 60)
sleep(sleep_speed)
PYGLOW.led(17, 70)
sleep(sleep_speed)
PYGLOW.led(18, 80)
sleep(sleep_speed)
PYGLOW.led(6, 90)
PYGLOW.led(12, 90)
sleep(sleep_speed)
PYGLOW.led(5, 100)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(4, 110)
PYGLOW.led(10, 110)
sleep(sleep_speed)
# Turn on Arm 1 and 2 Orange
PYGLOW.led(2, 120)
PYGLOW.led(8, 120)
sleep(sleep_speed)
# Fade Arm 3 R, O, Y, G, B, W, Arm 1 and 2 W, B, G, Y
PYGLOW.led(13, 20)
sleep(sleep_speed)
PYGLOW.led(14, 30)
sleep(sleep_speed)
PYGLOW.led(15, 40)
sleep(sleep_speed)
PYGLOW.led(16, 50)
sleep(sleep_speed)
PYGLOW.led(17, 60)
sleep(sleep_speed)
PYGLOW.led(18, 70)
sleep(sleep_speed)
PYGLOW.led(6, 80)
PYGLOW.led(12, 80)
sleep(sleep_speed)
PYGLOW.led(5, 90)
PYGLOW.led(11, 90)
sleep(sleep_speed)
PYGLOW.led(4, 100)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(3, 110)
PYGLOW.led(9, 110)
sleep(sleep_speed)
# Turn on Arm 1 and 2, Red
PYGLOW.led(1, 120)
PYGLOW.led(7, 120)
sleep(sleep_speed)
# Fade Arm 3 R, O, Y, G, B, W, Arm 1 and 2 W, B, G, Y, O
PYGLOW.led(13, 10)
sleep(sleep_speed)
PYGLOW.led(14, 20)
sleep(sleep_speed)
PYGLOW.led(15, 30)
sleep(sleep_speed)
PYGLOW.led(16, 40)
sleep(sleep_speed)
PYGLOW.led(17, 50)
sleep(sleep_speed)
PYGLOW.led(18, 60)
sleep(sleep_speed)
PYGLOW.led(6, 70)
PYGLOW.led(12, 70)
sleep(sleep_speed)
PYGLOW.led(5, 80)
PYGLOW.led(11, 80)
sleep(sleep_speed)
PYGLOW.led(4, 90)
PYGLOW.led(10, 90)
sleep(sleep_speed)
PYGLOW.led(3, 100)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(2, 110)
PYGLOW.led(8, 110)
sleep(sleep_speed)
# Fade Arm 3 R, O, Y, G, B, W, Arm 1 and 2 W, B, G, Y, O, R
PYGLOW.led(13, 0)
sleep(sleep_speed)
PYGLOW.led(14, 10)
sleep(sleep_speed)
PYGLOW.led(15, 20)
sleep(sleep_speed)
PYGLOW.led(16, 30)
sleep(sleep_speed)
PYGLOW.led(17, 40)
sleep(sleep_speed)
PYGLOW.led(18, 50)
sleep(sleep_speed)
PYGLOW.led(6, 60)
PYGLOW.led(12, 60)
sleep(sleep_speed)
PYGLOW.led(5, 70)
PYGLOW.led(11, 70)
sleep(sleep_speed)
PYGLOW.led(4, 80)
PYGLOW.led(10, 80)
sleep(sleep_speed)
PYGLOW.led(3, 90)
PYGLOW.led(9, 90)
sleep(sleep_speed)
PYGLOW.led(2, 100)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(1, 110)
PYGLOW.led(7, 110)
sleep(sleep_speed)
# Fade Arm 3 O, Y, G, B, W, Arm 1 and 2 W, B, G, Y, O, R
PYGLOW.led(14, 0)
sleep(sleep_speed)
PYGLOW.led(15, 10)
sleep(sleep_speed)
PYGLOW.led(16, 20)
sleep(sleep_speed)
PYGLOW.led(17, 30)
sleep(sleep_speed)
PYGLOW.led(18, 40)
sleep(sleep_speed)
PYGLOW.led(6, 50)
PYGLOW.led(12, 50)
sleep(sleep_speed)
PYGLOW.led(5, 60)
PYGLOW.led(11, 60)
sleep(sleep_speed)
PYGLOW.led(4, 70)
PYGLOW.led(10, 70)
sleep(sleep_speed)
PYGLOW.led(3, 80)
PYGLOW.led(9, 80)
sleep(sleep_speed)
PYGLOW.led(2, 90)
PYGLOW.led(8, 90)
sleep(sleep_speed)
PYGLOW.led(1, 100)
PYGLOW.led(7, 100)
sleep(sleep_speed)
# Fade Arm 3 Y, G, B, W, Arm 1 and 2 W, B, G, Y, O, R
PYGLOW.led(15, 0)
sleep(sleep_speed)
PYGLOW.led(16, 10)
sleep(sleep_speed)
PYGLOW.led(17, 20)
sleep(sleep_speed)
PYGLOW.led(18, 30)
sleep(sleep_speed)
PYGLOW.led(6, 40)
PYGLOW.led(12, 40)
sleep(sleep_speed)
PYGLOW.led(5, 50)
PYGLOW.led(11, 50)
sleep(sleep_speed)
PYGLOW.led(4, 60)
PYGLOW.led(10, 60)
sleep(sleep_speed)
PYGLOW.led(3, 70)
PYGLOW.led(9, 70)
sleep(sleep_speed)
PYGLOW.led(2, 80)
PYGLOW.led(8, 80)
sleep(sleep_speed)
PYGLOW.led(1, 90)
PYGLOW.led(7, 90)
sleep(sleep_speed)
# Fade Arm 3 G, B, W, Arm 1 and 2 W, B, G, Y, O, R
PYGLOW.led(16, 0)
sleep(sleep_speed)
PYGLOW.led(17, 10)
sleep(sleep_speed)
PYGLOW.led(18, 20)
sleep(sleep_speed)
PYGLOW.led(6, 30)
PYGLOW.led(12, 30)
sleep(sleep_speed)
PYGLOW.led(5, 40)
PYGLOW.led(11, 40)
sleep(sleep_speed)
PYGLOW.led(4, 50)
PYGLOW.led(10, 50)
sleep(sleep_speed)
PYGLOW.led(3, 60)
PYGLOW.led(9, 60)
sleep(sleep_speed)
PYGLOW.led(2, 70)
PYGLOW.led(8, 70)
sleep(sleep_speed)
PYGLOW.led(1, 80)
PYGLOW.led(7, 80)
sleep(sleep_speed)
# Fade Arm 3 B, W, Arm 1 and 2 W, B, G, Y, O, R
PYGLOW.led(17, 0)
sleep(sleep_speed)
PYGLOW.led(18, 10)
sleep(sleep_speed)
PYGLOW.led(6, 20)
PYGLOW.led(12, 20)
sleep(sleep_speed)
PYGLOW.led(5, 30)
PYGLOW.led(11, 30)
sleep(sleep_speed)
PYGLOW.led(4, 40)
PYGLOW.led(10, 40)
sleep(sleep_speed)
PYGLOW.led(3, 50)
PYGLOW.led(9, 50)
sleep(sleep_speed)
PYGLOW.led(2, 60)
PYGLOW.led(8, 60)
sleep(sleep_speed)
PYGLOW.led(1, 70)
PYGLOW.led(7, 70)
sleep(sleep_speed)
# Fade Arm 3 W, Arm 1 and 2 W, B, G, Y, O, R
PYGLOW.led(18, 0)
sleep(sleep_speed)
PYGLOW.led(6, 10)
PYGLOW.led(12, 10)
sleep(sleep_speed)
PYGLOW.led(5, 20)
PYGLOW.led(11, 20)
sleep(sleep_speed)
PYGLOW.led(4, 30)
PYGLOW.led(10, 30)
sleep(sleep_speed)
PYGLOW.led(3, 40)
PYGLOW.led(9, 40)
sleep(sleep_speed)
PYGLOW.led(2, 50)
PYGLOW.led(8, 50)
sleep(sleep_speed)
PYGLOW.led(1, 60)
PYGLOW.led(7, 60)
sleep(sleep_speed)
# Fade Arm 1 and 2 W, B, G, Y, O, R
PYGLOW.led(6, 0)
PYGLOW.led(12, 0)
sleep(sleep_speed)
PYGLOW.led(5, 10)
PYGLOW.led(11, 10)
sleep(sleep_speed)
PYGLOW.led(4, 20)
PYGLOW.led(10, 20)
sleep(sleep_speed)
PYGLOW.led(3, 30)
PYGLOW.led(9, 30)
sleep(sleep_speed)
PYGLOW.led(2, 40)
PYGLOW.led(8, 40)
sleep(sleep_speed)
PYGLOW.led(1, 50)
PYGLOW.led(7, 50)
sleep(sleep_speed)
# Fade Arm 1 and 2 B, G, Y, O, R
PYGLOW.led(5, 0)
PYGLOW.led(11, 0)
sleep(sleep_speed)
PYGLOW.led(4, 10)
PYGLOW.led(10, 10)
sleep(sleep_speed)
PYGLOW.led(3, 20)
PYGLOW.led(9, 20)
sleep(sleep_speed)
PYGLOW.led(2, 30)
PYGLOW.led(8, 30)
sleep(sleep_speed)
PYGLOW.led(1, 40)
PYGLOW.led(7, 40)
sleep(sleep_speed)
# Fade Arm 1 and 2 G, Y, O, R
PYGLOW.led(4, 0)
PYGLOW.led(10, 0)
sleep(sleep_speed)
PYGLOW.led(3, 10)
PYGLOW.led(9, 10)
sleep(sleep_speed)
PYGLOW.led(2, 20)
PYGLOW.led(8, 20)
sleep(sleep_speed)
PYGLOW.led(1, 30)
PYGLOW.led(7, 30)
sleep(sleep_speed)
# Fade Arm 1 and 2 Y, O, R
PYGLOW.led(3, 0)
PYGLOW.led(9, 0)
sleep(sleep_speed)
PYGLOW.led(2, 10)
PYGLOW.led(8, 10)
sleep(sleep_speed)
PYGLOW.led(1, 20)
PYGLOW.led(7, 20)
sleep(sleep_speed)
# Fade Arm 1 and 2 O, R
PYGLOW.led(2, 0)
PYGLOW.led(8, 0)
sleep(sleep_speed)
PYGLOW.led(1, 10)
PYGLOW.led(7, 10)
sleep(sleep_speed)
# Fade Arm 1 and 2 R
PYGLOW.led(1, 0)
PYGLOW.led(7, 0)
sleep(sleep_speed)
def go_faster():
"""
Sleep_speed is 0.01. Cycle through the LEDs 10 times
"""
LOGGER.debug("Going faster...")
sleep_speed = 0.01
# Start counter at 1, end at 10, increment by 1
for i in range(1, 11, 1):
LOGGER.debug("counter = %s", i)
fading_fuse_1(sleep_speed)
fading_fuse_2(sleep_speed)
fading_fuse_3(sleep_speed)
def main():
"""
The main function
"""
LOGGER.debug("START")
go_faster()
LOGGER.debug("END")
delete_empty_logs(LOG)
stop()
if __name__ == '__main__':
try:
# STEP01: Check if Log directory exists.
check_log_directory()
# STEP02: Enable logging
LOG = 'Logs/fading_fuse.log'
LOG_FORMAT = '%(asctime)s %(name)s: %(funcName)s: \
%(levelname)s: %(message)s'
LOGGER = logging.getLogger(__name__)
# Nothing will log unless logging level is changed to DEBUG
LOGGER.setLevel(logging.ERROR)
FORMATTER = logging.Formatter(fmt=LOG_FORMAT,
datefmt='%m/%d/%y %I:%M:%S %p:')
FILE_HANDLER = logging.FileHandler(LOG, 'w')
FILE_HANDLER.setFormatter(FORMATTER)
LOGGER.addHandler(FILE_HANDLER)
# STEP03: Print header
print_header()
# STEP04: Print instructions in white text
print("\033[1;37;40mPress Ctrl-C to stop the program.")
# STEP05: Run the main function
main()
except KeyboardInterrupt:
delete_empty_logs(LOG)
stop()
| 25.341584 | 73 | 0.567773 | 5,696 | 35,833 | 3.477001 | 0.032654 | 0.31901 | 0.354456 | 0.384903 | 0.930927 | 0.907599 | 0.902045 | 0.848523 | 0.720677 | 0.556425 | 0 | 0.112623 | 0.292552 | 35,833 | 1,413 | 74 | 25.359519 | 0.668639 | 0.136104 | 0 | 0.963085 | 0 | 0 | 0.00588 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.004102 | false | 0 | 0.005742 | 0 | 0.009844 | 0.002461 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
2707abb78005a8d1843d32cdcd615d2e4b4c286f | 113 | py | Python | RobinhoodScripts/autohood/login.py | JGFahey/Autohood | ea6cca4fcf3407cbb461ccaae86018023c41ee05 | [
"MIT"
] | null | null | null | RobinhoodScripts/autohood/login.py | JGFahey/Autohood | ea6cca4fcf3407cbb461ccaae86018023c41ee05 | [
"MIT"
] | null | null | null | RobinhoodScripts/autohood/login.py | JGFahey/Autohood | ea6cca4fcf3407cbb461ccaae86018023c41ee05 | [
"MIT"
] | null | null | null | import robin_stocks as robinhood
def login(username, password):
login = robinhood.login(username, password)
| 22.6 | 47 | 0.778761 | 14 | 113 | 6.214286 | 0.642857 | 0.298851 | 0.482759 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141593 | 113 | 4 | 48 | 28.25 | 0.896907 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.666667 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 8 |
27344e51d270dd839a657f59d88fd6f27bc58091 | 1,139 | py | Python | Argument Extraction Program and Results/DenseTfidfVectorizer.py | HCDLab/ArguLens | a0c85b3ab158ad005d358ce6e2101f4dfb5a7b81 | [
"MIT"
] | 2 | 2020-07-02T19:14:55.000Z | 2021-01-12T14:24:20.000Z | Argument Extraction Program and Results/DenseTfidfVectorizer.py | HCDLab/ArguLens | a0c85b3ab158ad005d358ce6e2101f4dfb5a7b81 | [
"MIT"
] | null | null | null | Argument Extraction Program and Results/DenseTfidfVectorizer.py | HCDLab/ArguLens | a0c85b3ab158ad005d358ce6e2101f4dfb5a7b81 | [
"MIT"
] | null | null | null | from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_extraction.text import CountVectorizer
import pandas as pd
class DenseTfidfVectorizer(TfidfVectorizer):
def transform(self, raw_documents, copy=True):
X = super().transform(raw_documents, copy=copy)
df = pd.DataFrame(X.toarray(), columns=self.get_feature_names(), index=raw_documents.index)
return df
def fit_transform(self, raw_documents, y=None):
X = super().fit_transform(raw_documents, y=y)
df = pd.DataFrame(X.toarray(), columns=self.get_feature_names(), index=raw_documents.index)
return df
class DenseCountVectorizer(CountVectorizer):
def transform(self, raw_documents, copy=True):
X = super().transform(raw_documents, copy=copy)
df = pd.DataFrame(X.toarray(), columns=self.get_feature_names(), index=raw_documents.index)
return df
def fit_transform(self, raw_documents, y=None):
X = super().fit_transform(raw_documents, y=y)
df = pd.DataFrame(X.toarray(), columns=self.get_feature_names(), index=raw_documents.index)
return df
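# Editor's example, not part of the original module: these wrappers reuse
# `raw_documents.index` for the returned DataFrame, so they expect a pandas
# Series of raw text. A minimal, hypothetical usage:
if __name__ == '__main__':
    docs = pd.Series(["the cat sat", "the dog barked"], index=[10, 11])
    tfidf = DenseTfidfVectorizer()
    features = tfidf.fit_transform(docs)
    # rows are indexed 10 and 11, columns are the learned vocabulary terms
    print(features)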
| 39.275862 | 99 | 0.714662 | 148 | 1,139 | 5.324324 | 0.22973 | 0.182741 | 0.081218 | 0.126904 | 0.840102 | 0.840102 | 0.743655 | 0.743655 | 0.743655 | 0.743655 | 0 | 0 | 0.174715 | 1,139 | 28 | 100 | 40.678571 | 0.838298 | 0 | 0 | 0.761905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.190476 | false | 0 | 0.142857 | 0 | 0.619048 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fdbf694529f2498090f36d880579fca123f9cce1 | 11,694 | py | Python | sgtpy/equilibrium/hazt.py | MatKie/SGTPy | 8e98d92fedd2b07d834e547e5154ec8f70d80728 | [
"MIT"
] | 12 | 2020-12-27T17:04:33.000Z | 2021-07-19T06:28:28.000Z | sgtpy/equilibrium/hazt.py | MatKie/SGTPy | 8e98d92fedd2b07d834e547e5154ec8f70d80728 | [
"MIT"
] | 2 | 2021-05-15T14:27:57.000Z | 2021-08-19T15:42:24.000Z | sgtpy/equilibrium/hazt.py | MatKie/SGTPy | 8e98d92fedd2b07d834e547e5154ec8f70d80728 | [
"MIT"
] | 5 | 2021-02-21T01:33:29.000Z | 2021-07-26T15:11:08.000Z | from __future__ import division, print_function, absolute_import
import numpy as np
from scipy.optimize import fsolve, root
from .multiflash import multiflash
from .equilibriumresult import EquilibriumResult
from warnings import warn
def haz_objb(inc, T_P, type, model, index, equilibrium):
X0 = inc[:-1].reshape(3, 2)
P_T = inc[-1]
if type == 'T':
P = P_T
temp_aux = T_P
elif type == 'P':
T = P_T
temp_aux = model.temperature_aux(T)
P = T_P
nc = model.nc
X = np.zeros((3, nc))
X[:, index] = X0
lnphi = np.zeros_like(X)
global vg, Xassg
for i, state in enumerate(equilibrium):
out = model.logfugef_aux(X[i], temp_aux, P, state, vg[i], Xassg[i])
lnphi[i], vg[i], Xassg[i] = out
lnK = lnphi[0] - lnphi[1:]
K = np.exp(lnK)
return np.hstack([(K[:, index]*X0[0]-X0[1:]).flatten(), X0.sum(axis=1)-1.])
def haz_objt(inc, temp_aux, P, model):
X, W, Y = np.split(inc, 3)
global vg, Xassg
fugX, vg[0], Xassg[0] = model.logfugef_aux(X, temp_aux, P, 'L', vg[0],
Xassg[0])
fugW, vg[1], Xassg[1] = model.logfugef_aux(W, temp_aux, P, 'L', vg[1],
Xassg[1])
fugY, vg[2], Xassg[2] = model.logfugef_aux(Y, temp_aux, P, 'V', vg[2],
Xassg[2])
K1 = np.exp(fugX-fugY)
K2 = np.exp(fugX-fugW)
return np.hstack([K1*X-Y, K2*X-W, X.sum()-1, Y.sum()-1, W.sum()-1])
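# Editor's note (a reading of the two objective functions above, not original
# documentation): both solve the isofugacity conditions of a three-phase
# flash. With x (liquid 1), w (liquid 2) and y (vapour), the residuals driven
# to zero are K1_i*x_i - y_i, K2_i*x_i - w_i and the closure equations
# sum(x) - 1, sum(w) - 1, sum(y) - 1, where ln K1 = ln phi_x - ln phi_y and
# ln K2 = ln phi_x - ln phi_w come from logfugef_aux. haz_objb is the binary
# variant that additionally treats T or P as an unknown.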
def haz(X0, W0, Y0, T, P, model, good_initial=False, v0=[None, None, None],
Xass0=[None, None, None], K_tol=1e-10, nacc=5, full_output=False):
"""
Liquid liquid vapor (T,P) -> (x, w, y)
Computes the liquid-liquid-vapor equilibrium of multicomponent mixtures at
the given temperature and pressure. This function uses a simultaneous method
to check stability and equilibrium; when slow convergence is noted,
minimization of the Gibbs free energy is performed with BFGS.
Parameters
----------
X0 : array_like
guess composition of liquid 1
W0 : array_like
guess composition of liquid 2
Y0 : array_like
guess composition of vapour 1
T : float
absolute temperature [K].
P : float
pressure [Pa]
model : object
created from mixture and saftvrmie function
good_initial: bool, optional
if True, skips Gupta's method and solves the full system of equations.
v0 : list, optional
if supplied volume used as initial value to compute fugacities
K_tol : float, optional
Desired accuracy of K (= X/Xr) vector
nacc : int, optional
number of accelerated successive substitution cycles to perform
full_output: bool, optional
whether to output all calculation info
Returns
-------
X : array_like
liquid1 mole fraction vector
W : array_like
liquid2 mole fraction vector
Y : array_like
vapour mole fraction vector
"""
nc = model.nc
if len(X0) != nc or len(W0) != nc or len(Y0) != nc:
raise Exception('Composition vector length must be equal to nc')
Z0 = (X0+Y0+W0)/3
nonzero = np.count_nonzero(Z0)
x0 = np.vstack([X0, W0, Y0])
b0 = np.array([0.33, 0.33, 0.33, 0., 0.])
global vg, Xassg
vg = v0.copy()
Xassg = Xass0.copy()
# check for binary mixture
if nonzero == 2:
warn('Global mixture is a binary mixture, updating temperature')
index = np.nonzero(Z0)[0]
sol = np.zeros_like(x0)
inc0 = np.hstack([x0[:, index].flatten(), T])
sol1 = root(haz_objb, inc0, args=(P, 'P', model, index, 'LLV'))
T = sol1.x[-1]
Xs = sol1.x[:-1].reshape(3, 2)
sol[:, index] = Xs
X, W, Y = sol
if full_output:
temp_aux = model.temperature_aux(T)
for i, state in enumerate(['L', 'L', 'V']):
rho, Xassg[i] = model.density_aux(sol[i], temp_aux, P, state,
rho0=1/vg[i], Xass0=Xassg[i])
vg[i] = 1./rho
info = {'T': T, 'P': P, 'X': sol, 'v': vg, 'Xass': Xassg,
'states': ['L', 'L', 'V'], 'success': sol1.success}
out = EquilibriumResult(info)
return out
return X, W, Y, T
if not good_initial:
out = multiflash(x0, b0, ['L', 'L', 'V'], Z0, T, P, model, vg, Xassg,
K_tol, nacc, True)
else:
temp_aux = model.temperature_aux(T)
sol = fsolve(haz_objt, x0.flatten(), args=(temp_aux, P, model))
x0 = sol.reshape([model.nc, 3])
Z0 = x0.mean(axis=0)
out = multiflash(x0, b0, ['L', 'L', 'V'], Z0, T, P, model,
vg, Xassg, K_tol, nacc, True)
Xm, beta, tetha, equilibrio = out.X, out.beta, out.tetha, out.states
error_inner = out.error_inner
v = out.v
Xass = out.Xass
if error_inner > 1e-6:
order = [2, 0, 1] # Y, X, W
Xm = Xm[order]
betatetha = np.hstack([beta[order], tetha])
equilibrio = np.asarray(equilibrio)[order]
v0 = np.asarray(v)[order]
Xass0 = np.asarray(out.Xass)[order]
out = multiflash(Xm, betatetha, equilibrio, Z0, T, P, model, v0, Xass0,
K_tol, nacc, full_output=True)
order = [1, 2, 0]
Xm, beta, tetha, equilibrio = out.X, out.beta, out.tetha, out.states
error_inner = out.error_inner
if error_inner > 1e-6:
order = [2, 1, 0] # W, X, Y
Xm = Xm[order]
betatetha = np.hstack([beta[order], tetha])
equilibrio = np.asarray(equilibrio)[order]
v0 = np.asarray(out.v)[order]
Xass0 = np.asarray(out.Xass)[order]
out = multiflash(Xm, betatetha, equilibrio, Z0, T, P, model, v0,
Xass0, K_tol, nacc, full_output=True)
order = [1, 0, 2]
Xm, beta, tetha = out.X, out.beta, out.tetha
equilibrio = out.states
error_inner = out.error_inner
Xm = Xm[order]
beta = beta[order]
tetha = np.hstack([0., tetha])
tetha = tetha[order]
v = (out.v)[order]
Xass = np.asarray(out.Xass)[order]
else:
tetha = np.hstack([0., tetha])
if full_output:
info = {'T': T, 'P': P, 'error_outer': out.error_outer,
'error_inner': error_inner, 'iter': out.iter,
'beta': beta, 'tetha': tetha, 'X': Xm, 'v': v,
'Xass': Xass, 'states': ['L', 'L', 'V']}
out = EquilibriumResult(info)
return out
tethainestable = tetha > 0.
Xm[tethainestable] = None
X, W, Y = Xm
return X, W, Y
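# Illustrative usage sketch (not part of the original module). It assumes
# `model` is a SAFT-VR-Mie mixture object created elsewhere, as described in
# the docstring of `haz` above; the compositions, temperature and pressure
# below are arbitrary placeholders for a ternary mixture.
def _example_haz_usage(model):
    X0 = np.array([0.90, 0.05, 0.05])   # initial guess for liquid 1
    W0 = np.array([0.05, 0.90, 0.05])   # initial guess for liquid 2
    Y0 = np.array([0.30, 0.30, 0.40])   # initial guess for vapour
    T = 350.        # K
    P = 1.01325e5   # Pa
    # full_output=True returns an EquilibriumResult holding T, P, beta,
    # tetha, X, v, Xass and the phase states (see the info dict above).
    return haz(X0, W0, Y0, T, P, model, full_output=True)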
def vlle(X0, W0, Y0, Z, T, P, model, v0=[None, None, None],
Xass0=[None, None, None], K_tol=1e-10, nacc=5, full_output=False):
"""
Liquid liquid vapor Multiflash (Z, T, P) -> (x, w, y)
    Computes liquid-liquid-vapor equilibrium of multicomponent mixtures for a
    given global composition, temperature and pressure. This function uses a
    simultaneous method to check stability and equilibrium; when slow
    convergence is detected, minimization of the Gibbs free energy is
    performed with BFGS.
Parameters
----------
X0 : array_like
guess composition of liquid 1
W0 : array_like
guess composition of liquid 2
Y0 : array_like
        guess composition of vapour 1
    Z : array_like
        overall (global) composition of the mixture
    T : float
        absolute temperature [K].
    P : float
        pressure [Pa]
    model : object
        created from mixture and saftvrmie function
v0 : list, optional
        if supplied, volumes are used as initial values to compute fugacities
K_tol : float, optional
Desired accuracy of K (= X/Xr) vector
nacc : int, optional
number of accelerated successive substitution cycles to perform
full_output: bool, optional
        whether to output all calculation info
Returns
-------
X : array_like
liquid1 mole fraction vector
W : array_like
liquid2 mole fraction vector
Y : array_like
        vapour mole fraction vector
"""
nc = model.nc
if len(X0) != nc or len(W0) != nc or len(Y0) != nc:
        raise Exception('Composition vector length must be equal to nc')
nonzero = np.count_nonzero(Z)
x0 = np.vstack([X0, W0, Y0])
b0 = np.array([0.33, 0.33, 0.33, 0., 0.])
# check for binary mixture
if nonzero == 2:
warn('Global mixture is a binary mixture, updating temperature')
index = np.nonzero(Z)[0]
sol = np.zeros_like(x0)
global vg, Xassg
vg = v0.copy()
Xassg = Xass0.copy()
inc0 = np.hstack([x0[:, index].flatten(), T])
sol1 = root(haz_objb, inc0, args=(P, 'P', model, index, 'LLV'))
T = sol1.x[-1]
Xs = sol1.x[:-1].reshape(3, 2)
sol[:, index] = Xs
X, W, Y = sol
if full_output:
temp_aux = model.temperature_aux(T)
for i, state in enumerate(['L', 'L', 'V']):
rho, Xassg[i] = model.density_aux(sol[i], temp_aux, P, state,
rho0=1/vg[i], Xass0=Xassg[i])
vg[i] = 1./rho
info = {'T': T, 'P': P, 'X': sol, 'v': vg, 'Xass': Xassg,
'states': ['L', 'L', 'V'], 'success': sol1.success}
out = EquilibriumResult(info)
return out
return X, W, Y, T
out = multiflash(x0, b0, ['L', 'L', 'V'], Z, T, P, model, v0, Xass0,
K_tol, nacc, True)
Xm, beta, tetha, equilibrio = out.X, out.beta, out.tetha, out.states
error_inner = out.error_inner
v = out.v
Xass = out.Xass
if error_inner > 1e-6:
order = [2, 0, 1] # Y, X, W
Xm = Xm[order]
betatetha = np.hstack([beta[order], tetha])
equilibrio = np.asarray(equilibrio)[order]
v0 = np.asarray(v)[order]
Xass0 = np.asarray(out.Xass)[order]
out = multiflash(Xm, betatetha, equilibrio, Z, T, P, model, v0,
                         Xass0, K_tol, nacc, full_output=True)
order = [1, 2, 0]
Xm, beta, tetha, equilibrio = out.X, out.beta, out.tetha, out.states
error_inner = out.error_inner
if error_inner > 1e-6:
order = [2, 1, 0] # W, X, Y
Xm = Xm[order]
betatetha = np.hstack([beta[order], tetha])
equilibrio = np.asarray(equilibrio)[order]
v0 = np.asarray(out.v)[order]
Xass0 = np.asarray(out.Xass)[order]
out = multiflash(Xm, betatetha, equilibrio, Z, T, P, model, v0,
Xass0, K_tol, nacc, full_output=True)
order = [1, 0, 2]
Xm, beta, tetha = out.X, out.beta, out.tetha
equilibrio = out.states
error_inner = out.error_inner
Xm = Xm[order]
beta = beta[order]
tetha = np.hstack([0., tetha])
tetha = tetha[order]
v = (out.v)[order]
Xass = np.asarray(out.Xass)[order]
else:
tetha = np.hstack([0., tetha])
if full_output:
info = {'T': T, 'P': P, 'error_outer': out.error_outer,
'error_inner': error_inner, 'iter': out.iter, 'beta': beta,
'tetha': tetha, 'X': Xm, 'v': v, 'Xass': Xass,
'states': ['L', 'L', 'V']}
out = EquilibriumResult(info)
return out
tethainestable = tetha > 0.
Xm[tethainestable] = None
X, W, Y = Xm
return X, W, Y
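# Illustrative usage sketch (not part of the original module). Unlike `haz`,
# `vlle` also receives the global composition Z of the mixture. `model` is
# assumed to be a SAFT-VR-Mie mixture object created elsewhere; all numeric
# values below are placeholders.
def _example_vlle_usage(model):
    Z = np.array([0.4, 0.4, 0.2])       # global composition of the mixture
    X0 = np.array([0.90, 0.05, 0.05])   # initial guess for liquid 1
    W0 = np.array([0.05, 0.90, 0.05])   # initial guess for liquid 2
    Y0 = np.array([0.30, 0.30, 0.40])   # initial guess for vapour
    return vlle(X0, W0, Y0, Z, 350., 1.01325e5, model, full_output=True)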
__all__ = ['haz', 'vlle']
| 33.797688 | 79 | 0.543954 | 1,616 | 11,694 | 3.866955 | 0.134901 | 0.006721 | 0.005281 | 0.024004 | 0.837894 | 0.818371 | 0.804289 | 0.801088 | 0.801088 | 0.787486 | 0 | 0.028843 | 0.324012 | 11,694 | 345 | 80 | 33.895652 | 0.76167 | 0.209851 | 0 | 0.713615 | 0 | 0 | 0.043639 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.018779 | false | 0 | 0.028169 | 0 | 0.093897 | 0.004695 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e3127ed21769c07dd8fb039e15da440a6906b569 | 8,749 | py | Python | slack/tests/test_actions.py | drewp/slack-sansio | 3c578bd087073174b1ec31b9a610e889d1fa0449 | [
"MIT"
] | 39 | 2017-08-19T16:58:15.000Z | 2022-03-22T01:00:03.000Z | slack/tests/test_actions.py | drewp/slack-sansio | 3c578bd087073174b1ec31b9a610e889d1fa0449 | [
"MIT"
] | 32 | 2017-08-24T18:14:32.000Z | 2019-07-25T16:57:55.000Z | slack/tests/test_actions.py | drewp/slack-sansio | 3c578bd087073174b1ec31b9a610e889d1fa0449 | [
"MIT"
] | 10 | 2017-08-09T15:56:56.000Z | 2019-10-31T06:24:46.000Z | import pytest
import slack
from slack.actions import Action, Router
from . import data
class TestActions:
def test_from_http(self, slack_action):
act = Action.from_http(slack_action)
assert isinstance(act, slack.actions.Action)
def test_parsing_token(self, slack_action):
slack.actions.Action.from_http(
slack_action, verification_token="supersecuretoken"
)
def test_parsing_team_id(self, slack_action):
slack.actions.Action.from_http(slack_action, team_id="T000AAA0A")
def test_parsing_wrong_token(self, slack_action):
with pytest.raises(slack.exceptions.FailedVerification):
slack.actions.Action.from_http(slack_action, verification_token="xxx")
def test_parsing_wrong_team_id(self, slack_action):
with pytest.raises(slack.exceptions.FailedVerification):
slack.actions.Action.from_http(slack_action, team_id="xxx")
def test_mapping_access(self, slack_action):
act = Action.from_http(slack_action)
assert act["callback_id"] == "test_action"
def test_mapping_delete(self, slack_action):
act = Action.from_http(slack_action)
assert act["callback_id"] == "test_action"
del act["callback_id"]
with pytest.raises(KeyError):
print(act["callback_id"])
def test_mapping_set(self, slack_action):
act = Action.from_http(slack_action)
assert act["callback_id"] == "test_action"
act["callback_id"] = "foo"
assert act["callback_id"] == "foo"
class TestActionRouter:
def test_register(self, action_router):
def handler():
pass
action_router.register("test_action", handler)
assert len(action_router._routes["test_action"]["*"]) == 1
assert action_router._routes["test_action"]["*"][0] is handler
def test_register_name(self, action_router):
def handler():
pass
action_router.register("test_action", handler, name="aaa")
assert len(action_router._routes["test_action"]["aaa"]) == 1
assert action_router._routes["test_action"]["aaa"][0] is handler
def test_multiple_register(self, action_router):
def handler():
pass
def handler_bis():
pass
action_router.register("test_action", handler)
action_router.register("test_action", handler_bis)
assert len(action_router._routes["test_action"]["*"]) == 2
assert action_router._routes["test_action"]["*"][0] is handler
assert action_router._routes["test_action"]["*"][1] is handler_bis
def test_register_block(self, action_router: Router):
def handler():
pass
action_router.register_block_action("test_block_id", handler)
assert len(action_router._routes["test_block_id"]["*"]) == 1
assert action_router._routes["test_block_id"]["*"][0] is handler
def test_register_block_action_id(self, action_router: Router):
def handler():
pass
action_router.register_block_action(
"test_block_id", handler, action_id="test_action_id"
)
assert len(action_router._routes["test_block_id"]["test_action_id"]) == 1
assert action_router._routes["test_block_id"]["test_action_id"][0] is handler
def test_multiple_register_block(self, action_router: Router):
def handler():
pass
def handler_bis():
pass
action_router.register_block_action("test_block_id", handler)
action_router.register_block_action("test_block_id", handler_bis)
assert len(action_router._routes["test_block_id"]["*"]) == 2
assert action_router._routes["test_block_id"]["*"][0] is handler
assert action_router._routes["test_block_id"]["*"][1] is handler_bis
def test_register_dialog_submission(self, action_router: Router):
def handler():
pass
action_router.register_dialog_submission("test_action", handler)
assert len(action_router._routes["test_action"]["*"]) == 1
assert action_router._routes["test_action"]["*"][0] is handler
def test_register_interactive_message(self, action_router: Router):
def handler():
pass
action_router.register_interactive_message("test_action", handler)
assert len(action_router._routes["test_action"]["*"]) == 1
assert action_router._routes["test_action"]["*"][0] is handler
    def test_dispatch(self, slack_action, action_router):
def handler():
pass
act = Action.from_http(slack_action)
action_router.register("test_action", handler)
handlers = list()
for h in action_router.dispatch(act):
handlers.append(h)
assert len(handlers) == 1
assert handlers[0] is handler
def test_no_dispatch(self, slack_action, action_router):
def handler():
pass
act = Action.from_http(slack_action)
action_router.register("xxx", handler)
for h in action_router.dispatch(act):
assert False
@pytest.fixture(params={**data.InteractiveMessage.__members__})
def test_dispatch_details(self, slack_action, action_router):
def handler():
pass
act = Action.from_http(slack_action)
action_router.register("test_action", handler, name="ok")
action_router.register("test_action", handler, name="cancel")
handlers = list()
for h in action_router.dispatch(act):
handlers.append(h)
assert len(handlers) == 1
assert handlers[0] is handler
def test_dispatch_action_specific_registration_methods(
self, slack_action, action_router
):
def handler():
pass
act = Action.from_http(slack_action)
if act["type"] in ("interactive_message", "message_action"):
action_router.register_interactive_message("test_action", handler)
elif act["type"] == "dialog_submission":
action_router.register_dialog_submission("test_action", handler)
elif act["type"] == "block_actions":
action_router.register_block_action("test_block_id", handler)
handlers = list()
for h in action_router.dispatch(act):
handlers.append(h)
assert len(handlers) == 1
assert handlers[0] is handler
def test_dispatch_block_action(self, block_action, action_router):
def handler():
pass
act = Action.from_http(block_action)
action_router.register_block_action("test_block_id", handler)
action_router.register("test_other_block_id", handler)
handlers = list()
for h in action_router.dispatch(act):
handlers.append(h)
assert len(handlers) == 1
assert handlers[0] is handler
def test_dispatch_block_action_with_id(self, block_action, action_router):
def handler():
pass
act = Action.from_http(block_action)
action_router.register_block_action(
"test_block_id", handler, action_id="test_action_id"
)
handlers = list()
for h in action_router.dispatch(act):
handlers.append(h)
assert len(handlers) == 1
assert handlers[0] is handler
def test_multiple_dispatch(self, slack_action, action_router):
def handler():
pass
def handler_bis():
pass
act = Action.from_http(slack_action)
action_router.register("test_action", handler)
action_router.register("test_action", handler_bis)
handlers = list()
for h in action_router.dispatch(act):
handlers.append(h)
assert len(handlers) == 2
assert handlers[0] is handler
assert handlers[1] is handler_bis
def test_multiple_block_dispatch(self, block_action, action_router):
def handler():
pass
def handler_bis():
pass
act = Action.from_http(block_action)
action_router.register_block_action("test_block_id", handler)
action_router.register_block_action("test_block_id", handler_bis)
handlers = list()
for h in action_router.dispatch(act):
handlers.append(h)
assert len(handlers) == 2
assert handlers[0] is handler
assert handlers[1] is handler_bis
def test_dispatch_unhandle_type(self, action_router):
action = {"type": "unhandled_type", "callback_id": "test_action"}
with pytest.raises(slack.actions.UnknownActionType):
for _ in action_router.dispatch(action):
pass
| 33.39313 | 85 | 0.649674 | 1,046 | 8,749 | 5.115679 | 0.081262 | 0.154737 | 0.089703 | 0.074005 | 0.841151 | 0.833115 | 0.822089 | 0.769576 | 0.709587 | 0.655392 | 0 | 0.005764 | 0.246428 | 8,749 | 261 | 86 | 33.521073 | 0.805855 | 0 | 0 | 0.646154 | 0 | 0 | 0.096354 | 0 | 0 | 0 | 0 | 0 | 0.205128 | 1 | 0.230769 | false | 0.107692 | 0.020513 | 0 | 0.261538 | 0.005128 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
e340c782ddcb0e0d54fb898c5f47588f4f815717 | 168 | py | Python | word2vec/tests/test_import.py | kyle-w-brown/word2vec | 200d999eab38b0ad50606a473d9381f8eac5ef1d | [
"Apache-2.0"
] | 2,431 | 2015-01-01T12:25:57.000Z | 2022-03-30T07:54:01.000Z | word2vec/tests/test_import.py | kyle-w-brown/word2vec | 200d999eab38b0ad50606a473d9381f8eac5ef1d | [
"Apache-2.0"
] | 54 | 2015-01-19T01:38:36.000Z | 2022-01-15T09:55:43.000Z | word2vec/tests/test_import.py | kyle-w-brown/word2vec | 200d999eab38b0ad50606a473d9381f8eac5ef1d | [
"Apache-2.0"
] | 686 | 2015-01-02T15:18:31.000Z | 2022-03-29T03:01:32.000Z | def test_import():
import word2vec
assert word2vec.__version__ is not None
assert word2vec.__version__ != "0.0.0"
assert len(word2vec.__version__) > 0
| 24 | 43 | 0.708333 | 22 | 168 | 4.818182 | 0.5 | 0.424528 | 0.396226 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.059701 | 0.202381 | 168 | 6 | 44 | 28 | 0.731343 | 0 | 0 | 0 | 0 | 0 | 0.029762 | 0 | 0 | 0 | 0 | 0 | 0.6 | 1 | 0.2 | true | 0 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
e361441331bbae465b9e1b51f2abe39dd54f5a2f | 15,059 | py | Python | data/dataset_video_test.py | NimrodShabtay/KAIR | d744959d75c22b016543dc65c04922ea4b59894f | [
"MIT"
] | 1 | 2022-02-27T03:05:50.000Z | 2022-02-27T03:05:50.000Z | data/dataset_video_test.py | NimrodShabtay/KAIR | d744959d75c22b016543dc65c04922ea4b59894f | [
"MIT"
] | null | null | null | data/dataset_video_test.py | NimrodShabtay/KAIR | d744959d75c22b016543dc65c04922ea4b59894f | [
"MIT"
] | null | null | null | import glob
import torch
from os import path as osp
import torch.utils.data as data
import utils.utils_video as utils_video
class VideoRecurrentTestDataset(data.Dataset):
"""Video test dataset for recurrent architectures, which takes LR video
    frames as input and outputs corresponding HR video frames. Modified from
https://github.com/xinntao/BasicSR/blob/master/basicsr/data/reds_dataset.py
Supported datasets: Vid4, REDS4, REDSofficial.
    More generally, it supports testing datasets with the following structure:
dataroot
├── subfolder1
├── frame000
├── frame001
├── ...
    ├── subfolder2
├── frame000
├── frame001
├── ...
├── ...
For testing datasets, there is no need to prepare LMDB files.
Args:
opt (dict): Config for train dataset. It contains the following keys:
dataroot_gt (str): Data root path for gt.
dataroot_lq (str): Data root path for lq.
io_backend (dict): IO backend type and other kwarg.
cache_data (bool): Whether to cache testing datasets.
name (str): Dataset name.
meta_info_file (str): The path to the file storing the list of test
folders. If not provided, all the folders in the dataroot will
be used.
num_frame (int): Window size for input frames.
padding (str): Padding mode.
"""
def __init__(self, opt):
super(VideoRecurrentTestDataset, self).__init__()
self.opt = opt
self.cache_data = opt['cache_data']
self.gt_root, self.lq_root = opt['dataroot_gt'], opt['dataroot_lq']
self.data_info = {'lq_path': [], 'gt_path': [], 'folder': [], 'idx': [], 'border': []}
self.imgs_lq, self.imgs_gt = {}, {}
if 'meta_info_file' in opt:
with open(opt['meta_info_file'], 'r') as fin:
subfolders = [line.split(' ')[0] for line in fin]
subfolders_lq = [osp.join(self.lq_root, key) for key in subfolders]
subfolders_gt = [osp.join(self.gt_root, key) for key in subfolders]
else:
subfolders_lq = sorted(glob.glob(osp.join(self.lq_root, '*')))
subfolders_gt = sorted(glob.glob(osp.join(self.gt_root, '*')))
for subfolder_lq, subfolder_gt in zip(subfolders_lq, subfolders_gt):
# get frame list for lq and gt
subfolder_name = osp.basename(subfolder_lq)
img_paths_lq = sorted(list(utils_video.scandir(subfolder_lq, full_path=True)))
img_paths_gt = sorted(list(utils_video.scandir(subfolder_gt, full_path=True)))
max_idx = len(img_paths_lq)
assert max_idx == len(img_paths_gt), (f'Different number of images in lq ({max_idx})'
f' and gt folders ({len(img_paths_gt)})')
self.data_info['lq_path'].extend(img_paths_lq)
self.data_info['gt_path'].extend(img_paths_gt)
self.data_info['folder'].extend([subfolder_name] * max_idx)
for i in range(max_idx):
self.data_info['idx'].append(f'{i}/{max_idx}')
border_l = [0] * max_idx
for i in range(self.opt['num_frame'] // 2):
border_l[i] = 1
border_l[max_idx - i - 1] = 1
self.data_info['border'].extend(border_l)
# cache data or save the frame list
if self.cache_data:
print(f'Cache {subfolder_name} for VideoTestDataset...')
self.imgs_lq[subfolder_name] = utils_video.read_img_seq(img_paths_lq)
self.imgs_gt[subfolder_name] = utils_video.read_img_seq(img_paths_gt)
else:
self.imgs_lq[subfolder_name] = img_paths_lq
self.imgs_gt[subfolder_name] = img_paths_gt
# Find unique folder strings
self.folders = sorted(list(set(self.data_info['folder'])))
self.sigma = opt['sigma'] / 255. if 'sigma' in opt else 0 # for non-blind video denoising
def __getitem__(self, index):
folder = self.folders[index]
if self.sigma:
# for non-blind video denoising
if self.cache_data:
imgs_gt = self.imgs_gt[folder]
else:
imgs_gt = utils_video.read_img_seq(self.imgs_gt[folder])
torch.manual_seed(0)
noise_level = torch.ones((1, 1, 1, 1)) * self.sigma
noise = torch.normal(mean=0, std=noise_level.expand_as(imgs_gt))
imgs_lq = imgs_gt + noise
t, _, h, w = imgs_lq.shape
imgs_lq = torch.cat([imgs_lq, noise_level.expand(t, 1, h, w)], 1)
else:
# for video sr and deblurring
if self.cache_data:
imgs_lq = self.imgs_lq[folder]
imgs_gt = self.imgs_gt[folder]
else:
imgs_lq = utils_video.read_img_seq(self.imgs_lq[folder])
imgs_gt = utils_video.read_img_seq(self.imgs_gt[folder])
return {
'L': imgs_lq,
'H': imgs_gt,
'folder': folder,
'lq_path': self.imgs_lq[folder],
}
def __len__(self):
return len(self.folders)
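# Illustrative sketch (not part of the original file): a minimal `opt` dict
# using the keys documented in VideoRecurrentTestDataset above. The dataset
# name and paths are placeholders and must point to an existing test set.
def _example_recurrent_test_opt():
    return {
        'name': 'Vid4',                        # dataset name (placeholder)
        'dataroot_gt': 'testsets/Vid4/GT',     # ground-truth frames (placeholder path)
        'dataroot_lq': 'testsets/Vid4/BIx4',   # low-quality frames (placeholder path)
        'cache_data': True,                    # cache all frames in memory
        'num_frame': 2,                        # only used here to flag border frames
        # 'meta_info_file': '...',             # optional list of test subfolders
        # 'sigma': 10,                         # only for non-blind video denoising
    }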
class SingleVideoRecurrentTestDataset(data.Dataset):
"""Single ideo test dataset for recurrent architectures, which takes LR video
frames as input and output corresponding HR video frames (only input LQ path).
    More generally, it supports testing datasets with the following structure:
dataroot
├── subfolder1
├── frame000
├── frame001
├── ...
    ├── subfolder2
├── frame000
├── frame001
├── ...
├── ...
For testing datasets, there is no need to prepare LMDB files.
Args:
opt (dict): Config for train dataset. It contains the following keys:
dataroot_gt (str): Data root path for gt.
dataroot_lq (str): Data root path for lq.
io_backend (dict): IO backend type and other kwarg.
cache_data (bool): Whether to cache testing datasets.
name (str): Dataset name.
meta_info_file (str): The path to the file storing the list of test
folders. If not provided, all the folders in the dataroot will
be used.
num_frame (int): Window size for input frames.
padding (str): Padding mode.
"""
def __init__(self, opt):
super(SingleVideoRecurrentTestDataset, self).__init__()
self.opt = opt
self.cache_data = opt['cache_data']
self.lq_root = opt['dataroot_lq']
self.data_info = {'lq_path': [], 'folder': [], 'idx': [], 'border': []}
self.imgs_lq = {}
if 'meta_info_file' in opt:
with open(opt['meta_info_file'], 'r') as fin:
subfolders = [line.split(' ')[0] for line in fin]
subfolders_lq = [osp.join(self.lq_root, key) for key in subfolders]
else:
subfolders_lq = sorted(glob.glob(osp.join(self.lq_root, '*')))
for subfolder_lq in subfolders_lq:
# get frame list for lq and gt
subfolder_name = osp.basename(subfolder_lq)
img_paths_lq = sorted(list(utils_video.scandir(subfolder_lq, full_path=True)))
max_idx = len(img_paths_lq)
self.data_info['lq_path'].extend(img_paths_lq)
self.data_info['folder'].extend([subfolder_name] * max_idx)
for i in range(max_idx):
self.data_info['idx'].append(f'{i}/{max_idx}')
border_l = [0] * max_idx
for i in range(self.opt['num_frame'] // 2):
border_l[i] = 1
border_l[max_idx - i - 1] = 1
self.data_info['border'].extend(border_l)
# cache data or save the frame list
if self.cache_data:
print(f'Cache {subfolder_name} for VideoTestDataset...')
self.imgs_lq[subfolder_name] = utils_video.read_img_seq(img_paths_lq)
else:
self.imgs_lq[subfolder_name] = img_paths_lq
# Find unique folder strings
self.folders = sorted(list(set(self.data_info['folder'])))
def __getitem__(self, index):
folder = self.folders[index]
if self.cache_data:
imgs_lq = self.imgs_lq[folder]
else:
imgs_lq = utils_video.read_img_seq(self.imgs_lq[folder])
return {
'L': imgs_lq,
'folder': folder,
'lq_path': self.imgs_lq[folder],
}
def __len__(self):
return len(self.folders)
class VideoTestVimeo90KDataset(data.Dataset):
"""Video test dataset for Vimeo90k-Test dataset.
It only keeps the center frame for testing.
For testing datasets, there is no need to prepare LMDB files.
Args:
opt (dict): Config for train dataset. It contains the following keys:
dataroot_gt (str): Data root path for gt.
dataroot_lq (str): Data root path for lq.
io_backend (dict): IO backend type and other kwarg.
cache_data (bool): Whether to cache testing datasets.
name (str): Dataset name.
meta_info_file (str): The path to the file storing the list of test
folders. If not provided, all the folders in the dataroot will
be used.
num_frame (int): Window size for input frames.
padding (str): Padding mode.
"""
def __init__(self, opt):
super(VideoTestVimeo90KDataset, self).__init__()
self.opt = opt
self.cache_data = opt['cache_data']
if self.cache_data:
raise NotImplementedError('cache_data in Vimeo90K-Test dataset is not implemented.')
self.gt_root, self.lq_root = opt['dataroot_gt'], opt['dataroot_lq']
self.data_info = {'lq_path': [], 'gt_path': [], 'folder': [], 'idx': [], 'border': []}
neighbor_list = [i + (9 - opt['num_frame']) // 2 for i in range(opt['num_frame'])]
with open(opt['meta_info_file'], 'r') as fin:
subfolders = [line.split(' ')[0] for line in fin]
for idx, subfolder in enumerate(subfolders):
gt_path = osp.join(self.gt_root, subfolder, 'im4.png')
self.data_info['gt_path'].append(gt_path)
lq_paths = [osp.join(self.lq_root, subfolder, f'im{i}.png') for i in neighbor_list]
self.data_info['lq_path'].append(lq_paths)
self.data_info['folder'].append('vimeo90k')
self.data_info['idx'].append(f'{idx}/{len(subfolders)}')
self.data_info['border'].append(0)
self.pad_sequence = opt.get('pad_sequence', False)
def __getitem__(self, index):
lq_path = self.data_info['lq_path'][index]
gt_path = self.data_info['gt_path'][index]
imgs_lq = utils_video.read_img_seq(lq_path)
img_gt = utils_video.read_img_seq([gt_path])
img_gt.squeeze_(0)
if self.pad_sequence: # pad the sequence: 7 frames to 8 frames
imgs_lq = torch.cat([imgs_lq, imgs_lq[-1:,...]], dim=0)
return {
'L': imgs_lq, # (t, c, h, w)
'H': img_gt, # (c, h, w)
'folder': self.data_info['folder'][index], # folder name
'idx': self.data_info['idx'][index], # e.g., 0/843
'border': self.data_info['border'][index], # 0 for non-border
'lq_path': lq_path[self.opt['num_frame'] // 2] # center frame
}
def __len__(self):
return len(self.data_info['gt_path'])
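# Illustrative sketch (not part of the original file): a minimal `opt` dict
# for VideoTestVimeo90KDataset. Paths are placeholders; cache_data must be
# False because caching is not implemented for this dataset, and
# meta_info_file is required.
def _example_vimeo90k_test_opt():
    return {
        'name': 'Vimeo90K',                                   # dataset name (placeholder)
        'dataroot_gt': 'testsets/vimeo90k/GT',                # placeholder path
        'dataroot_lq': 'testsets/vimeo90k/LQ',                # placeholder path
        'meta_info_file': 'testsets/vimeo90k/meta_info.txt',  # placeholder path
        'cache_data': False,      # caching raises NotImplementedError here
        'num_frame': 7,           # 7 LQ frames centred on im4.png
        'pad_sequence': False,    # pad 7 frames to 8 when True
    }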
class SingleVideoRecurrentTestDataset(data.Dataset):
"""Single Video test dataset (only input LQ path).
Supported datasets: Vid4, REDS4, REDSofficial.
    More generally, it supports testing datasets with the following structure:
dataroot
├── subfolder1
├── frame000
├── frame001
├── ...
    ├── subfolder2
├── frame000
├── frame001
├── ...
├── ...
For testing datasets, there is no need to prepare LMDB files.
Args:
opt (dict): Config for train dataset. It contains the following keys:
dataroot_gt (str): Data root path for gt.
dataroot_lq (str): Data root path for lq.
io_backend (dict): IO backend type and other kwarg.
cache_data (bool): Whether to cache testing datasets.
name (str): Dataset name.
meta_info_file (str): The path to the file storing the list of test
folders. If not provided, all the folders in the dataroot will
be used.
num_frame (int): Window size for input frames.
padding (str): Padding mode.
"""
def __init__(self, opt):
super(SingleVideoRecurrentTestDataset, self).__init__()
self.opt = opt
self.cache_data = opt['cache_data']
self.lq_root = opt['dataroot_lq']
self.data_info = {'lq_path': [], 'folder': [], 'idx': [], 'border': []}
# file client (io backend)
self.file_client = None
self.imgs_lq = {}
if 'meta_info_file' in opt:
with open(opt['meta_info_file'], 'r') as fin:
subfolders = [line.split(' ')[0] for line in fin]
subfolders_lq = [osp.join(self.lq_root, key) for key in subfolders]
else:
subfolders_lq = sorted(glob.glob(osp.join(self.lq_root, '*')))
for subfolder_lq in subfolders_lq:
# get frame list for lq and gt
subfolder_name = osp.basename(subfolder_lq)
img_paths_lq = sorted(list(utils_video.scandir(subfolder_lq, full_path=True)))
max_idx = len(img_paths_lq)
self.data_info['lq_path'].extend(img_paths_lq)
self.data_info['folder'].extend([subfolder_name] * max_idx)
for i in range(max_idx):
self.data_info['idx'].append(f'{i}/{max_idx}')
border_l = [0] * max_idx
for i in range(self.opt['num_frame'] // 2):
border_l[i] = 1
border_l[max_idx - i - 1] = 1
self.data_info['border'].extend(border_l)
# cache data or save the frame list
if self.cache_data:
                print(f'Cache {subfolder_name} for VideoTestDataset...')
self.imgs_lq[subfolder_name] = utils_video.read_img_seq(img_paths_lq)
else:
self.imgs_lq[subfolder_name] = img_paths_lq
# Find unique folder strings
self.folders = sorted(list(set(self.data_info['folder'])))
def __getitem__(self, index):
folder = self.folders[index]
if self.cache_data:
imgs_lq = self.imgs_lq[folder]
else:
imgs_lq = utils_video.read_img_seq(self.imgs_lq[folder])
return {
'L': imgs_lq,
'folder': folder,
'lq_path': self.imgs_lq[folder],
}
def __len__(self):
return len(self.folders)
| 39.318538 | 97 | 0.584368 | 1,984 | 15,059 | 4.25756 | 0.107359 | 0.025571 | 0.044039 | 0.022138 | 0.826802 | 0.782408 | 0.754706 | 0.750444 | 0.741565 | 0.728661 | 0 | 0.009718 | 0.303008 | 15,059 | 382 | 98 | 39.421466 | 0.787348 | 0.291786 | 0 | 0.719807 | 0 | 0 | 0.095401 | 0.004296 | 0 | 0 | 0 | 0 | 0.004831 | 1 | 0.057971 | false | 0 | 0.024155 | 0.019324 | 0.140097 | 0.009662 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e38967bd78e97bae5a6d2376d0db27f671acdb75 | 47,521 | py | Python | django_socio_grpc/tests/fakeapp/grpc/fakeapp_pb2_grpc.py | socotecio/django-socio-grpc | b01fc00347f2ca0f20d54085e6f807a4b4495caf | [
"Apache-2.0"
] | 13 | 2021-04-15T14:53:34.000Z | 2022-02-28T13:13:36.000Z | django_socio_grpc/tests/fakeapp/grpc/fakeapp_pb2_grpc.py | socotecio/django-socio-grpc | b01fc00347f2ca0f20d54085e6f807a4b4495caf | [
"Apache-2.0"
] | 1 | 2021-11-26T14:05:11.000Z | 2021-11-26T14:05:11.000Z | django_socio_grpc/tests/fakeapp/grpc/fakeapp_pb2_grpc.py | socotecio/django-socio-grpc | b01fc00347f2ca0f20d54085e6f807a4b4495caf | [
"Apache-2.0"
] | 2 | 2021-05-31T07:43:23.000Z | 2021-07-08T07:51:50.000Z | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
from django_socio_grpc.tests.fakeapp.grpc import (
fakeapp_pb2 as django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2,
)
class UnitTestModelControllerStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.List = channel.unary_unary(
'/fakeproject.fakeapp.UnitTestModelController/List',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelListRequest.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelListResponse.FromString,
)
self.Create = channel.unary_unary(
'/fakeproject.fakeapp.UnitTestModelController/Create',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.FromString,
)
self.Retrieve = channel.unary_unary(
'/fakeproject.fakeapp.UnitTestModelController/Retrieve',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelRetrieveRequest.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.FromString,
)
self.Update = channel.unary_unary(
'/fakeproject.fakeapp.UnitTestModelController/Update',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.FromString,
)
self.Destroy = channel.unary_unary(
'/fakeproject.fakeapp.UnitTestModelController/Destroy',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelDestroyRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.Stream = channel.unary_stream(
'/fakeproject.fakeapp.UnitTestModelController/Stream',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelStreamRequest.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.FromString,
)
class UnitTestModelControllerServicer(object):
"""Missing associated documentation comment in .proto file."""
def List(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Create(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Retrieve(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Update(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Destroy(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Stream(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_UnitTestModelControllerServicer_to_server(servicer, server):
rpc_method_handlers = {
'List': grpc.unary_unary_rpc_method_handler(
servicer.List,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelListRequest.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelListResponse.SerializeToString,
),
'Create': grpc.unary_unary_rpc_method_handler(
servicer.Create,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.SerializeToString,
),
'Retrieve': grpc.unary_unary_rpc_method_handler(
servicer.Retrieve,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelRetrieveRequest.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.SerializeToString,
),
'Update': grpc.unary_unary_rpc_method_handler(
servicer.Update,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.SerializeToString,
),
'Destroy': grpc.unary_unary_rpc_method_handler(
servicer.Destroy,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelDestroyRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'Stream': grpc.unary_stream_rpc_method_handler(
servicer.Stream,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelStreamRequest.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'fakeproject.fakeapp.UnitTestModelController', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class UnitTestModelController(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def List(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.UnitTestModelController/List',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelListRequest.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelListResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Create(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.UnitTestModelController/Create',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Retrieve(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.UnitTestModelController/Retrieve',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelRetrieveRequest.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Update(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.UnitTestModelController/Update',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Destroy(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.UnitTestModelController/Destroy',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelDestroyRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Stream(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_stream(request, target, '/fakeproject.fakeapp.UnitTestModelController/Stream',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModelStreamRequest.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.UnitTestModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
class ForeignModelControllerStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.List = channel.unary_unary(
'/fakeproject.fakeapp.ForeignModelController/List',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelListRequest.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelListResponse.FromString,
)
self.Retrieve = channel.unary_unary(
'/fakeproject.fakeapp.ForeignModelController/Retrieve',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelRetrieveRequestCustom.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelRetrieveRequestCustom.FromString,
)
class ForeignModelControllerServicer(object):
"""Missing associated documentation comment in .proto file."""
def List(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Retrieve(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_ForeignModelControllerServicer_to_server(servicer, server):
rpc_method_handlers = {
'List': grpc.unary_unary_rpc_method_handler(
servicer.List,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelListRequest.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelListResponse.SerializeToString,
),
'Retrieve': grpc.unary_unary_rpc_method_handler(
servicer.Retrieve,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelRetrieveRequestCustom.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelRetrieveRequestCustom.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'fakeproject.fakeapp.ForeignModelController', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class ForeignModelController(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def List(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.ForeignModelController/List',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelListRequest.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelListResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Retrieve(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.ForeignModelController/Retrieve',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelRetrieveRequestCustom.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ForeignModelRetrieveRequestCustom.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
class ManyManyModelControllerStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.List = channel.unary_unary(
'/fakeproject.fakeapp.ManyManyModelController/List',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelListRequest.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelListResponse.FromString,
)
self.Create = channel.unary_unary(
'/fakeproject.fakeapp.ManyManyModelController/Create',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.FromString,
)
self.Retrieve = channel.unary_unary(
'/fakeproject.fakeapp.ManyManyModelController/Retrieve',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelRetrieveRequest.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.FromString,
)
self.Update = channel.unary_unary(
'/fakeproject.fakeapp.ManyManyModelController/Update',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.FromString,
)
self.Destroy = channel.unary_unary(
'/fakeproject.fakeapp.ManyManyModelController/Destroy',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelDestroyRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
class ManyManyModelControllerServicer(object):
"""Missing associated documentation comment in .proto file."""
def List(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Create(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Retrieve(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Update(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Destroy(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_ManyManyModelControllerServicer_to_server(servicer, server):
rpc_method_handlers = {
'List': grpc.unary_unary_rpc_method_handler(
servicer.List,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelListRequest.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelListResponse.SerializeToString,
),
'Create': grpc.unary_unary_rpc_method_handler(
servicer.Create,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.SerializeToString,
),
'Retrieve': grpc.unary_unary_rpc_method_handler(
servicer.Retrieve,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelRetrieveRequest.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.SerializeToString,
),
'Update': grpc.unary_unary_rpc_method_handler(
servicer.Update,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.SerializeToString,
),
'Destroy': grpc.unary_unary_rpc_method_handler(
servicer.Destroy,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelDestroyRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'fakeproject.fakeapp.ManyManyModelController', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class ManyManyModelController(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def List(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.ManyManyModelController/List',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelListRequest.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelListResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Create(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.ManyManyModelController/Create',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Retrieve(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.ManyManyModelController/Retrieve',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelRetrieveRequest.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Update(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.ManyManyModelController/Update',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Destroy(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.ManyManyModelController/Destroy',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.ManyManyModelDestroyRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
class RelatedFieldModelControllerStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.List = channel.unary_unary(
'/fakeproject.fakeapp.RelatedFieldModelController/List',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelListRequest.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelListResponse.FromString,
)
self.Create = channel.unary_unary(
'/fakeproject.fakeapp.RelatedFieldModelController/Create',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.FromString,
)
self.Retrieve = channel.unary_unary(
'/fakeproject.fakeapp.RelatedFieldModelController/Retrieve',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelRetrieveRequest.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.FromString,
)
self.Update = channel.unary_unary(
'/fakeproject.fakeapp.RelatedFieldModelController/Update',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.FromString,
)
self.Destroy = channel.unary_unary(
'/fakeproject.fakeapp.RelatedFieldModelController/Destroy',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelDestroyRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
class RelatedFieldModelControllerServicer(object):
"""Missing associated documentation comment in .proto file."""
def List(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Create(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Retrieve(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Update(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Destroy(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_RelatedFieldModelControllerServicer_to_server(servicer, server):
rpc_method_handlers = {
'List': grpc.unary_unary_rpc_method_handler(
servicer.List,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelListRequest.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelListResponse.SerializeToString,
),
'Create': grpc.unary_unary_rpc_method_handler(
servicer.Create,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.SerializeToString,
),
'Retrieve': grpc.unary_unary_rpc_method_handler(
servicer.Retrieve,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelRetrieveRequest.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.SerializeToString,
),
'Update': grpc.unary_unary_rpc_method_handler(
servicer.Update,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.SerializeToString,
),
'Destroy': grpc.unary_unary_rpc_method_handler(
servicer.Destroy,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelDestroyRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'fakeproject.fakeapp.RelatedFieldModelController', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class RelatedFieldModelController(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def List(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.RelatedFieldModelController/List',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelListRequest.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelListResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Create(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.RelatedFieldModelController/Create',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Retrieve(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.RelatedFieldModelController/Retrieve',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelRetrieveRequest.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Update(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.RelatedFieldModelController/Update',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Destroy(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.RelatedFieldModelController/Destroy',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.RelatedFieldModelDestroyRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
class SpecialFieldsModelControllerStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.List = channel.unary_unary(
'/fakeproject.fakeapp.SpecialFieldsModelController/List',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelListRequest.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelListResponse.FromString,
)
self.Create = channel.unary_unary(
'/fakeproject.fakeapp.SpecialFieldsModelController/Create',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.FromString,
)
self.Retrieve = channel.unary_unary(
'/fakeproject.fakeapp.SpecialFieldsModelController/Retrieve',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelRetrieveRequest.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.FromString,
)
self.Update = channel.unary_unary(
'/fakeproject.fakeapp.SpecialFieldsModelController/Update',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.SerializeToString,
response_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.FromString,
)
self.Destroy = channel.unary_unary(
'/fakeproject.fakeapp.SpecialFieldsModelController/Destroy',
request_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelDestroyRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
class SpecialFieldsModelControllerServicer(object):
"""Missing associated documentation comment in .proto file."""
def List(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Create(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Retrieve(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Update(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Destroy(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_SpecialFieldsModelControllerServicer_to_server(servicer, server):
rpc_method_handlers = {
'List': grpc.unary_unary_rpc_method_handler(
servicer.List,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelListRequest.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelListResponse.SerializeToString,
),
'Create': grpc.unary_unary_rpc_method_handler(
servicer.Create,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.SerializeToString,
),
'Retrieve': grpc.unary_unary_rpc_method_handler(
servicer.Retrieve,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelRetrieveRequest.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.SerializeToString,
),
'Update': grpc.unary_unary_rpc_method_handler(
servicer.Update,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.FromString,
response_serializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.SerializeToString,
),
'Destroy': grpc.unary_unary_rpc_method_handler(
servicer.Destroy,
request_deserializer=django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelDestroyRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'fakeproject.fakeapp.SpecialFieldsModelController', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class SpecialFieldsModelController(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def List(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.SpecialFieldsModelController/List',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelListRequest.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelListResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Create(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.SpecialFieldsModelController/Create',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Retrieve(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.SpecialFieldsModelController/Retrieve',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelRetrieveRequest.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Update(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.SpecialFieldsModelController/Update',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.SerializeToString,
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModel.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Destroy(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/fakeproject.fakeapp.SpecialFieldsModelController/Destroy',
django__socio__grpc_dot_tests_dot_fakeapp_dot_grpc_dot_fakeapp__pb2.SpecialFieldsModelDestroyRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
| 52.335903 | 160 | 0.714631 | 4,654 | 47,521 | 6.77804 | 0.029652 | 0.056364 | 0.060865 | 0.072468 | 0.965383 | 0.963671 | 0.94538 | 0.94538 | 0.922206 | 0.922206 | 0 | 0.00384 | 0.221755 | 47,521 | 907 | 161 | 52.393605 | 0.84912 | 0.058164 | 0 | 0.769128 | 1 | 0 | 0.086993 | 0.059969 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075168 | false | 0 | 0.004027 | 0.030872 | 0.130201 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
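The generated stub classes in the content above (e.g. SpecialFieldsModelControllerStub) follow the standard grpc_tools client pattern: construct the stub over a grpc.Channel and invoke the unary-unary multicallables it exposes. A minimal client sketch, assuming the generated modules are importable as fakeapp_pb2 / fakeapp_pb2_grpc and a server is listening on localhost:50051 (both are illustrative assumptions, not part of the row above):

import grpc

# Assumed import path, inferred from the module aliases used in the generated code above.
from django_socio_grpc.tests.fakeapp.grpc import fakeapp_pb2, fakeapp_pb2_grpc


def list_special_fields_models(address: str = "localhost:50051"):
    # Open a plaintext channel and call the unary-unary List RPC exposed by the
    # generated SpecialFieldsModelControllerStub.
    with grpc.insecure_channel(address) as channel:
        stub = fakeapp_pb2_grpc.SpecialFieldsModelControllerStub(channel)
        request = fakeapp_pb2.SpecialFieldsModelListRequest()
        return stub.List(request, timeout=5)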
8b768f79e00dc3a95906142d60a1ebe8e1981837 | 387 | py | Python | apps/utils/print_colors.py | osw4l/villas-de-san-pablo | 89f00dfbbfbfee5111bd9852ddfbdb8727d10ed2 | ["MIT"] | null | null | null | apps/utils/print_colors.py | osw4l/villas-de-san-pablo | 89f00dfbbfbfee5111bd9852ddfbdb8727d10ed2 | ["MIT"] | null | null | null | apps/utils/print_colors.py | osw4l/villas-de-san-pablo | 89f00dfbbfbfee5111bd9852ddfbdb8727d10ed2 | ["MIT"] | null | null | null | def _red(msj):
return "\033[0;31m{0}\033[0m".format(msj)
def _green(msj):
return "\033[0;32m{0}\033[0m".format(msj)
def _orange(msj):
return "\033[0;33m{0}\033[0m".format(msj)
def _blue(msj):
return "\033[0;34m{0}\033[0m".format(msj)
def _purple(msj):
return "\033[0;35m{0}\033[0m".format(msj)
def _cyan(msj):
return "\033[0;36m{0}\033[0m".format(msj)
| 16.125 | 45 | 0.614987 | 72 | 387 | 3.222222 | 0.263889 | 0.232759 | 0.310345 | 0.336207 | 0.452586 | 0.387931 | 0 | 0 | 0 | 0 | 0 | 0.20122 | 0.152455 | 387 | 23 | 46 | 16.826087 | 0.506098 | 0 | 0 | 0 | 0 | 0 | 0.310881 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
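The row above is a small set of ANSI escape-code helpers: each wraps its argument in a colour prefix ("\033[0;3Xm") followed by the reset sequence ("\033[0m"). A usage sketch, assuming the module is importable from the recorded path apps/utils/print_colors.py:

# Assumed import path, mirroring apps/utils/print_colors.py in the row above.
from apps.utils.print_colors import _green, _red

print(_green("12 records updated"))   # rendered in green on ANSI-capable terminals
print(_red("connection refused"))     # rendered in red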
8b7c12904403246fe997ab74faf5a43985c988d6 | 127 | py | Python | ontology-tools/CMCLABoxManagement/chemaboxwriters/chemaboxwriters/ontomops/__init__.py | cambridge-cares/TheWorldAvatar | baf08ddc090414c6d01e48c74b408f2192461e9e | ["MIT"] | 21 | 2021-03-08T01:58:25.000Z | 2022-03-09T15:46:16.000Z | ontology-tools/CMCLABoxManagement/chemaboxwriters/chemaboxwriters/ontomops/__init__.py | cambridge-cares/TheWorldAvatar | baf08ddc090414c6d01e48c74b408f2192461e9e | ["MIT"] | 63 | 2021-05-04T15:05:30.000Z | 2022-03-23T14:32:29.000Z | ontology-tools/CMCLABoxManagement/chemaboxwriters/chemaboxwriters/ontomops/__init__.py | cambridge-cares/TheWorldAvatar | baf08ddc090414c6d01e48c74b408f2192461e9e | ["MIT"] | 15 | 2021-03-08T07:52:03.000Z | 2022-03-29T04:46:20.000Z | from chemaboxwriters.ontomops.pipeline import assemble_omops_pipeline
from chemaboxwriters.ontomops.writeabox import write_abox | 63.5 | 69 | 0.913386 | 15 | 127 | 7.533333 | 0.666667 | 0.336283 | 0.477876 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055118 | 127 | 2 | 70 | 63.5 | 0.941667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
4749723c9f5d85fcd3d50bd12a8ecee2fdbcc292 | 156 | py | Python | ccc/nodes/admin.py | sirikata/sirikata-ccc | 74babcd4b4bfbb6948d670c00fd78daea835761e | ["BSD-3-Clause"] | null | null | null | ccc/nodes/admin.py | sirikata/sirikata-ccc | 74babcd4b4bfbb6948d670c00fd78daea835761e | ["BSD-3-Clause"] | 1 | 2020-06-05T18:21:57.000Z | 2020-06-05T18:21:57.000Z | ccc/nodes/admin.py | sirikata/sirikata-ccc | 74babcd4b4bfbb6948d670c00fd78daea835761e | ["BSD-3-Clause"] | null | null | null | from django.contrib import admin
from nodes.models import Node
from nodes.models import NodeGroup
admin.site.register(Node)
admin.site.register(NodeGroup)
| 22.285714 | 34 | 0.833333 | 23 | 156 | 5.652174 | 0.478261 | 0.138462 | 0.230769 | 0.323077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096154 | 156 | 6 | 35 | 26 | 0.921986 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
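ccc/nodes/admin.py above uses the plain admin.site.register(...) form, which exposes Node and NodeGroup in the Django admin with the default ModelAdmin. An equivalent sketch using the decorator form with a customised ModelAdmin; the list_display choice "id" is illustrative only, since it exists on any Django model:

from django.contrib import admin
from nodes.models import Node


@admin.register(Node)  # decorator form, equivalent to admin.site.register(Node, NodeAdmin)
class NodeAdmin(admin.ModelAdmin):
    list_display = ("id",)  # hypothetical column choice; any Node field would work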
47718a2d51db9aec1177fdfd2c1fdaf506c9ec60 | 161 | py | Python | attacks/none.py | sio13/turtleNet | 02195eb33a28d238a769a3f34d6f1b24255ab40f | ["MIT"] | 1 | 2021-06-10T08:31:35.000Z | 2021-06-10T08:31:35.000Z | attacks/none.py | sio13/turtleNet | 02195eb33a28d238a769a3f34d6f1b24255ab40f | ["MIT"] | 2 | 2020-05-15T11:14:14.000Z | 2020-05-15T11:14:46.000Z | attacks/none.py | sio13/turtleNet | 02195eb33a28d238a769a3f34d6f1b24255ab40f | ["MIT"] | null | null | null | import numpy as np
class NoneAttack:
    def __init__(self, **kwargs):
        pass

    def generate_np(self, x: np.array, **attack_params):
        return x
| 16.1 | 56 | 0.627329 | 22 | 161 | 4.318182 | 0.772727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.273292 | 161 | 9 | 57 | 17.888889 | 0.811966 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.166667 | 0.166667 | 0.166667 | 0.833333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 7 |
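attacks/none.py above defines an identity "attack": generate_np accepts arbitrary attack parameters and returns the input unchanged, which makes it a convenient clean-accuracy baseline when benchmarking real attacks. A usage sketch, assuming the module is importable as attacks.none:

import numpy as np

from attacks.none import NoneAttack  # assumed import path for the row above

attack = NoneAttack()
x_clean = np.random.rand(8, 32, 32, 3).astype(np.float32)
x_adv = attack.generate_np(x_clean, eps=0.3)  # extra kwargs are accepted and ignored
assert np.array_equal(x_clean, x_adv)         # identity: the "adversarial" batch is the clean batch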