import os
import unittest
from docker.types import Mount
from unittest import mock
from unittest.mock import Mock, MagicMock
from configcrunch.tests.test_utils import YamlConfigDocumentStub
from riptide.tests.stubs import ProjectStub
from riptide_engine_docker.container_builder import ContainerBuilder, ENTRYPOINT_SH, ENTRYPOINT_CONTAINER_PATH, \
EENV_ORIGINAL_ENTRYPOINT, EENV_DONT_RUN_CMD, EENV_COMMAND_LOG_PREFIX, EENV_USER, EENV_GROUP, \
EENV_RUN_MAIN_CMD_AS_USER, RIPTIDE_DOCKER_LABEL_IS_RIPTIDE, RIPTIDE_DOCKER_LABEL_MAIN, RIPTIDE_DOCKER_LABEL_PROJECT, \
RIPTIDE_DOCKER_LABEL_SERVICE, RIPTIDE_DOCKER_LABEL_HTTP_PORT, EENV_USER_RUN, DOCKER_ENGINE_HTTP_PORT_BND_START, \
EENV_ON_LINUX, EENV_HOST_SYSTEM_HOSTNAMES, EENV_OVERLAY_TARGETS, EENV_NAMED_VOLUMES

IMAGE_NAME = 'unit/testimage'
COMMAND = 'test_command'
EADMOCK = '__riptide_engine_docker_assets_dir'
GET_LOCALHOSTS_HOSTS_RETURN = ['dummy1', 'dummy2']


class ContainerBuilderTest(unittest.TestCase):
def setUp(self) -> None:
self.fix = ContainerBuilder(image=IMAGE_NAME, command=COMMAND)
self.expected_api_base = {
'image': IMAGE_NAME,
'command': COMMAND,
'environment': {EENV_ON_LINUX: '1'},
'mounts': [],
'ports': {},
'labels': {'riptide': '1'}
}
self.expected_cli_base = ["docker", "run", "--rm", "-it"]

    def test_simple(self):
"""Test only with values from constructor"""
# Test API build
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1', '--label', 'riptide=1', IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_command_list(self):
test_obj = ContainerBuilder(image=IMAGE_NAME, command=[COMMAND, 'elem2'])
# Test API build
self.expected_api_base.update({
'command': [COMMAND, 'elem2']
})
actual_api = test_obj.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1', '--label', 'riptide=1', IMAGE_NAME, COMMAND + ' "elem2"'
]
actual_cli = test_obj.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_command_none(self):
test_obj = ContainerBuilder(image=IMAGE_NAME, command=None)
# Test API build
self.expected_api_base.update({
'command': None
})
actual_api = test_obj.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1', '--label', 'riptide=1', IMAGE_NAME, ''
]
actual_cli = test_obj.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_command_list_spaces(self):
test_obj = ContainerBuilder(image=IMAGE_NAME, command=[COMMAND, 'elem2 elem3'])
# Test API build
self.expected_api_base.update({
'command': [COMMAND, '"elem2 elem3"']
})
actual_api = test_obj.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1', '--label', 'riptide=1', IMAGE_NAME, COMMAND + ' "elem2 elem3"'
]
actual_cli = test_obj.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_command_str_spaces_in_first_part_of_command(self):
test_obj = ContainerBuilder(image=IMAGE_NAME, command='elem1 elem2 elem3 "elem4a elem4b" \'elem5a elem5b\'')
# Test API build
self.expected_api_base.update({
'command': 'elem1 elem2 elem3 "elem4a elem4b" \'elem5a elem5b\''
})
actual_api = test_obj.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1', '--label', 'riptide=1', IMAGE_NAME, 'elem1 elem2 elem3 "elem4a elem4b" \'elem5a elem5b\''
]
actual_cli = test_obj.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_set_env(self):
self.fix.set_env('test_key', 'test_value')
# Test API build
self.expected_api_base.update({
'environment': {'test_key': 'test_value', EENV_ON_LINUX: '1'}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1', '-e', 'test_key=test_value',
'--label', 'riptide=1', IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_set_label(self):
self.fix.set_label('test_key', 'test_value')
# Test API build
self.expected_api_base.update({
'labels': {
'riptide': '1',
'test_key': 'test_value'
}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
'--label', 'test_key=test_value', IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('platform.system', return_value='Linux')
def test_set_mount_not_mac(self, system_mock: Mock):
self.fix.set_mount('/host_path', '/container_path')
self.fix.set_mount('/host_path2', '/container_path2', 'ro')
# Test API build
self.expected_api_base.update({
'mounts': [
Mount(
target='/container_path',
source='/host_path',
type='bind',
read_only=False,
consistency='delegated'
),
Mount(
target='/container_path2',
source='/host_path2',
type='bind',
read_only=True,
consistency='delegated'
)
]
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
'-v', '/host_path:/container_path:rw',
'-v', '/host_path2:/container_path2:ro',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('platform.system', return_value='MacOS')
def test_set_mount_mac(self, system_mock: Mock):
self.fix.set_mount('/host_path', '/container_path')
self.fix.set_mount('/host_path2', '/container_path2', 'ro')
# Test API build
self.expected_api_base.update({
'mounts': [
Mount(
target='/container_path',
source='/host_path',
type='bind',
read_only=False,
consistency='delegated'
),
Mount(
target='/container_path2',
source='/host_path2',
type='bind',
read_only=True,
consistency='delegated'
)
]
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
'-v', '/host_path:/container_path:rw:delegated',
'-v', '/host_path2:/container_path2:ro:delegated',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('platform.system', return_value='Linux')
def test_set_named_volume_mount(self, system_mock: Mock):
self.fix.set_named_volume_mount('name', '/container_path')
self.fix.set_named_volume_mount('name2', '/container_path2', 'ro')
# Test API build
self.expected_api_base.update({
'mounts': [
Mount(
target='/container_path',
source='riptide__name',
type='volume',
read_only=False,
labels={RIPTIDE_DOCKER_LABEL_IS_RIPTIDE: "1"}
),
Mount(
target='/container_path2',
source='riptide__name2',
type='volume',
read_only=True,
labels={RIPTIDE_DOCKER_LABEL_IS_RIPTIDE: "1"}
)
],
'environment': {
EENV_ON_LINUX: '1',
EENV_NAMED_VOLUMES: '/container_path:/container_path2'
}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1',
'-e', EENV_NAMED_VOLUMES + '=/container_path:/container_path2',
'--label', 'riptide=1',
'--mount', 'type=volume,target=/container_path,src=riptide__name,ro=0,volume-label=riptide=1',
'--mount', 'type=volume,target=/container_path2,src=riptide__name2,ro=1,volume-label=riptide=1',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_set_port(self):
self.fix.set_port(1234, 5678)
self.fix.set_port(9876, 5432)
# Test API build
self.expected_api_base.update({
'ports': {1234: 5678, 9876: 5432}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
'-p', '5678:1234',
'-p', '5432:9876',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_set_network(self):
self.fix.set_network('name')
# Test API build
self.expected_api_base.update({
'network': 'name'
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--network', 'name',
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_set_name(self):
self.fix.set_name('blubbeldiblub')
# Test API build
self.expected_api_base.update({
'name': 'blubbeldiblub'
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--name', 'blubbeldiblub',
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_set_entrypoint(self):
self.fix.set_entrypoint('/usr/bin/very-important-script')
# Test API build
self.expected_api_base.update({
'entrypoint': ['/usr/bin/very-important-script']
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--entrypoint', '/usr/bin/very-important-script',
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_set_args(self):
self.fix.set_args(['arg1', 'arg2', 'arg3'])
# Test API build
self.expected_api_base.update({
'command': COMMAND + ' "arg1" "arg2" "arg3"'
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
IMAGE_NAME, COMMAND + ' "arg1" "arg2" "arg3"'
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_set_args_command_is_list(self):
fix = ContainerBuilder(image=IMAGE_NAME, command=[COMMAND, "arg0"])
fix.set_args(['arg1', 'arg2', 'arg3'])
# Test API build
self.expected_api_base.update({
'command': [COMMAND, 'arg0', 'arg1', 'arg2', 'arg3']
})
actual_api = fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
IMAGE_NAME, COMMAND + ' "arg0" "arg1" "arg2" "arg3"'
]
actual_cli = fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_set_hostname(self):
self.fix.set_hostname('dubdub')
# Test API build
self.expected_api_base.update({
'hostname': 'dubdub'
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--hostname', 'dubdub',
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    def test_set_workdir(self):
self.fix.set_workdir('/tmp/blubbel')
# Test API build
self.expected_api_base.update({
'working_dir': '/tmp/blubbel'
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'-w', '/tmp/blubbel',
'-e', EENV_ON_LINUX + '=1',
'--label', 'riptide=1',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
return_value=EADMOCK)
@mock.patch('platform.system', return_value='Linux')
def test_enable_riptide_entrypoint_orig_is_list(self, sys_mock: Mock, ead_mock: Mock):
self.maxDiff = None
image_config_mock = {'Entrypoint': ['cmd', 'arg1', 'arg2 with space']}
self.fix.enable_riptide_entrypoint(image_config_mock)
expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)
# Test API build
self.expected_api_base.update({
'user': 0,
'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
'mounts': [
Mount(
target=ENTRYPOINT_CONTAINER_PATH,
source=expected_entrypoint_host_path,
type='bind',
read_only=True,
consistency='delegated'
)],
'environment': {
EENV_ORIGINAL_ENTRYPOINT: 'cmd "arg1" "arg2 with space"',
EENV_ON_LINUX: '1'
}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--entrypoint', ENTRYPOINT_CONTAINER_PATH,
'-u', '0',
'-e', EENV_ON_LINUX + '=1',
'-e', EENV_ORIGINAL_ENTRYPOINT + '=cmd "arg1" "arg2 with space"',
'--label', 'riptide=1',
'-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
return_value=EADMOCK)
@mock.patch('platform.system', return_value='Linux')
def test_enable_riptide_entrypoint_orig_is_string(self, sys_mock: Mock, ead_mock: Mock):
self.maxDiff = None
expected_sh = '/bin/sh -c '
ep_value = 'entrypoint is a string'
image_config_mock = {'Entrypoint': ep_value}
self.fix.enable_riptide_entrypoint(image_config_mock)
expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)
# Test API build
self.expected_api_base.update({
'user': 0,
'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
'mounts': [
Mount(
target=ENTRYPOINT_CONTAINER_PATH,
source=expected_entrypoint_host_path,
type='bind',
read_only=True,
consistency='delegated'
)],
'environment': {
EENV_ORIGINAL_ENTRYPOINT: expected_sh + ep_value,
EENV_DONT_RUN_CMD: 'true',
EENV_ON_LINUX: '1'
}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--entrypoint', ENTRYPOINT_CONTAINER_PATH,
'-u', '0',
'-e', EENV_ON_LINUX + '=1',
'-e', EENV_ORIGINAL_ENTRYPOINT + '=' + expected_sh + ep_value,
'-e', EENV_DONT_RUN_CMD + '=true',
'--label', 'riptide=1',
'-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
return_value=EADMOCK)
@mock.patch('platform.system', return_value='Linux')
def test_enable_riptide_entrypoint_orig_no(self, sys_mock: Mock, ead_mock: Mock):
self.maxDiff = None
image_config_mock = {'Entrypoint': None}
self.fix.enable_riptide_entrypoint(image_config_mock)
expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)
# Test API build
self.expected_api_base.update({
'user': 0,
'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
'mounts': [
Mount(
target=ENTRYPOINT_CONTAINER_PATH,
source=expected_entrypoint_host_path,
type='bind',
read_only=True,
consistency='delegated'
)],
'environment': {
EENV_ORIGINAL_ENTRYPOINT: '',
EENV_ON_LINUX: '1'
}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--entrypoint', ENTRYPOINT_CONTAINER_PATH,
'-u', '0',
'-e', EENV_ON_LINUX + '=1',
'-e', EENV_ORIGINAL_ENTRYPOINT + '=',
'--label', 'riptide=1',
'-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
return_value=EADMOCK)
@mock.patch('platform.system', return_value='Linux')
@mock.patch('riptide_engine_docker.container_builder.getuid', return_value=9898)
@mock.patch('riptide_engine_docker.container_builder.getgid', return_value=8989)
@mock.patch('riptide_engine_docker.container_builder.get_localhost_hosts', return_value=GET_LOCALHOSTS_HOSTS_RETURN)
def test_init_from_service_current_user(self, *args, **kwargs):
self.maxDiff = None
service_stub = YamlConfigDocumentStub({
'$name': 'SERVICENAME',
'roles': [],
'run_as_current_user': True,
'dont_create_user': False,
'logging': {
'commands': {
'name1': 'command1',
'name2': 'command2'
}
}
})
config_stub = YamlConfigDocumentStub({
'performance': {
'dont_sync_named_volumes_with_host': False,
'dont_sync_unimportant_src': False
}
})
service_stub.collect_ports = MagicMock(return_value={
1234: 5678,
9876: 5432
})
service_stub.collect_volumes = MagicMock(return_value={
'host1': {'bind': 'bind1', 'mode': 'ro', 'name': 'namedvolume'},
'host2': {'bind': 'bind2', 'mode': 'rw'},
})
service_stub.collect_environment = MagicMock(return_value={
'key1': 'value1',
'key2': 'value2'
})
service_stub.get_project = MagicMock(return_value=ProjectStub({
'name': 'PROJECTNAME'
}, parent=config_stub))
image_config_mock = {'Entrypoint': '', 'User': '12345'}
self.fix.init_from_service(service_stub, image_config_mock)
expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)
# Test API build
self.expected_api_base.update({
'user': 0,
'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
'ports': {
1234: 5678,
9876: 5432
},
'mounts': [
Mount(
target=ENTRYPOINT_CONTAINER_PATH,
source=expected_entrypoint_host_path,
type='bind',
read_only=True,
consistency='delegated'
),
Mount(
target='bind1',
source='host1',
type='bind',
read_only=True,
consistency='delegated'
),
Mount(
target='bind2',
source='host2',
type='bind',
read_only=False,
consistency='delegated'
)
],
'environment': {
EENV_ORIGINAL_ENTRYPOINT: '',
EENV_COMMAND_LOG_PREFIX + 'name1': 'command1',
EENV_COMMAND_LOG_PREFIX + 'name2': 'command2',
EENV_USER: '9898',
EENV_GROUP: '8989',
EENV_RUN_MAIN_CMD_AS_USER: 'yes',
'key1': 'value1',
'key2': 'value2',
EENV_ON_LINUX: '1',
EENV_HOST_SYSTEM_HOSTNAMES: ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
EENV_OVERLAY_TARGETS: ''
},
'labels': {
RIPTIDE_DOCKER_LABEL_IS_RIPTIDE: '1',
RIPTIDE_DOCKER_LABEL_MAIN: '0',
RIPTIDE_DOCKER_LABEL_PROJECT: 'PROJECTNAME',
RIPTIDE_DOCKER_LABEL_SERVICE: 'SERVICENAME'
}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--entrypoint', ENTRYPOINT_CONTAINER_PATH,
'-u', '0',
'-e', EENV_ON_LINUX + '=1',
'-e', EENV_ORIGINAL_ENTRYPOINT + '=',
'-e', EENV_HOST_SYSTEM_HOSTNAMES + '=' + ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
'-e', 'key1=value1',
'-e', 'key2=value2',
'-e', EENV_OVERLAY_TARGETS + '=',
'-e', EENV_COMMAND_LOG_PREFIX + 'name1=command1',
'-e', EENV_COMMAND_LOG_PREFIX + 'name2=command2',
'-e', EENV_USER + '=9898',
'-e', EENV_GROUP + '=8989',
'-e', EENV_RUN_MAIN_CMD_AS_USER + '=yes',
'--label', RIPTIDE_DOCKER_LABEL_IS_RIPTIDE + '=1',
'--label', RIPTIDE_DOCKER_LABEL_PROJECT + '=PROJECTNAME',
'--label', RIPTIDE_DOCKER_LABEL_SERVICE + '=SERVICENAME',
'--label', RIPTIDE_DOCKER_LABEL_MAIN + '=0',
'-p', '5678:1234',
'-p', '5432:9876',
'-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
'-v', 'host1:bind1:ro',
'-v', 'host2:bind2:rw',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
return_value=EADMOCK)
@mock.patch('platform.system', return_value='Linux')
@mock.patch('riptide_engine_docker.container_builder.getuid', return_value=9898)
@mock.patch('riptide_engine_docker.container_builder.getgid', return_value=8989)
@mock.patch('riptide_engine_docker.container_builder.get_localhost_hosts', return_value=GET_LOCALHOSTS_HOSTS_RETURN)
def test_init_from_service_current_user_main_service(self, *args, **kwargs):
self.maxDiff = None
service_stub = YamlConfigDocumentStub({
'$name': 'SERVICENAME',
'roles': ['main'],
'run_as_current_user': True,
'dont_create_user': False
})
config_stub = YamlConfigDocumentStub({
'performance': {
'dont_sync_named_volumes_with_host': False,
'dont_sync_unimportant_src': False
}
})
service_stub.collect_ports = MagicMock(return_value={})
service_stub.collect_volumes = MagicMock(return_value={})
service_stub.collect_environment = MagicMock(return_value={})
service_stub.get_project = MagicMock(return_value=ProjectStub({
'name': 'PROJECTNAME'
}, parent=config_stub))
image_config_mock = {'Entrypoint': '', 'User': '12345'}
self.fix.init_from_service(service_stub, image_config_mock)
expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)
# Test API build
self.expected_api_base.update({
'user': 0,
'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
'mounts': [
Mount(
target=ENTRYPOINT_CONTAINER_PATH,
source=expected_entrypoint_host_path,
type='bind',
read_only=True,
consistency='delegated'
)
],
'environment': {
EENV_ORIGINAL_ENTRYPOINT: '',
EENV_USER: '9898',
EENV_GROUP: '8989',
EENV_RUN_MAIN_CMD_AS_USER: 'yes',
EENV_ON_LINUX: '1',
EENV_HOST_SYSTEM_HOSTNAMES: ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
EENV_OVERLAY_TARGETS: '',
},
'labels': {
RIPTIDE_DOCKER_LABEL_IS_RIPTIDE: '1',
RIPTIDE_DOCKER_LABEL_MAIN: '1',
RIPTIDE_DOCKER_LABEL_PROJECT: 'PROJECTNAME',
RIPTIDE_DOCKER_LABEL_SERVICE: 'SERVICENAME'
}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--entrypoint', ENTRYPOINT_CONTAINER_PATH,
'-u', '0',
'-e', EENV_ON_LINUX + '=1',
'-e', EENV_ORIGINAL_ENTRYPOINT + '=',
'-e', EENV_HOST_SYSTEM_HOSTNAMES + '=' + ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
'-e', EENV_OVERLAY_TARGETS + '=',
'-e', EENV_USER + '=9898',
'-e', EENV_GROUP + '=8989',
'-e', EENV_RUN_MAIN_CMD_AS_USER + '=yes',
'--label', RIPTIDE_DOCKER_LABEL_IS_RIPTIDE + '=1',
'--label', RIPTIDE_DOCKER_LABEL_PROJECT + '=PROJECTNAME',
'--label', RIPTIDE_DOCKER_LABEL_SERVICE + '=SERVICENAME',
'--label', RIPTIDE_DOCKER_LABEL_MAIN + '=1',
'-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
return_value=EADMOCK)
@mock.patch('platform.system', return_value='Linux')
@mock.patch('riptide_engine_docker.container_builder.getuid', return_value=9898)
@mock.patch('riptide_engine_docker.container_builder.getgid', return_value=8989)
@mock.patch('riptide_engine_docker.container_builder.get_localhost_hosts', return_value=GET_LOCALHOSTS_HOSTS_RETURN)
def test_init_from_service_no_current_user_but_set(self, *args, **kwargs):
self.maxDiff = None
service_stub = YamlConfigDocumentStub({
'$name': 'SERVICENAME',
'roles': ['main'],
'run_as_current_user': False,
'dont_create_user': False
})
config_stub = YamlConfigDocumentStub({
'performance': {
'dont_sync_named_volumes_with_host': False,
'dont_sync_unimportant_src': False
}
})
service_stub.collect_ports = MagicMock(return_value={})
service_stub.collect_volumes = MagicMock(return_value={})
service_stub.collect_environment = MagicMock(return_value={})
service_stub.get_project = MagicMock(return_value=ProjectStub({
'name': 'PROJECTNAME'
}, parent=config_stub))
image_config_mock = {'Entrypoint': '', 'User': '12345'}
self.fix.init_from_service(service_stub, image_config_mock)
expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)
# Test API build
self.expected_api_base.update({
'user': 0,
'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
'mounts': [
Mount(
target=ENTRYPOINT_CONTAINER_PATH,
source=expected_entrypoint_host_path,
type='bind',
read_only=True,
consistency='delegated'
)
],
'environment': {
EENV_ORIGINAL_ENTRYPOINT: '',
EENV_USER: '9898',
EENV_USER_RUN: '12345',
EENV_GROUP: '8989',
EENV_RUN_MAIN_CMD_AS_USER: 'yes',
EENV_ON_LINUX: '1',
EENV_HOST_SYSTEM_HOSTNAMES: ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
EENV_OVERLAY_TARGETS: ''
},
'labels': {
RIPTIDE_DOCKER_LABEL_IS_RIPTIDE: '1',
RIPTIDE_DOCKER_LABEL_MAIN: '1',
RIPTIDE_DOCKER_LABEL_PROJECT: 'PROJECTNAME',
RIPTIDE_DOCKER_LABEL_SERVICE: 'SERVICENAME'
}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--entrypoint', ENTRYPOINT_CONTAINER_PATH,
'-u', '0',
'-e', EENV_ON_LINUX + '=1',
'-e', EENV_ORIGINAL_ENTRYPOINT + '=',
'-e', EENV_HOST_SYSTEM_HOSTNAMES + '=' + ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
'-e', EENV_OVERLAY_TARGETS + '=',
'-e', EENV_USER + '=9898',
'-e', EENV_GROUP + '=8989',
'-e', EENV_RUN_MAIN_CMD_AS_USER + '=yes',
'-e', EENV_USER_RUN + '=12345',
'--label', RIPTIDE_DOCKER_LABEL_IS_RIPTIDE + '=1',
'--label', RIPTIDE_DOCKER_LABEL_PROJECT + '=PROJECTNAME',
'--label', RIPTIDE_DOCKER_LABEL_SERVICE + '=SERVICENAME',
'--label', RIPTIDE_DOCKER_LABEL_MAIN + '=1',
'-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
return_value=EADMOCK)
@mock.patch('platform.system', return_value='Linux')
@mock.patch('riptide_engine_docker.container_builder.getuid', return_value=9898)
@mock.patch('riptide_engine_docker.container_builder.getgid', return_value=8989)
@mock.patch('riptide_engine_docker.container_builder.get_localhost_hosts', return_value=GET_LOCALHOSTS_HOSTS_RETURN)
def test_init_from_service_no_current_user_root(self, *args, **kwargs):
self.maxDiff = None
service_stub = YamlConfigDocumentStub({
'$name': 'SERVICENAME',
'roles': ['main'],
'run_as_current_user': False,
'dont_create_user': False
})
config_stub = YamlConfigDocumentStub({
'performance': {
'dont_sync_named_volumes_with_host': False,
'dont_sync_unimportant_src': False
}
})
service_stub.collect_ports = MagicMock(return_value={})
service_stub.collect_volumes = MagicMock(return_value={})
service_stub.collect_environment = MagicMock(return_value={})
service_stub.get_project = MagicMock(return_value=ProjectStub({
'name': 'PROJECTNAME'
}, parent=config_stub))
image_config_mock = {'Entrypoint': '', 'User': ''}
self.fix.init_from_service(service_stub, image_config_mock)
expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)
# Test API build
self.expected_api_base.update({
'user': 0,
'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
'mounts': [
Mount(
target=ENTRYPOINT_CONTAINER_PATH,
source=expected_entrypoint_host_path,
type='bind',
read_only=True,
consistency='delegated'
)
],
'environment': {
EENV_ORIGINAL_ENTRYPOINT: '',
EENV_USER: '9898',
EENV_GROUP: '8989',
EENV_ON_LINUX: '1',
EENV_HOST_SYSTEM_HOSTNAMES: ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
EENV_OVERLAY_TARGETS: ''
},
'labels': {
RIPTIDE_DOCKER_LABEL_IS_RIPTIDE: '1',
RIPTIDE_DOCKER_LABEL_MAIN: '1',
RIPTIDE_DOCKER_LABEL_PROJECT: 'PROJECTNAME',
RIPTIDE_DOCKER_LABEL_SERVICE: 'SERVICENAME'
}
})
actual_api = self.fix.build_docker_api()
self.assertDictEqual(actual_api, self.expected_api_base)
# Test CLI build
expected_cli = self.expected_cli_base + [
'--entrypoint', ENTRYPOINT_CONTAINER_PATH,
'-u', '0',
'-e', EENV_ON_LINUX + '=1',
'-e', EENV_ORIGINAL_ENTRYPOINT + '=',
'-e', EENV_HOST_SYSTEM_HOSTNAMES + '=' + ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
'-e', EENV_OVERLAY_TARGETS + '=',
'-e', EENV_USER + '=9898',
'-e', EENV_GROUP + '=8989',
'--label', RIPTIDE_DOCKER_LABEL_IS_RIPTIDE + '=1',
'--label', RIPTIDE_DOCKER_LABEL_PROJECT + '=PROJECTNAME',
'--label', RIPTIDE_DOCKER_LABEL_SERVICE + '=SERVICENAME',
'--label', RIPTIDE_DOCKER_LABEL_MAIN + '=1',
'-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
IMAGE_NAME, COMMAND
]
actual_cli = self.fix.build_docker_cli()
self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
                return_value=EADMOCK)
    @mock.patch('platform.system', return_value='Linux')
    @mock.patch('riptide_engine_docker.container_builder.getuid', return_value=9898)
    @mock.patch('riptide_engine_docker.container_builder.getgid', return_value=8989)
    @mock.patch('riptide_engine_docker.container_builder.get_localhost_hosts', return_value=GET_LOCALHOSTS_HOSTS_RETURN)
    def test_init_from_service_no_current_user_dont_create(self, *args, **kwargs):
        self.maxDiff = None
        service_stub = YamlConfigDocumentStub({
            '$name': 'SERVICENAME',
            'roles': ['main'],
            'run_as_current_user': False,
            'dont_create_user': True
        })
        config_stub = YamlConfigDocumentStub({
            'performance': {
                'dont_sync_named_volumes_with_host': False,
                'dont_sync_unimportant_src': False
            }
        })
        service_stub.collect_ports = MagicMock(return_value={})
        service_stub.collect_volumes = MagicMock(return_value={})
        service_stub.collect_environment = MagicMock(return_value={})
        service_stub.get_project = MagicMock(return_value=ProjectStub({
            'name': 'PROJECTNAME'
        }, parent=config_stub))
        image_config_mock = {'Entrypoint': '', 'User': ''}
        self.fix.init_from_service(service_stub, image_config_mock)
        expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)

        # Test API build
        self.expected_api_base.update({
            'user': 0,
            'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
            'mounts': [
                Mount(
                    target=ENTRYPOINT_CONTAINER_PATH,
                    source=expected_entrypoint_host_path,
                    type='bind',
                    read_only=True,
                    consistency='delegated'
                )
            ],
            'environment': {
                EENV_ORIGINAL_ENTRYPOINT: '',
                EENV_ON_LINUX: '1',
                EENV_HOST_SYSTEM_HOSTNAMES: ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
                EENV_OVERLAY_TARGETS: ''
            },
            'labels': {
                RIPTIDE_DOCKER_LABEL_IS_RIPTIDE: '1',
                RIPTIDE_DOCKER_LABEL_MAIN: '1',
                RIPTIDE_DOCKER_LABEL_PROJECT: 'PROJECTNAME',
                RIPTIDE_DOCKER_LABEL_SERVICE: 'SERVICENAME'
            }
        })
        actual_api = self.fix.build_docker_api()
        self.assertDictEqual(actual_api, self.expected_api_base)

        # Test CLI build
        expected_cli = self.expected_cli_base + [
            '--entrypoint', ENTRYPOINT_CONTAINER_PATH,
            '-u', '0',
            '-e', EENV_ON_LINUX + '=1',
            '-e', EENV_ORIGINAL_ENTRYPOINT + '=',
            '-e', EENV_HOST_SYSTEM_HOSTNAMES + '=' + ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
            '-e', EENV_OVERLAY_TARGETS + '=',
            '--label', RIPTIDE_DOCKER_LABEL_IS_RIPTIDE + '=1',
            '--label', RIPTIDE_DOCKER_LABEL_PROJECT + '=PROJECTNAME',
            '--label', RIPTIDE_DOCKER_LABEL_SERVICE + '=SERVICENAME',
            '--label', RIPTIDE_DOCKER_LABEL_MAIN + '=1',
            '-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
            IMAGE_NAME, COMMAND
        ]
        actual_cli = self.fix.build_docker_cli()
        self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
                return_value=EADMOCK)
    @mock.patch('platform.system', return_value='Linux')
    @mock.patch('riptide_engine_docker.container_builder.getuid', return_value=9898)
    @mock.patch('riptide_engine_docker.container_builder.getgid', return_value=8989)
    @mock.patch('riptide_engine_docker.container_builder.get_localhost_hosts', return_value=GET_LOCALHOSTS_HOSTS_RETURN)
    def test_init_from_service_named_volume_perf_options(self, *args, **kwargs):
        self.maxDiff = None
        service_stub = YamlConfigDocumentStub({
            '$name': 'SERVICENAME',
            'roles': [],
            'run_as_current_user': True,
            'dont_create_user': False
        })
        config_stub = YamlConfigDocumentStub({
            'performance': {
                # ENABLED FOR THIS TEST:
                'dont_sync_named_volumes_with_host': True,
                'dont_sync_unimportant_src': False
            }
        })
        service_stub.collect_ports = MagicMock(return_value={})
        service_stub.collect_volumes = MagicMock(return_value={
            'host1': {'bind': 'bind1', 'mode': 'ro', 'name': 'namedvolume'},
            'host2': {'bind': 'bind2', 'mode': 'rw'},
        })
        service_stub.collect_environment = MagicMock(return_value={})
        service_stub.get_project = MagicMock(return_value=ProjectStub({
            'name': 'PROJECTNAME'
        }, parent=config_stub))
        image_config_mock = {'Entrypoint': '', 'User': '12345'}
        self.fix.init_from_service(service_stub, image_config_mock)
        expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)

        # Test API build
        self.expected_api_base.update({
            'user': 0,
            'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
            'ports': {},
            'mounts': [
                Mount(
                    target=ENTRYPOINT_CONTAINER_PATH,
                    source=expected_entrypoint_host_path,
                    type='bind',
                    read_only=True,
                    consistency='delegated'
                ),
                Mount(
                    target='bind1',
                    source='riptide__namedvolume',
                    type='volume',
                    read_only=True,
                    labels={'riptide': '1'}
                ),
                Mount(
                    target='bind2',
                    source='host2',
                    type='bind',
                    read_only=False,
                    consistency='delegated'
                )
            ],
            'environment': {
                EENV_ORIGINAL_ENTRYPOINT: '',
                EENV_USER: '9898',
                EENV_GROUP: '8989',
                EENV_RUN_MAIN_CMD_AS_USER: 'yes',
                EENV_ON_LINUX: '1',
                EENV_HOST_SYSTEM_HOSTNAMES: ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
                EENV_OVERLAY_TARGETS: '',
                EENV_NAMED_VOLUMES: 'bind1'
            },
            'labels': {
                RIPTIDE_DOCKER_LABEL_IS_RIPTIDE: '1',
                RIPTIDE_DOCKER_LABEL_MAIN: '0',
                RIPTIDE_DOCKER_LABEL_PROJECT: 'PROJECTNAME',
                RIPTIDE_DOCKER_LABEL_SERVICE: 'SERVICENAME'
            }
        })
        actual_api = self.fix.build_docker_api()
        self.assertDictEqual(actual_api, self.expected_api_base)

        # Test CLI build
        expected_cli = self.expected_cli_base + [
            '--entrypoint', ENTRYPOINT_CONTAINER_PATH,
            '-u', '0',
            '-e', EENV_ON_LINUX + '=1',
            '-e', EENV_ORIGINAL_ENTRYPOINT + '=',
            '-e', EENV_HOST_SYSTEM_HOSTNAMES + '=' + ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
            '-e', EENV_OVERLAY_TARGETS + '=',
            '-e', EENV_USER + '=9898',
            '-e', EENV_GROUP + '=8989',
            '-e', EENV_RUN_MAIN_CMD_AS_USER + '=yes',
            '-e', EENV_NAMED_VOLUMES + '=bind1',
            '--label', RIPTIDE_DOCKER_LABEL_IS_RIPTIDE + '=1',
            '--label', RIPTIDE_DOCKER_LABEL_PROJECT + '=PROJECTNAME',
            '--label', RIPTIDE_DOCKER_LABEL_SERVICE + '=SERVICENAME',
            '--label', RIPTIDE_DOCKER_LABEL_MAIN + '=0',
            '-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
            '--mount', 'type=volume,target=bind1,src=riptide__namedvolume,ro=1,volume-label=riptide=1',
            '-v', 'host2:bind2:rw',
            IMAGE_NAME, COMMAND
        ]
        actual_cli = self.fix.build_docker_cli()
        self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.find_open_port_starting_at',
                return_value=9876)
    def test_service_add_main_port(self, find_open_port_starting_at_mock: Mock):
        service_stub = YamlConfigDocumentStub({
            'port': 4536
        })
        self.fix.service_add_main_port(service_stub)
        find_open_port_starting_at_mock.assert_called_once_with(DOCKER_ENGINE_HTTP_PORT_BND_START)

        # Test API build
        self.expected_api_base.update({
            'ports': {
                4536: 9876
            },
            'labels': {
                RIPTIDE_DOCKER_LABEL_IS_RIPTIDE: '1',
                RIPTIDE_DOCKER_LABEL_HTTP_PORT: '9876',
            }
        })
        actual_api = self.fix.build_docker_api()
        self.assertDictEqual(actual_api, self.expected_api_base)

        # Test CLI build
        expected_cli = self.expected_cli_base + [
            '-e', EENV_ON_LINUX + '=1',
            '--label', RIPTIDE_DOCKER_LABEL_IS_RIPTIDE + '=1',
            '--label', RIPTIDE_DOCKER_LABEL_HTTP_PORT + '=9876',
            '-p', '9876:4536',
            IMAGE_NAME, COMMAND
        ]
        actual_cli = self.fix.build_docker_cli()
        self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
                return_value=EADMOCK)
    @mock.patch('platform.system', return_value='Linux')
    @mock.patch('riptide_engine_docker.container_builder.get_localhost_hosts', return_value=GET_LOCALHOSTS_HOSTS_RETURN)
    def test_init_from_command(self, *args, **kwargs):
        self.maxDiff = None
        config_stub = YamlConfigDocumentStub({
            'performance': {
                'dont_sync_named_volumes_with_host': False,
                'dont_sync_unimportant_src': False
            }
        })
        project_stub = YamlConfigDocumentStub({}, parent=config_stub)
        command_stub = YamlConfigDocumentStub({
            '$name': 'COMMANDNAME'
        })
        command_stub.get_project = MagicMock(return_value=project_stub)
        command_stub.collect_volumes = MagicMock(return_value={
            'host1': {'bind': 'bind1', 'mode': 'ro', 'name': 'namedvolume'},
            'host2': {'bind': 'bind2', 'mode': 'rw'},
        })
        command_stub.collect_environment = MagicMock(return_value={
            'key1': 'value1',
            'key2': 'value2'
        })
        image_config_mock = {'Entrypoint': '', 'User': '12345'}
        self.fix.init_from_command(command_stub, image_config_mock)
        expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)

        # Test API build
        self.expected_api_base.update({
            'user': 0,
            'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
            'mounts': [
                Mount(
                    target=ENTRYPOINT_CONTAINER_PATH,
                    source=expected_entrypoint_host_path,
                    type='bind',
                    read_only=True,
                    consistency='delegated'
                ),
                Mount(
                    target='bind1',
                    source='host1',
                    type='bind',
                    read_only=True,
                    consistency='delegated'
                ),
                Mount(
                    target='bind2',
                    source='host2',
                    type='bind',
                    read_only=False,
                    consistency='delegated'
                )
            ],
            'environment': {
                EENV_ORIGINAL_ENTRYPOINT: '',
                'key1': 'value1',
                'key2': 'value2',
                EENV_ON_LINUX: '1',
                EENV_HOST_SYSTEM_HOSTNAMES: ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
                EENV_OVERLAY_TARGETS: ''
            }
        })
        actual_api = self.fix.build_docker_api()
        self.assertDictEqual(actual_api, self.expected_api_base)

        # Test CLI build
        expected_cli = self.expected_cli_base + [
            '--entrypoint', ENTRYPOINT_CONTAINER_PATH,
            '-u', '0',
            '-e', EENV_ON_LINUX + '=1',
            '-e', EENV_ORIGINAL_ENTRYPOINT + '=',
            '-e', EENV_HOST_SYSTEM_HOSTNAMES + '=' + ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
            '-e', 'key1=value1',
            '-e', 'key2=value2',
            '-e', EENV_OVERLAY_TARGETS + '=',
            '--label', RIPTIDE_DOCKER_LABEL_IS_RIPTIDE + '=1',
            '-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
            '-v', 'host1:bind1:ro',
            '-v', 'host2:bind2:rw',
            IMAGE_NAME, COMMAND
        ]
        actual_cli = self.fix.build_docker_cli()
        self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
                return_value=EADMOCK)
    @mock.patch('platform.system', return_value='Linux')
    @mock.patch('riptide_engine_docker.container_builder.get_localhost_hosts', return_value=GET_LOCALHOSTS_HOSTS_RETURN)
    def test_init_from_command_named_volume_perf_options(self, *args, **kwargs):
        self.maxDiff = None
        config_stub = YamlConfigDocumentStub({
            'performance': {
                # ENABLED FOR THIS TEST:
                'dont_sync_named_volumes_with_host': True,
                'dont_sync_unimportant_src': False
            }
        })
        project_stub = YamlConfigDocumentStub({}, parent=config_stub)
        command_stub = YamlConfigDocumentStub({
            '$name': 'COMMANDNAME'
        })
        command_stub.get_project = MagicMock(return_value=project_stub)
        command_stub.collect_volumes = MagicMock(return_value={
            'host1': {'bind': 'bind1', 'mode': 'ro', 'name': 'namedvolume'},
            'host2': {'bind': 'bind2', 'mode': 'rw'},
        })
        command_stub.collect_environment = MagicMock(return_value={})
        image_config_mock = {'Entrypoint': '', 'User': '12345'}
        self.fix.init_from_command(command_stub, image_config_mock)
        expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)

        # Test API build
        self.expected_api_base.update({
            'user': 0,
            'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
            'mounts': [
                Mount(
                    target=ENTRYPOINT_CONTAINER_PATH,
                    source=expected_entrypoint_host_path,
                    type='bind',
                    read_only=True,
                    consistency='delegated'
                ),
                Mount(
                    target='bind1',
                    source='riptide__namedvolume',
                    type='volume',
                    read_only=True,
                    labels={'riptide': '1'}
                ),
                Mount(
                    target='bind2',
                    source='host2',
                    type='bind',
                    read_only=False,
                    consistency='delegated'
                )
            ],
            'environment': {
                EENV_ORIGINAL_ENTRYPOINT: '',
                EENV_ON_LINUX: '1',
                EENV_HOST_SYSTEM_HOSTNAMES: ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
                EENV_NAMED_VOLUMES: 'bind1',
                EENV_OVERLAY_TARGETS: ''
            }
        })
        actual_api = self.fix.build_docker_api()
        self.assertDictEqual(actual_api, self.expected_api_base)

        # Test CLI build
        expected_cli = self.expected_cli_base + [
            '--entrypoint', ENTRYPOINT_CONTAINER_PATH,
            '-u', '0',
            '-e', EENV_ON_LINUX + '=1',
            '-e', EENV_ORIGINAL_ENTRYPOINT + '=',
            '-e', EENV_HOST_SYSTEM_HOSTNAMES + '=' + ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
            '-e', EENV_OVERLAY_TARGETS + '=',
            '-e', EENV_NAMED_VOLUMES + '=bind1',
            '--label', RIPTIDE_DOCKER_LABEL_IS_RIPTIDE + '=1',
            '-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
            '--mount', 'type=volume,target=bind1,src=riptide__namedvolume,ro=1,volume-label=riptide=1',
            '-v', 'host2:bind2:rw',
            IMAGE_NAME, COMMAND
        ]
        actual_cli = self.fix.build_docker_cli()
        self.assertListEqual(actual_cli, expected_cli)

    @mock.patch('riptide_engine_docker.container_builder.riptide_engine_docker_assets_dir',
                return_value=EADMOCK)
    @mock.patch('platform.system', return_value='Linux')
    @mock.patch('riptide_engine_docker.container_builder.get_localhost_hosts', return_value=GET_LOCALHOSTS_HOSTS_RETURN)
    def test_init_from_command_unimportant_paths(self, *args, **kwargs):
        self.maxDiff = None
        config_stub = YamlConfigDocumentStub({
            'performance': {
                'dont_sync_named_volumes_with_host': False,
                # ENABLED FOR THIS TEST:
                'dont_sync_unimportant_src': True
            }
        })
        project_stub = YamlConfigDocumentStub({}, parent=config_stub)
        app_stub = YamlConfigDocumentStub({
            'unimportant_paths': [
                'unimportant_1', 'unimportant_2/subpath'
            ]
        }, parent=project_stub)
        command_stub = YamlConfigDocumentStub({
            '$name': 'COMMANDNAME'
        })
        command_stub.get_project = MagicMock(return_value=project_stub)
        command_stub.parent = MagicMock(return_value=app_stub)
        command_stub.collect_volumes = MagicMock(return_value={})
        command_stub.collect_environment = MagicMock(return_value={})
        image_config_mock = {'Entrypoint': '', 'User': '12345'}
        self.fix.init_from_command(command_stub, image_config_mock)
        expected_entrypoint_host_path = os.path.join(EADMOCK, ENTRYPOINT_SH)

        # Test API build
        self.expected_api_base.update({
            'cap_add': ['SYS_ADMIN'],
            'security_opt': ['apparmor:unconfined'],
            'user': 0,
            'entrypoint': [ENTRYPOINT_CONTAINER_PATH],
            'mounts': [
                Mount(
                    target=ENTRYPOINT_CONTAINER_PATH,
                    source=expected_entrypoint_host_path,
                    type='bind',
                    read_only=True,
                    consistency='delegated'
                )
            ],
            'environment': {
                EENV_ORIGINAL_ENTRYPOINT: '',
                EENV_ON_LINUX: '1',
                EENV_HOST_SYSTEM_HOSTNAMES: ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
                EENV_OVERLAY_TARGETS: '/src/unimportant_1:/src/unimportant_2/subpath'
            }
        })
        actual_api = self.fix.build_docker_api()
        self.assertDictEqual(actual_api, self.expected_api_base)

        # Test CLI build
        expected_cli = self.expected_cli_base + [
            '--entrypoint', ENTRYPOINT_CONTAINER_PATH,
            '-u', '0',
            '-e', EENV_ON_LINUX + '=1',
            '-e', EENV_ORIGINAL_ENTRYPOINT + '=',
            '-e', EENV_HOST_SYSTEM_HOSTNAMES + '=' + ' '.join(GET_LOCALHOSTS_HOSTS_RETURN),
            '-e', EENV_OVERLAY_TARGETS + '=/src/unimportant_1:/src/unimportant_2/subpath',
            '--label', RIPTIDE_DOCKER_LABEL_IS_RIPTIDE + '=1',
            '-v', expected_entrypoint_host_path + ':' + ENTRYPOINT_CONTAINER_PATH + ':ro',
            '--cap-add=SYS_ADMIN',
            '--security-opt', 'apparmor:unconfined',
            IMAGE_NAME, COMMAND
        ]
        actual_cli = self.fix.build_docker_cli()
        self.assertListEqual(actual_cli, expected_cli)
| 40.156098 | 129 | 0.572366 | 5,816 | 57,624 | 5.283012 | 0.04075 | 0.036712 | 0.036321 | 0.038339 | 0.935755 | 0.923355 | 0.90028 | 0.892534 | 0.867767 | 0.862169 | 0 | 0.015262 | 0.31322 | 57,624 | 1,434 | 130 | 40.1841 | 0.761137 | 0.017979 | 0 | 0.796926 | 0 | 0.003236 | 0.155331 | 0.058247 | 0 | 0 | 0 | 0 | 0.050971 | 1 | 0.02589 | false | 0 | 0.020227 | 0 | 0.046926 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f3b8a50e7789b67062f7ecf4a77a60cab440e1f6 | 216 | py | Python | pvanalytics/util/__init__.py | kanderso-nrel/pvanalytics | 27ea3fdddaf0e885cce8b56438256b7e51e9bdea | [
"MIT",
"BSD-3-Clause"
] | 49 | 2020-02-19T19:18:27.000Z | 2022-03-26T00:12:48.000Z | pvanalytics/util/__init__.py | kanderso-nrel/pvanalytics | 27ea3fdddaf0e885cce8b56438256b7e51e9bdea | [
"MIT",
"BSD-3-Clause"
] | 96 | 2020-02-20T15:02:11.000Z | 2022-03-22T22:51:15.000Z | pvanalytics/util/__init__.py | kanderso-nrel/pvanalytics | 27ea3fdddaf0e885cce8b56438256b7e51e9bdea | [
"MIT",
"BSD-3-Clause"
] | 20 | 2020-02-18T21:40:13.000Z | 2022-02-22T15:50:23.000Z | from pvanalytics.util import _fit # noqa: F401
from pvanalytics.util import _group # noqa: F401
from pvanalytics.util._functions import freq_to_timedelta # noqa: F401
| 54 | 71 | 0.62963 | 24 | 216 | 5.458333 | 0.5 | 0.343511 | 0.435115 | 0.381679 | 0.412214 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061644 | 0.324074 | 216 | 3 | 72 | 72 | 0.835616 | 0.148148 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
45e84d8e9d77efbfabf2947c66c6be57bcd4693d | 13,522 | py | Python | tests/max_payment_determiner_test.py | phillipgreenii/loan_payoff_tools | 4ffb8a83f7fe6bf7eb37eb7165b3959422d3a515 | [
"MIT"
] | null | null | null | tests/max_payment_determiner_test.py | phillipgreenii/loan_payoff_tools | 4ffb8a83f7fe6bf7eb37eb7165b3959422d3a515 | [
"MIT"
] | 3 | 2015-05-03T02:16:49.000Z | 2015-05-08T21:25:01.000Z | tests/max_payment_determiner_test.py | phillipgreenii/loan_payoff_tools | 4ffb8a83f7fe6bf7eb37eb7165b3959422d3a515 | [
"MIT"
] | null | null | null | '''
loan_payoff_tools: Test module.
Meant for use with py.test.
Write each test as a function named test_<something>.
Read more here: http://pytest.org/
Copyright 2014, Phillip Green II
Licensed under MIT
'''
import unittest
from datetime import date
from loan_payoff_tools.payment_manager import Account
from loan_payoff_tools.max_payment_determiner import ConstantMaxPaymentDeterminer
from loan_payoff_tools.max_payment_determiner import MinimumMaxPaymentDeterminer
from loan_payoff_tools.max_payment_determiner import AnnualRaiseMaxPaymentDeterminer
from loan_payoff_tools.max_payment_determiner import MinimumAnnualRaiseMaxPaymentDeterminer
from loan_payoff_tools.max_payment_determiner import AnnualRaiseAndBonusMaxPaymentDeterminer
from loan_payoff_tools.max_payment_determiner import MinimumAnnualRaiseAndBonusMaxPaymentDeterminer
from loan_payoff_tools.money import Money


class ConstantMaxPaymentDeterminerTestCase(unittest.TestCase):

    def setUp(self):
        self.payment_manager = ConstantMaxPaymentDeterminer(1000)

    def test_id(self):
        self.assertEqual(self.payment_manager.id, 'constant_1000_0')

    def test_determine_max_payment_for_should_return_constant_with_different_dates(self):
        payments_per_year = 12
        expected_max_payment = (Money(1000), Money(0))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 1, 1)), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 6, 1)), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2015, 6, 1)), expected_max_payment)

    def test_determine_max_payment_for_should_return_constant_with_different_payments_per_year(self):
        payment_date = date(2014, 1, 1)
        expected_max_payment = (Money(1000), Money(0))
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), expected_max_payment)


class ConstantMaxPaymentDeterminerWithBonusTestCase(unittest.TestCase):

    def setUp(self):
        self.payment_manager = ConstantMaxPaymentDeterminer(1000, 100)

    def test_id(self):
        self.assertEqual(self.payment_manager.id, 'constant_1000_100')

    def test_determine_max_payment_for_should_return_constant_with_different_dates(self):
        payments_per_year = 12
        expected_max_payment = (Money(1000), Money(100))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 1, 1)), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 6, 1)), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2015, 6, 1)), expected_max_payment)

    def test_determine_max_payment_for_should_return_constant_with_different_payments_per_year(self):
        payment_date = date(2014, 1, 1)
        expected_max_payment = (Money(1000), Money(100))
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), expected_max_payment)


class MinimumMaxPaymentDeterminerTestCase(unittest.TestCase):

    def setUp(self):
        account0 = Account("Bank0", "00", "Joe", 5000, 0.02, 100.00, date(2014, 5, 1))
        account1 = Account("Bank0", "01", "Joe", 6000, 0.04, 300.00, date(2014, 5, 1))
        account2 = Account("Bank1", "00", "Joe", 7000, 0.03, 100.00, date(2014, 5, 1))
        self.payment_manager = MinimumMaxPaymentDeterminer([account0, account1, account2])

    def test_id(self):
        self.assertEqual(self.payment_manager.id, 'constant_500_0')

    def test_determine_max_payment_for_should_return_constant_with_different_dates(self):
        payments_per_year = 12
        expected_max_payment = (Money(500), Money(0))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 1, 1)), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 6, 1)), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2015, 6, 1)), expected_max_payment)

    def test_determine_max_payment_for_should_return_constant_with_different_payments_per_year(self):
        payment_date = date(2014, 1, 1)
        expected_max_payment = (Money(500), Money(0))
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), expected_max_payment)
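The `'constant_500_0'` id in the test above matches the sum of the three stub accounts' minimum payments (100 + 300 + 100). A trivial check of that aggregation — how the determiner actually reads the accounts is not shown here, so this is arithmetic only:

```python
# minimum payments of the three Account stubs above
minimum_payments = [100.00, 300.00, 100.00]

# the determiner's constant cap is their sum, matching 'constant_500_0'
combined_minimum = sum(minimum_payments)
print(combined_minimum)  # 500.0
```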


class AnnualRaiseMaxPaymentDeterminerTestCase(unittest.TestCase):

    def setUp(self):
        self.payment_manager = AnnualRaiseMaxPaymentDeterminer(100000, 0.05, date(2013, 5, 1), 1000)

    def test_id(self):
        self.assertEqual(self.payment_manager.id, 'annual_raise_100000_5_1000')

    def test_determine_max_payment_for_should_return_correct_values_when_different_dates(self):
        payments_per_year = 12
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 1, 1)), (Money(1000.00), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 6, 1)), (Money(1312.50), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2015, 6, 1)), (Money(1625.00), Money(0)))

    def test_determine_max_payment_for_should_return_correct_values_when_payments_per_year_when_no_raise(self):
        payment_date = date(2014, 1, 1)
        expected_max_payment = (Money(1000), Money(0))
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), expected_max_payment)

    def test_determine_max_payment_for_should_return_correct_values_when_payments_per_year_when_raise(self):
        payment_date = date(2014, 10, 1)
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), (Money(4750.00), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), (Money(1937.50), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), (Money(1312.50), Money(0)))
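The expected values in the annual-raise tests are mutually consistent with a simple model: each payment is the base amount plus a fixed share of the accumulated, non-compounding raises, divided by the payments per year. The 75% share is inferred from the test data (e.g. 1000 + 0.75 × 100000 × 0.05 / 12 = 1312.50), not from any documented constant, so treat this sketch as a consistency check only:

```python
def expected_max_payment(base, salary, raise_rate, raises_so_far,
                         payments_per_year, share=0.75):
    """Reproduce the test expectations under the inferred 75%-of-raise model."""
    extra = share * salary * raise_rate * raises_so_far / payments_per_year
    return round(base + extra, 2)


# one raise since 2013-05-01, monthly payments -> 1312.50
print(expected_max_payment(1000, 100000, 0.05, 1, 12))

# the annual-bonus expectation (7875.00) fits the same 75% share applied
# to a 10% bonus on the once-raised salary of 105000
print(round(0.75 * 0.10 * 105000, 2))
```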


class MinimumAnnualRaiseMaxPaymentDeterminerTestCase(unittest.TestCase):

    def setUp(self):
        account0 = Account("Bank0", "00", "Joe", 5000, 0.02, 100.00, date(2014, 5, 1))
        account1 = Account("Bank0", "01", "Joe", 6000, 0.04, 300.00, date(2014, 5, 1))
        account2 = Account("Bank1", "00", "Joe", 7000, 0.03, 100.00, date(2014, 5, 1))
        self.payment_manager = MinimumAnnualRaiseMaxPaymentDeterminer(100000, 0.05, date(2013, 5, 1), [account0, account1, account2])

    def test_id(self):
        self.assertEqual(self.payment_manager.id, 'annual_raise_100000_5_500')

    def test_determine_max_payment_for_should_return_correct_values_when_different_dates(self):
        payments_per_year = 12
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 1, 1)), (Money(500.00), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 6, 1)), (Money(812.50), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2015, 6, 1)), (Money(1125.00), Money(0)))

    def test_determine_max_payment_for_should_return_correct_values_when_payments_per_year_when_no_raise(self):
        payment_date = date(2014, 1, 1)
        expected_max_payment = (Money(500), Money(0))
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), expected_max_payment)

    def test_determine_max_payment_for_should_return_correct_values_when_payments_per_year_when_raise(self):
        payment_date = date(2014, 10, 1)
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), (Money(4250.00), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), (Money(1437.50), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), (Money(812.50), Money(0)))


class AnnualRaiseAndBonusMaxPaymentDeterminerTestCase(unittest.TestCase):

    def setUp(self):
        self.payment_manager = AnnualRaiseAndBonusMaxPaymentDeterminer(100000, 0.05, 0.10, date(2013, 5, 1), 1000)

    def test_id(self):
        self.assertEqual(self.payment_manager.id, 'annual_raise_and_bonus_100000_5_10_1000')

    def test_determine_max_payment_for_should_return_correct_values_when_different_dates(self):
        payments_per_year = 12
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 1, 1)), (Money(1000.00), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 6, 1)), (Money(1312.50), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2015, 6, 1)), (Money(1625.00), Money(0)))

    def test_determine_max_payment_for_should_return_correct_values_when_payments_per_year_when_no_raise(self):
        payment_date = date(2014, 1, 1)
        expected_max_payment = (Money(1000), Money(0))
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), expected_max_payment)

    def test_determine_max_payment_for_should_return_correct_values_when_payments_per_year_when_raise(self):
        payment_date = date(2014, 10, 1)
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), (Money(4750.00), Money(7875.00)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), (Money(1937.50), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), (Money(1312.50), Money(0)))


class MinimumAnnualRaiseAndBonusMaxPaymentDeterminerTestCase(unittest.TestCase):

    def setUp(self):
        account0 = Account("Bank0", "00", "Joe", 5000, 0.02, 100.00, date(2014, 5, 1))
        account1 = Account("Bank0", "01", "Joe", 6000, 0.04, 300.00, date(2014, 5, 1))
        account2 = Account("Bank1", "00", "Joe", 7000, 0.03, 100.00, date(2014, 5, 1))
        self.payment_manager = MinimumAnnualRaiseAndBonusMaxPaymentDeterminer(100000, 0.05, 0.10, date(2013, 5, 1), [account0, account1, account2])

    def test_id(self):
        self.assertEqual(self.payment_manager.id, 'annual_raise_and_bonus_100000_5_10_500')

    def test_determine_max_payment_for_should_return_correct_values_when_different_dates(self):
        payments_per_year = 12
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 1, 1)), (Money(500.00), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2014, 6, 1)), (Money(812.50), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(payments_per_year, date(2015, 6, 1)), (Money(1125.00), Money(0)))

    def test_determine_max_payment_for_should_return_correct_values_when_payments_per_year_when_no_raise(self):
        payment_date = date(2014, 1, 1)
        expected_max_payment = (Money(500), Money(0))
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), expected_max_payment)
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), expected_max_payment)

    def test_determine_max_payment_for_should_return_correct_values_when_payments_per_year_when_raise(self):
        payment_date = date(2014, 10, 1)
        self.assertEqual(self.payment_manager.determine_max_payment_for(1, payment_date), (Money(4250.00), Money(7875.00)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(4, payment_date), (Money(1437.50), Money(0)))
        self.assertEqual(self.payment_manager.determine_max_payment_for(12, payment_date), (Money(812.50), Money(0)))


if __name__ == '__main__':
    unittest.main()
# HackerRank/Real_Data_April_2016/Segment_Twitter_Hashtags.py
import zlib
import binascii
import nltk
# `compressed` is a very long hex-encoded, zlib-compressed data blob;
# the "789c" prefix is the standard zlib stream header. The blob is
# truncated in this copy of the file.
compressed = "789c..."
30e177fd2c1a2467f5bebb552652a456e1d85895c4a4d9dc643d934b64ede1f8d4636cce99477ca672a2084aaef4953453a54d8a42b5d719740fd117d22bd4c9cc15aaa3f529c1c4f225cb10a22d5720990d8842bda61a1c3b9d42759ca7491008ea61a8f40ffd3d5580350e0afafc503b4eac5dfdba9c0300e6f56c8c2939a6aeca19ede51c0fb3579d809eeb4e570e09ea17493575de48e11fd062d31966ce5024941534bb29b2fd324ea40a8095a9ca782b782ab503ff8afac2b5af35c647f393f49dc4cd75b2300e5282410f2fbbb22c7e8e20d4705a77a4f52de8db1778b383151e446bb11c5b4e7cb024539ffd73b6e74fee66ecab6edfdee76b65f8a6428ebcb240d94590edcb078d1e646e63073de23ac33fa4416f716d0ca520c6970bbdd03c669c83f15b49df4e109f3ece792a7adefb59e1be9c0770b45e873a9fe8dd33f6451d77e93de0ead6ecaef08d32c1c5623d425b123fb384decde5ae498c2bac13d9a789a270cd8c6e79a834248f3cd71a4835a8ac701c634af80a0debdb9c8a1e1f32efaf858aa0624aad7687820653e8443688f4726a3e518a6602efedf6451ffaa3ef7216e32fad87bf2f86c82dfbf890a5b93028f17651c1297ce8e5e1c6918b067b794cb8442dc3f885a3e1725de0ba5c3626c269808086a2510ad39e3353206cf4e8df9596c1f42ffecf1529ae0b63722f08fbc7aec66ff7901c04985f24b4d0df94630e726c4293c21bb92b4582efc70e5097d3b8b630a40c9ab7f120157c314760f6a817843b058f333a45d91c4c1f798c595412aabbfad05e63c5511555a1f511dbadae4cbaf794ea0e1d430a3450c76a45d6957bf067d29f8a9bab9f8f440dd554fda9266bc13493fe2feda616141fe2e1a670e9d7e221d4824ef81dfd4b24e9f2df1a817671664b5a7ecf5a80d1b20b24bc70c61b2320069680840ce94f81ea1067f4f388cd3ee1e6d7c9a7a955632dec5c353b601050b512df3cbb8b17677e2fe615b74657f917e55085fada09a21c49644ce827cdb309280ac68ab7c1c351aca37a2d2cedb58471919d57f9484486c17092df5a83264f81d5516d448c6a7b318678a66714ef1fffa468cf7338981c1127f44385274d4d33795420edee9615cdc89481d48439a1184c99859d6fec8f751c16cfacb700064357c3a9865d1add759723d21e562096b93b5bdad455ea80dc07b6b9d36a7049b48735a427eaeea1b94f4383bcadba402f5575d2df9d38044edf133f2b99e99147fd1ccbd0b94198cd4477da8c5a171334d99506ffe9c0228bb9dd88ffbf796c23e7daa1290af5bf29eeefe1449bc64948bdcf3aafd7b5c6f91a28ae1176320c7ad9ab071f33f8c99757754d3229d4e5c79293a89e0ed554a962b4fc25b51ac3ca36692f912066d2
370c1ee22f04cfddfaf16efdbed8cfebd1b5e3aa79cda08297dfe7eb85b27c6a5655465f0e55a1c0f991e2672b9134c2e11e06c1301949764b048c694798ff57d9ea0de92d57622946066cd023a52295139c6c703b03824bea8f28226864197226287dc80f3dc35d87d33eff85fcd854def24017a6ee8d334abe60f10dc1b5601b68bc4f971c512d40fe2b32c8f85760def5d27aec3227adcfb15eb5ef5552ea29929e3a6e02cb744e258415164935308612a34036940c7f733e302326b9c297ace6c30a502994da259615b21f082f9e8981a509ed717ea714b96bc22face7fcdb849fbe8ccfd5cdb0b0a7b6c9faaa27f3d5e326eb0664228dc5448cf7d678a0a9bf89fa668507636956c47250bdb48da70ba7cb5ccca48ba6ebcc7329165786839954e5940abc207066415c8d8d69d624c61a467cd52055034c2f39685630705dbbf00716a59109b29c28b1ffd32399cda6973843a44a0c0ca3a214963eaf9fcb58f26348e81b576e5a2f0faa30e8b8a89b40974247fc9c32588517f43afcb5dacbad02288e845befd4bc295771adc3472e6ee9b8f3d1ce445fb6945e68642faaae1bb23f79022ba396d243ce4bf48c3105550738d8158cc833f0e8ab300f2bf190ae273eeab2986a2363150273aea97add648ce6c7a5b6c10f4c7adb4df0bbbf47438b1d864eac0b80f159f2ae5f66a5e7cc0f67610368be6251b596e59083ece9bad858348faa136b0c3d4543a4f3e8092e348daf614cb76a67724aa7eab924f25268b78e7c521d150427925dc3d8ad565672aefa17c495d0274794b9f0a64a35de17099a4c01f4d8fabb70895d413dbe8eac9ca773621eb42765871267c0d379447562389f734f03f5ada4f8bf31094f30b9a524d972007ee254912d3bdbd3172403f319cdda696bafb79ec96a833f8c796db4fa48021afd2abfafc7f7d66e464baae7c469a106ea17f29c5d0116fbcaa0bc0eb38468775d93ea5a34454a55a7daf39b14d41caa974e8a53e6312ea642ebfd5c43d28404f11736d31f505cae81dfdefcc8e9a4ea6ce50e9fc49fc5d32cac8f59b25600059132bfd8fd173ad8976629a780ae008596b1c8b5afa4d66258a3fd3d19ed49f098fd285c539df2b66413fdc1b46d66884e278ec43009fab89a4ab88209b3a71439f59063ddeb9034f1abdf5111f6e7222e0a55cc4dadd30682a48ad5da2531764b3c92bba7cb675d454901ba511167cb9e79e6596a99a9bbece20ae91f007c95e3ed91626f09b9beb4ee7c6233a73dc7835c4b2591d12e23452d46e566d7ac82a9223edc73431a51c192d421921f02272fad5baac4da92d785dfac9c680ce86c4ba904a408ed22006dae06b3e9a122dfa3149cb05c78f05b3f8c97ff7c031f3b627fb1d8dc960e520db27feffaa6d8
d1ee8df0b8a57e7516e298ae4521e295bdcfd63c0033ae4337b3d064d9eeca98bb662cd77a0241a4bc396ca0ac77a1d86b3f77a7e58cac9ab2c6b699a2fc5365c5c85e25fd6254cfe8f5b45731cd991573e4e1718e0fdb81ec3e9ab202106fc686124fa34a503ceda4ab0dfc5fdb8ce7ec288b6fdf34b7e5f2971cfd40a17165385323d100d1b7d654a7ae350dd22d32205bc9aaf0a3b61d65f2843b528372bd953093292c88f4e42451cfc3be1daca1db3672e0442a191290784e6e2a2a40aea38c21e333be38f679e6e04c289ee27c5d677876ca1b168bed70571da0ac1533a717f9ecee4299ba32e7adb8e8c1010fc0bab0a27193da8fc18567d53386d3ef436ec4570307eb680e6f6a7a0e454f7cc8e2dcb85a08a8e15a145f784ee9c280b83a164265b2487e7d331dabd1f38521d93c3d7e46fe8c61e7443950c9c63afc6e81613a39736d561beeec59d380e73593f6d430cd410b29caa0c980a213bfc75fd65a526c34a589fbc1e7cb88319462a81ae88c4b6d72985d632e96a732c965d13e928d98488f5e4ce055e20a4938356f9c0d88d4c0505f86bcae39f800daa8b56f073348d4c7b7595634db56c18b5d364168e4cd36b6963ac18a0225b705de53f62ee9d152375939dacd2543606d9f0340db025a74c25a1cf4d3d1775c6bb5100fe03ea1f7108bb03d5d27631aa2cd542ad9280fbfeda42476ea7bf6935a299ff2353740f850c95c99a031324af59a1d5f95e321f9c16e3879d07b13f167ce67f8794a937c192247d08dcd464c090f9baade39cf5c61bfee9b2736a3229c7c8febfce3f25fffc7841457a418bbe0f9295caa730c174e950ca7aa245dc8c8537f57e0885b2f72f6ac6c2bda27e85c522aaac37757072ccc5f966e4a8e40222fdbca0fdd3642f0c8f51881955771ab4f327daac490b5f7bb71406fd196912b558d7ffc56d55275574e956fab72f456ce67e434517ec214eb5917da8da0e6a40e330065d31ad8b1d3e53dbafa7e180c624507465ad4d9e87bb9bf6a148cff1e187619366543e4ddd8738b3803027f747c351c4755d9556cb650bc6841ef175779ddd3c932d5937e298d01550f1492da8106c38f5a6a40ed0340c1f3fd949ce5ac56b3c7f35d338f927040d9ab4027e4276c1625d062a8baac0697ec2e6cd8ca24f9e2a9ec00371ee6b7c232984d33cf58ce2a296418b5feee7171919f29b56c9d87213088ac919cd7a96f7b997ad6adb73ca008a920b4d97e8230d75f730969a69f3d163b9fbbb6a3a826378b0e8942aed23f63460f5119982f1c66dc6f978dbe2f0f48be1950a379962618cac01684f509e02bd0846c52104bec51b75d7d6d188f63259dbfb8616a4b2282b6af6ee45e726854b99c6288a980642900833c3507239c5d1a4f9ca02c
2134363b569ed9d02499ea93d2a0f4c22b72312984fad9976116114fa8e1f3adda90e8cd561dae5a403d5264cf9a4d1e726258b5fbf59d27202d677b032cb84297786434f21be41bce33b3518ffadf9a8330a9fb33804802944da790cebd83d62cab05e03cff9e96a75998f24edcd2aff63e47b71aa95bcfe27f1deffaaad8f20c05a5ceda6608bd51e4f22795bdac49e894e64c8a6bb37a92792f0ea0d59e93738c968bfb585227c02292a25e3fe6ee92d6662c18db38959a25262c6c57f53c2e4f4ff117fd77863fbcb2509da1f1b5b9bb9507846094b5d5e6ee5ce92dcb9fa422ea3f9debb84325084a02954a8d32a40e6c760b935c3440c6c44ecb69e61c879c4c5fd9d69af2852a8e276cb7169720cd90c48e0367fa299b9a2cfcb66e17f8b80c6ef0e0c4b68ecb4203c693e32af94ba633cdb88c8feaf965aa4a29559e4c5b224d3c895d6f1f55523ad351d1f8a430a196bcf65172aacfe80a011d50b27a4f5f833e85ed11358560dddc8af3d8b4208e9a474272354ca3fa3a92603999ad9b0e39a49a88add2d0b791bec43de22ba061460ed771ce15c48173433edf1c9d2de5aa1a50e7894b995fa2a3f844c5744a91b03ad339bb71b301b5568d0db9e829a4a36742b0ca891309c75efbc6987ee294ea02e7fceb88992a10c2f0194135491e663d69e3456d87617d5567f3924220e74046e6294925c3088a0af868fee5f85ee2ef4da7a32b33fc95f8e27f96c989003628e9e5fab1e6a056a8d1daa58aeb3a213f45a1063377dd7ca7d37c5536abe12c16f1a10bc58375dc89303961536ccd486c52a8caad126a503818625a56fda3931d007daf2ad83b7c58d5eaa249344c584f2efd435803a64952cf754aa74bee1aded10864cc4158e04462d20a6f82a89b6aa9b6f968c9613f7270716df78873e6ae6028751714a02b914ecc41eb6a452885d9f224b170b45356236f9600b5c84ad501aa34b3843f50362e39a031fb1bcd044f558dddb4e1dd33e0b566a95b2cfc2c13c97aebab66f77418fda860718685808fb7415d481eb0fb15d84c1d9ccbbf6a1f75a4a6dab4a5f2cad7ff281477def1724d4039b43813bd99b9e402aa90a0f4f0eb6f94b58c94eb16194acc5eb3c1e9f076984367da883fdc9c25ca73b83bbc6206d0b124a925e9cae7866f6ba552b89deb2656554e0f61a22fd2a8fd70baab0b10650ee4ba648c4fa7e19633952e7f9115a713448491f7e010dd5b0d931d50741fd075a8a37ddc2db3270a06dc486e5124a00ce58721041e382e1c457c414fc7de8e29cd4a5707a46a6c58edfe574cf151a153121f8241cbec54eaf13b54920992bffe147727f6f19f82ae7678a77a25292b7b5cd87b728f0953951fe831b46791e7c1406babd21dd78a68dc1ae6e69ed3a9
2b86914efed404725ef2a761a07baaf7b77b4ef35b0d26ffb69d7a36c9ae79eb1dc789f58081f3a064ebe429077e4eeebc80f7ccd73ac6f381384575125a298f76f0fc410f00b79e93b252a532a3e1c21c0898134c681ca820c3ecde4d8d2a7890e5e7b03e093c28c2cbe02cf5d15e5faa584470a97445af41ce3c22bf0280a9c4481685134895ad52ea48065d27abbdf5f7ecd426213ad94a224868e0536d0ddfab3a1ad39351daa8a1c0290e0148dc0fa68a2bfaed526bc2a289a35bec04efe1ccc40bf141b70a4ea9a3cd6b366a077d4c29ad64487eb460b299647f489144b5970c1b1795f23e126d4abd59c86c52f9475e1de75b69cac58d91ee2dbc675a27497fe43c9d8eb5299fe2b28807c6bc1b7cecc0acfbd6b81f326b73bde7352b2dc7b6a12f5035e77d34c766bbd4117b67c8a248e5465bc709c264c1579530cb96795ecd2973a09c42752e6526a45322805d22c665d6c2fd7e662794e56f5fd4ca41111d045b7a520ace3591a49758b98b5cb527bcef29ee138f1d3ef88b08afd5fe5c05cc0e49d74c9f507e289966b7a0645a4699bf17f11c73d948769bdbdaa01ce0a631156ab32c9153f02c426be045ce338ed7eb9cb848374336cd33898c04e849c91561e0b80e29c3059c43bca702e9398552bf4061ffad21e686a630cb2f0b4bb41a235aa9bce49a1276d0c92bae22dd8175e06dfed8f37d454a15df49cc5f1243f04e07a52a06fae306346eeb2460c6175e1ce257c23a539ace90e7edb15c9c3a15b6af89a05840d841c143db5e0d4776aacbd27324588399e6728800d262b3151b72de8b3810cccb165ffb6e691e37efa9ec1cc7fa7f4f48c60a35bbcfcc99b654a5a290ded3e615d536c2365536344af0374a6265cf42f8cba0f78482a7de88aa3cac915321ca587a063f6beab29be908030ae61333ce09f95c3e85315287733374c962ada798bc254eb97488abdc1a5a5ecf91cae2dc31944b865c3ee7e1514a2a4236795197b5b73c38a080854c542c36ed0c6c9759b2972cc5effd093fa9ecdd38a5a8824d2f520cf16692122e6a63fb6e4f3783a666134ce9532bc85769044573135894c69aa5de64f7b2d56e048c0abfa80e7c12fac56fe4bc956530b79cf9a54b9697f7a4b1d161a5ccbe9a6bd967e8d07098069fbd73fea75800576b36711e13348d63d932506f1d5f295427b10ac3f16a946e160f62bc549d4d1fbf53e431a2f2c243c6e5d972ec9e751adc0e872c5b834a27560afdbed7d9b96e718c55a96d97a3106fa2b2c732dcea011275b28ae47c6967ceeac7a27bd3a244a9a98d493ef1cc66f841daef38db1e7cf726f5781e0ca8bfe6dc41cb96d17e269c9da486f718cd9c4a781998cb6275682b0a2746fb1e606dd9fcf60b3d03cdb93107b62bf4
526c69ccefb8abdea96c667f8fd0b21497c21262c5889ec7977970e5addb7776dee694a09478a9f2fde07e39ef6196de9eb37b21424c1fb7a793c0d95065f944413198a27adfe0e29656bc8ed2410ae1996d7c278f0870b2f6fda4c9e537fe792cccbf65f4e362351728348aaaeb493967cf680e56ce82c966d23adc83e2b3fb1d2f9ec196cdb539d70fd14d8f27753921872dd7f03d9d86e78fb8a88d4d2de4313444765c6ef4dbbfb0b6f5dbaf2c29586d408a59d2ff8fcfeccc34119da292962f9c6e3da966b4aecd1f7d1260e2b29d48face3c24d9d351c9075e1e82405c970fe36fd017c4c83dfa9c7542cee761202fcf8a52429b1a018b31fe47a36ab0f6eb7e772469b6988b10bb655033bc4a70e05e6ba66ea5840fea49e7f2ad161e241ef92ca29bd1843358568dd8f095ec1fcb94c65bd26152ebd1700ef5cae774c51ad7a73924dd77afd82f0c634ea88dc0057e6dce063233b87f535a98e2a85963f263cce9811ab46620f255b2d649b6acff91134d95d6dfc93ccbd308aa34623a23d9a6a332abdee3e3d746af23fe93dcc064b1484d01002736ba106b3ebc2cc4f1560d8c0c1e069dbdcd0365487566c62715c94f0a3a7d651691c3a0fe80dbd2e420881ce15776b56c5cdce3b85744c0bdf294ae2c8483aaa246733289a0d6a1aa5644498216ce6a305349fb1bd984f71b8be5788799af1bf39c8e1d3b2f060ebd27957e62e0a93b3ca959e376283a6ae4437cfb39f52827600f3923c6b66b46114c330c75ffacf9b0e59156cbff7ef929d343efe11b989684b8052a144a5da1ec9f85384bab2fbd808ea7d2ad3af17880b4f1ceaa72ef1a5895ae718861555ccdf2803764cac6cce43771c8ff2d8c3a50a9ab020cbfdb29c74b8c51a042bfb87bd989e008344d8b489ddeb78945f30ecc991ab0beaacac4e53f0a58ec95f937e788748d8ef72c159c3731c07d464fa002478c0ff9f9fa5a0547e08ff24842d4d7aa91a6b033272489d95ba7a30c95743c5b313ed6f7d860e968ea496b6c5905142184aa2b72f47a83d839704708430a7fc7c7e7c0a2bbbb952cc6a274450192c5a68ecd9d2a83db874dc83da94e37b23c2b6eee2f4b2fa9117ad9bfc85da045ef7149c5b382745874173593bc25f802695eadf32a6467556e27632b97e75c59e61d8832b124666552b9fb51873c8ebd67e8a205b38d0d0417500b1716c978682b5d10371e4d76335f99a98be8f42ae45a93672b4d129d724e4d36f8ee099ee4b04e788b9d758134b503950695f956376ceca2994c1077f0871e7574d2d11a1b80c800093bae336cbe21f5743dfe9c1ab22a5fc94334dc4f21d6a4dd38c449d16c2a3875af92d085fe0a6396a7d7217b71b0a38e7ee50871506d30fce9c4b8b99957b125eaed7a
8feadb99f3c20846f7c424e761789d558353eb9b22ca401bc55cd96814be919badb30d9e2e22a817763c0b069b52e61a7f74a6db644b68eccb0235a7be6a04c7b59a9855387aa034e42033b88794da012c7a4a14039d929fe55eec87ee2b62811be3bb3523c0f5311605e281f7494434b41e53659582de3953ba768fe54e97daa1e849666a9b959a54183f26646f5b866767b741261ee7f49b0c122fdc93835746afa856096909ce4aa21b11d10121605302e5c2ad30d95eb992438c4f6bd79fc82b147c3a116ee5e9d0d7b305ddb35b87617a8ab86699e098c8d8fe9b7ee6e009df13a1a0e5e492f72fa4c9a476c5371e7c70741ee47c6b1c8d5494b3cb118aa33b0947912be2451e0d449bf7441203d03d5e849ecd2280d95a4c5339d516f4fb1823b20e65470c5b4c90a50ecbb6c93eb184d6fdf9f0b03786a1488f87d2f7af7ffbeb3fe510182fa893e581179800edf2dd624deae14692200d2455247db414ec36d432deba52ba8ce7040ba188aa5349d0c9dea90efed4b9bd988775f65be992abc4eeee9156c4cad512125df9c2408f7f750ad1f5ea563a1a57a39b4b9fece3c6af7af52d9bbe3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9be3bd9fe7fdac996b594d114b88d40d9303793fa8abb67e21f746ac877d336ad85e2cbaec9acfa9fed2b7fa727498c6cfd230e25f69f9992dad8ac7dc3cf320781f82d5219427f97dc4dc791df753299d59445dff53206fc1911806b69d9b395982898efe2a6018e10204212c2010e9c2ac7bb78696ae87457eedf8ccbb1656ddb2d361123e82c2552dab4e66f1bcc2c7552a442034dfc06a1f902b39fe3ea3437fda2b97ee28d3c7a11f74d4efeccc02ff1b356066b3a69aaa437e1e9b5540c2e3fe2af4df50a6f6288702ba4e83a9419c3a8a2a57c49c63ce79cfe6d5b9b709e27906aac8ed58c4c857a0570da33f220d5d1b2f920cc176c0ba3ae1a41abefdbeb4876969b1a9536938e8166788f3d6748e08b9aa9f37fb5a531613c512e25f2e67d1505d046e52b33b61f3aaa8c4bf332e229b0c8e350db8884c4ea2af5c66656e47cf06a909603309f83600b3e4f57327357ca2771d1ba883a20246565482761c1610d838d4259ccd46350e7bc3a2d32ab6738da8f973e4279e5cac50f33817116d3a06bc885d88cc6b0dc76b2cf0c3dc9e286cf8f10ccc3219df85507c001e9044d
866dc05f1dc6dc748d9b70484fa698c7027622d76e8dc6212ba0930dcefc6957d81a57b3b2225ca653856b97ddf209c1cfc30d717f0433e37b2ea9603a62cc7e492e4346837b2d675da00e5d9563862b3f27d427fb8cba8dcb18a67971267401f916d35ac4f27658e4fe26128488b33b537ce0d3763ca59b912e4bfa141ce46eba529697f195241d8d8a4bd4c6c5d095991c946530b9944fd98bc701d7833e24d473350f95a6173800ed1c4757139e89ec94196d0d16e19ec897b85a1b01692178ac95f8081b2ebf02909a8f43bb10f4e37f46de481c7450c1a1c465dcc5c529dd6d8886ae72e6b855d7a3217e0b0a19ae6110ded02a5e7675d0d9ece3bc2d0419bf43285f39c8edab1dcec01140fc23fb40bafce27cebac9254ab52d94f2f876bb83053a73ca51ca7d5b8000a8fd90c48555119fc7da1842e507b165766f0abc87f840952330a1184c6e036e4040b27b696a67ed7d0df5645d8dfe2ed69e17c8dcbfe0567d6e4505db278a6f53dfd5cfd1a114ccc5f98613781685eaa3f0e55c1e6c91d7c0d30ef7dded340d297abc44ddcdbc6e78457039a2ae6917ec9cd62d77f6f40bdee430305bb67e62a0ce96472ad6f9445f8894a25b4f14c22eb38d438e0955cbf2c7b447714c6872d314551266bd704449db56064f40afa1e4f33576c020bd831678d0e07335ce33043d3745d3d5c6c73fca9eedbd6e6ad1c502eeb1396f2a7efe06e4abc6aa16d0d0bcd3622389357d05f937786a9ee8dfc15001c5d830fd150c3214ee6542d170db6dedf45c24f816ba63fb913c0905c52291cadcd990bd652f2403bae9b977dc5c95f9be1f5b39c3dca40e1f0406be947f02fa985cafe7ed9d9feea3a6fdb99dec19a1b97b363894ae159b1905d6e121af6f4e5585fb7608c6a0e369d7206f7fc843757fb3268a681e3596cda7986a2262602bfd778fdd2caad6a971aed45aa8fd3a9f4281e0f2a88235a56b5ce6fbe0a281ae709ccec573075d8f782999394ae3ff277254da8b2fd4755a48801eeb0c1d2ac38a913a364f27798d5e678ced582657bc9d1e7c879d1f9e4e4e8d72400fb51a19a3786ae4d709918ee95fce01f8eee3e75839e66f30afc361921a19d73a6779389879c29ee515d98f90e86a87d31b0026284270c2afe192c2ffa4edd4627f42a437ba032d139b6f91785dcd4e027268e676f425c4c95d772acfa36aa6647fe178f95d8935998eb67de6e05c7491ad8395feb81bdbb1e34d735d4dc65b19c33f3987508f98d49e6d746dc98307d264051a9c58cbf676f5f583b22b12203b41cf9c663ffa3c2f04630a0988fb6ae93eb5b8b7ee03ea4f919d345eef32b49188a660c4ac72f0a225193c7553bec5e08cfcf760cab72836923ae8ed83366e0ebf03801e94114f4c6cb46e1
4e09a6cc57ad9825758b74812d73f796ee745bdbf1e3e898fc51cce3ce26031b5fcc90339d2102b68ccc6434c2fb97359b9445101d548e8523a76ba663330182421d4c3aaef600a1824127740329495ec90a9277d625b2252c833eba115a35ddc4cdaadcb705008db20bdd56e1b627f507cb9d4f3f20941d1320fe9386f33df955f6cf66ab19a628e39593997551a70571961efd353bb36af99d023a662fe9a2c36ed2a203a6f2b5436b5ecdecaa8e902f1f9bd7bc9e8dc17ebfbffcdbdfdd18a5663e01d81cad13e1bcea32d5ea7298c2323a38486181ab5b506f34f7fa9e711bb9b4985995cc20df9a4c348bc37b5212ccd8a688052ba34cc98dcde65e2e80ea0ec2c8cd64529a45616c35f203e815963562fbe49c35c0b6a2494a26c2be48f06633244f69aaf4040e7b42e097fdb8161a03d3a3eca58765a0a0d4142b585dff53706a1ca9a5173cffbb1eae3254ba93a8e6e9f5977f880c163f2e334e13a9dd6ee583e015843265662640a736400195dcfb0676ef47a95bd04c67c0e3f6b319d340ecb1679bdd57de7cf6b0ea4499fc99193598e954cca2d0a9eb3efbad7c39966a02d09137981721868828eec697fbadbc61280fb5fd2693ce9fc53cdc3f31744f46aaa09afef43584543345b64e36efa59bbe038877507e41413babecf005ea26a42b080a5ef3a7c43c7c4ced9d928fd77bb9816c1de9e7941f8f2756042302969837f607ea0d89544a2b72a6664f85f2262cca193ed686e3735795950d84aaf25b4431fd9884fb9b00a76ce45581711b1a7139b629c2a59d2c4319b900bd7ba12ff1ac5e2b40a9251199a69d7280ff96f8500e1fe69368d076327745f567b27ca52cbcbaca6ae58ccd7c9d6148b4e12eeae1bb8bdd09bdfe82de9e9165806c343457a9cc285f5678f923bebeca18749fae2ec645b9ede61e9c61b30287a27e70b1405505d546dca7e6245eeeceddc304431fdadfedbd8a04dd2ed6e218231c6967ca297f0bcba063ff97880b89fe22533531f14961b0d3a6f684248907d05bd127aa892bb6bb1a2cf0a44c9a20d6cd03d08adc902097f2421be0c67d97b35aa58c60e68339723e2ab0f786e609712ae92518b988dda4b0a0720ebd04cfd44111b7f0a04b00ea15a129f20fe1f0fb0ce0deed617232da9da22ea7e9f5854a5a4ee44ecda9c39cf7cc223d32693a61baa1b4e00d7655b1c06a24c35bf6ef2440f9dbba9bbf95f90e4c003b2eb0640a2458f6ec32e1ce72fa57c78e91dd57be5723d2942eef0c676559864cb9701d22e799470b445c9de22a02d42abad77e844d25b117486e70f46c6053849480d90ef44ed953d0c7fac1c550b7e5f2720bdc466790d88e3ee4402ba65e43dd7dca9724602f7b34f4c89f7020b52286eaeaa3cba2c8c40972b2
9db40a4c98e98fb229ee50780a7988a8af27d4c68b0dba4de7dd22f81d0ce286c345974cd961b3a575a8823a7061ec12b8257f1f22758e6fef54449d3bc1fbaa3d1d83e4977642967f2549139eab806000a231aefda77a5c150bd9c4212c49e37e2707d942e310ae740ead5e671e452bad7afca88f6ded8c4bbe557c2cb2ba37975b2d5567f09e923dc4afc21a13bdf6afa95d55b95b9245ead923f2ebe614d6428ceca42964132511ad3a3012e31351eec85e994619bd78c78df26a058433fa0457c0043463641e74c16472756a6e89e9261e2a44a4b50c8162c8a7e985988eac49a126d347e0abbfad5eb0a338e73e6741fee589068570161988db8baea0d8cb269ca0642f9032c8066a056907e02c3683f2d9522fac6a263dbfbad114daf46b8b194e0af58b3e3f1d68489461007ec252dbcf46c7c9bdfd1fb594a92351b95e5d82a9c2c14498465248e0f02b39bd49aac5543befdedcfe54dc764f2736c1531cc241f1af3835d3d77b7b73637ebd877ffd5fbf66067642f0ee343112207987b87bf0116fd97aadd8791584ea37a77fef753a57938acd3b731de25479d2070b2aef93f3e98ad012e99156d4505f38de91025a0430c83520f6809a0aaa09a841b885c63510c7d10e47956998175b4ce410814288a03e4929d40f1927b1963bda8a563f08276d0150ae7565e89e5db27cb76c67e4f2bfc073b31a1099efa1ebd2b2a694ef148fa100765cb31b537b4a406e0ad0c9607b2b78503a81dcd955e9042dcdae72754707ed9732d14f8013cef93bbc730ce6fe8bd357947793ace3dd3f0ecea35b5cf315dc2930aa3dcefd3b0694d6f2c8a7a01a37958ab24e99141b02279043e96410ed79ab74a2fddddf99fc00d9a062b58330a91deffbb5728915e215e2b4e41a97dab1077a24f2db91f95a421334d4694704a7541ac94dbf98a4e04d8e0ab82aae456d0e70913d2e06b86024543b654e8107268b4247a89618ed20759ec87f24b7a8e99a45ec3561480fa4dc20a99c59c30245eac87adc9b7782aa9a341588463e052f0ed0dcc0ac0749068dbf3cc65198ba45602a72532e6fcb6e544b022f3a6c374426c9086fb5e95b9d6cc78bdfb7de56bf75b36500dbc8963e8f61ca4b43de47c775981545a1025f32848f85511e47dd130f4ca00b5b72af3f6406c24ddd1d1b2d432a7d9b23b48869043f85fe6f2593ba71c64ac9fca7fb8cb1aad8d7abf698c773c6ca48b226ebde71fc97fd8da184a1b4dcd7532933ba32a2772ad0374a34c2837ce83ed153bb1d1bf1814515e6b813d950d6a0b212fb35a63652762da650476a48a9881419764138e658ce5a308711420ec9a749c9f0b7b2ea60a7f612edaab8ca04db6abd9a4dbf65bd44840b653d2354a5705e493b6ae5adaf826a32f4
15458f92be3a18be8086482c9a9f7e0fb7fbe385b4e6946a181fbd4bfe7ee0bfff0f2240886faa7154e66a65968c653799329e2f6b2354e541642a02d911ec4030c6352f3044ebbd7cd8b918be7223ba6ae10e41b8cc8d091b3767d783526554506eae552b081423d0a83abc3f7bd053532feba27c98ab3f7fde8ef6950c06d43ce4f2618a14fa103675a3c775fec75f502dd0b325087079b8beb795317dcaa7a8efb7e6aa08fa24e2422971e05f8b6a1c7ac2bdb5bfe9fa37001d4ded8cec6bbb3ba441d8b0ed8582915b2513eb729f6a59cfaeafd8c56cc849a26777195f68442c0fc26cc2c4366e45d9dbcbbb032b9b29cb14f90858aa13d432c0d380ca8cfc9d5750df2e9af6f12e7cbebd574f4cb5652f16ffa580d03df88f63845c7a74eac80538cacaf571de0cb13432fa3d99c3568ce82382d55139e9908accbcb09f48e7c59a4ab9f54e21dfba071ccdce0a6e1fda4c05c1cb31553f27fa85b6f4da746e8eeb7fce305fa1d5aa6b1dd134e90ca68ea57cdcb29fd56e1d2d7a1be68186daf59a6519abce35911bc8b816e0d2ece06289646bb43f1184e57174d25fbba76809e48fdd85f4159da213b7a12a0b09fb01da994aa0c2510e894348f1162590999c3abe495a79a2c40ec07cda2fabf20e425a0960ef613d2a4ffcdd5645982d56aa2b9f36434b7d68c501d701fb7952a72c1155ebf05cdcab18ca26d427405d9252f229e8ff65918c307fca7b177a79567535eadaeb6489ff71d6207526d1977a3745561e7f6d6b562677a3be677696d928b71a3125a73221817984b8489d2a6d9d28703dfcd37f9fbb2f31abeb62200cb539dd35475182465bb43b4c047491b447aef46cc755ec8aaa3518372d02ff3085d3d9b73321e87ba781e647ad9d6f0fe90cde4814762a5da8e188e4194f9c20d4acc399465bb4d81a922cecb67c858a56ba4748ecbad47376a04ac4f0cff2141c51a9c855151c31090675d987f33733b218b7ae4d2440d5c3d02c4271dcd3783be84719dd013e2a3e8d7fde69b8db9a8cb1d055d0feed4f9187e99386617595135220a173038b4ee7aed5c5c4cf778586703bc1ec084cff349bed943262b0d044185747f3db7a8aeb621744f74f1cabd5c9223d7d4d1b9e2592f78d164269eeacc7791774ba8ff1dda256e58d8657122c1d448699177be461192f759194cedd2ca935b31ee66854da728d3d53100b0c614b313913972fcdc39e94327182d19f9530451df2b83be7a3edc4a09802b427ed906cbf60bc834ae7dcd94e0b933679f952c5703b3a047bd439812af9573d79d29cd5595ceb32c98878ca94cab5a3e72956696c0ea41bc0a4f511ce68847092ad9e1fd9b62c673010d291bd0056c358b9ec243e553aabda949361d2bc85a654cf5226661885
acb3666efdf4697ec1a9b229f18a0a2ec74afc0128bbc4319cd54cd9524fa608af47c9ea8e25bac3d24d1fa441ab00fab749cf0c105af20898a95c0db846f90c29d1c9c2c9c6fd5f4e6d48fbd1b3fff48aeb8f4341cfcad218619e6453d4ba382707a60a83d222e764be4ffdca619ea83048f04c00c3bdf949b4cced4da4b755610d7270b9d1e32425aab5f4662e5c5ce48d70cc7608155380bb9a83d1d2a554a75198f7a6c16a67224abb6471f0f3adb2023950886a5c1da5f006b6d4fe4117bf16586087d3988ed60b351149cf7670a746f8d9bbfc0582076debbaad4026b9db831ad12d33813046ba8f68f93b46892fa110b1ee6a1774ffadfa49faddbd71a6aa65f5eda31c02c5c44814876c2b5f80245928358593417425dbfd5369cd54945ef5fac4b8a97121788e431313f3b5d020abfef5f995b185d381e24aef883889307089f3d2837f77e8db748a395a5bee7e72dde3722c5bd91363a5707957e64d25c635d7d477e91c515e6a73716a949877f26906596a4822b12634e9e94639522deaffbc770ac2febf8cfd5b8eeb4ad6ac0bbeb323f594388da802ea00f57250381d20454a62046f492743a1687db97d36dca9994f859dd87fe65a734648a45fc6b06197bc10cd50cc5781193bba2ccf8de1ad2ec4982aee6d713ac66f50c8c2cd38fe68f96fa779af3c7f907350fa68446fc48623560f64ab4d5f5c0b5f433e0ce823cae1050cb93ef255fb5cd54f4deed7454fcf95e15618183611d0c77d700ae63ae8d490f4b05c7bc7de02d074b97cbeb567c4746206d11d30d1f3de6150e6426c0f6d95cb51e9ede37b2c6137c41dc99fdd24d45783810de914f66605fbb3138ac4153df8e6a341532fb05bd75975a464409e6f794863fbf141b946603ce920ee6d51063e24af8cf8da1c6f8bfd0c6c842a1c4d2f063e9375c5df7a7c68df9eeaa2dc8daafee92cd97c604febb7cccbf7fbccd7462fa8d74afcdd8e5594770300b078582a81a2e4cbbf4387a6ab5d117ad9c3dba81583dd06b3af76aad3eebef9bfff63a55230cccee56362ebba228c60effa09c5ce5643572ba4e1f162d39152bd87a3764269387ac499bf734ce1064b7a824aaa35153c9f5c6a32406cfebf044d14a1443d5cb4e4aee9fe3feeac086f3a4115035d480c29ffcf316d6d704d8680200e98510bde25527061db431280fcd334fcef757efc7f72c5a12136eadd60bacaafde1fe3079ce95794ffd62074a117f242614fbe467c1a9848e5f24eeecff46e95ef169df5cb7ef0d1c5b05570c25e733b3a5a1ab2d56ec6d33b622642d75b28f47a0465e4cc6b0ff29c7441297cc6c57ab38a61a20f431374dfdbd3e766bb04fbc2d021c7efaed2277eb71a15faf4bbce54bd7cf25af6d039315d6ed91783304200
66193abc6c78bc2483a683b51074f479bda1e5d191aa99be6eb430390157ac132c4d57f2a2ef68b95eb9ec980bcbaff073f229215205476a2d4a9f8c13244b3f420b56a7bd77bc6465b28cd71a8d393b7cc653c1d666f1c42c50d87468a5209408e4d674a40dd7c77e6c1fcbaacb2890786f9a9dd015cc402ed22a8d48aaa5f6976f4e312efb10e5bb2787ce6d4ac8ef38e772b9983fd978445525a36f64d04c330d5ab7555e935fa20805a7d6f05d4ac66227f83c773ba5e72d3e3cdc773252a52ccf6f8d3882741200d3ff1f42c274d5df2ca411622a186311ce8909c88bc62e9f8b01fc99d0872b63fe33fa67c814a47d3cb05fb13396b8c6145df99c97e5bfb6b4bee87d4c1ccedf3a9b7a1df14f37d46d5e57aa13cee553512b91930d379113772a760d7116878be20fc08835ee38a78fa889c84b9d4226af7b7b05f463e1ebed2a2ccd58c3f911f42abf869f3ab88e2f77609485f9551c68642095634a4b7f9c61300d6682f6f6e9d36cb0e76fd834686b7b17e7a5b76a092b30374cff6a7c3cd847ba94b975f5911e608c2139b93c852962dfd6a34705dabbc87ac86e34f922ea0b64f6d14ae67616fb790651c57bd8ac9420de6d4c47a640c08a3f1956acddaa2649c588b867f9756a35aadf7e681fbf06f9be0196c0c9c17b8e235f4a7e2d3e69f08ae153cfd919ee5435b6225d64a80f4f4fc072993b5a3d1af6766d3a3e0c5c0ecfa043ae4fe6cb7330dbab887cecee6f81c121b45017fc33d7f06d9a4de5cb0fc6bc5ed2173091183070927b09246df8861a015b48b6f8cf33828d6b3ae690a6089d4bf8721f4c9fd714b6e5c171c010af550f064fe05d54cfd3da75b606ac66fdd87b6ac5587c2e50fbdcbfe3d9814beaa2b2b7460b9f14d7405ddce756f8770c4b6542b80178e3f62c6ec08275da696bba336f34fbac33c3b28c4c8dd56809826dc22d94953180a1e0a384c390a6c214ec6d89dff11a2f3309ea06f2a48451ed911e24b473bc21c2106b26bc616d16f31f7c65fffff9efa50ec5a2601e8adb84f691fe050288a020a4cb6ddfe5a2b44e16daabb44439a5a1fce910977de06ad971a80bb791bc15f5acefd33972df240ff41843e0633b98239f56ce5e1c886c7fe516d1dc0434531078f2ef2090a4b30f60616de026bf7e07bb20269a796d413132eb61d66f8f73cb9a2d3de73a1be7b785d1757e58250a4731112701183f0e68e0e50662b7790a054a924bc471c5a7e5b0cb4130e98f91cbda3e519c742a1f42e4d2162a45ee2beaa01c15e665f16309ddac96525f7c3754bde2128cd8c4e042dea97dc01c37cf406f7653ff058a2e979fcfc76a1402961eb7f7e069d1f3dd07ecde4184974489f17c5c2acbba846dc36d6f5f53b3febe1fca1f93969c68b8e8cb
e602acf735326ac6c24e5ba544d6dc8c93878579e002f071080ed2ed8a6ce3efcf88e165e700146c493aad0fd2bf12fd3162a6cc99444e9b11fb6058164f2169ac9523449a806e7cd570364f645c2f62c4b156b6325317f7dead9846807d323a1cec3bc035855d52ef6c8cc77b6e3801f2e591b772d1f80d8b3b81f0c473a727dcae7885e32ad6e3b367e79990bb887162473ee8192c192c32652119f2e44010a85e16a30bb2a60354e2c0ddf55ece7d23224733c14ead4d3e61f2ea30d3abf567b2bf354e3045c862991db9692520848a6230beab6774295b9e90e824f62403016544e513ca9477360008758879756e51a9ccad9dc3f816c868d3986c1bfffe5a138347732692abe4bbcc1e4286ee6789ed6c97afabbc069ef80a6dcfd5aed0d3506c206de4fc75f6763808633ebbc782529ed3f41f5a1032f8a61647bcdcae3e1c50a6675535ad81ac689ed28ff98d59b038a8ed5aaa7da14af32de615ac464d592ebce886138f2d6d5cc473b52c77c338a83b7939c34793ac680bb6528bbe4360953b72c8cbfd38fb8a080f5e804369dd9cc043b9a9dbed06946067edf8e9fad82978112a408dec10be15748b9fa1b4446f0aa79bc8fd52f109b23b1999692828c0d8936a003cca62dcc8f5395119b5b2806dfaf762f6ec2db6669d998bb6608aec5a5e8e31c83ade7b978d77ace3449a558a190ccdd6e043267f709f94ff23622fe62d1845039a177fabe70815a7e8548e2152f74043b54ac1aecfa30e4273f9458b28afa618a3cd6b87d9f20efacc74539489b64b7697e801e264c18d6b82eeb0233cd355f0ca74d703225caea8198a51e545c0b3d9c01c87d3de639ba512395fa6dfc8b6d6efea8e48f96fb71a9d4d7257190ff3c4f6730aa1e9ce4bb34923dcd7e20acc958783d964bc8129b45382201bfafdd8e0a4a527a5d6853e2d4bafe9a3163b0d1bfe08f536cf44d7d3d86bfeaa57567788aea859ed94e900e683d83c2c9cc83146707423be0e3fb0bbb5e88bebbe89a4ca2605235cff1fb710e5e8a596fff5bfff77c3c5427239afc7ffd7a98598576718796abf82466986167fe7ffbd8bac9b8feb5123c77509441e2b475b8bdef6f1ee06fba73025cc11bde94812345fac57d42c6bf2bcc37d83203fad6f08be61196d499f75fee2c3b4b22280493be65620af4e657b32b71a8affa17685d5aff9b176f9ee95805fa4f5f5b7c045f98f7c89e58858d90a39673dd8362c5fd12d0e67b339ca375534b1850548bf30f32966debd48490764144d752c064152d5451a1a16e80a9d945950226dd2b019f92ef9ba7b4c3a7b8e809c8bffaaca3c402bdd6a7768db08aa453e13136b1af2cecd9f528fa06823fe86f0b943a3f434fe24442e340eb6c5172fc1b72165699883f4cee079
977947f53ded2b3194faea9edbd4186fa20c59ec22b65e5472c977f26e4d1e7be657329b61665771e4a0f832a8739d7d2aa9d95089f618ecbfaac1ee286c40b38d7f9c0e9968ca7d1c0b751914112a66eb1a552cfc9cfd40ca15149f7aac9b7d352ea16cd4913cf0ae640fcab88bac984a9288180f9df3d8f7ac6be80f63ac2896781fdc2f2cc75b89066f78567679732443da5d2973e6f1577858f830afbb4e6c59686c81bf1219396233bfda89e8a7f05089a308276383bf851ba5fb6f2a6442d380e6e66bc85d7e5e55ffcf5385c8b3c577450df010a4a557fbd0ce940b37ad3f05546ba9cd7eea113d07b3324036e793a3f05a4ba4c408cab4318ababce157d09a1433370a3eec7b6e8e5acbaff4f08850e1775a4ca0c50f3b6ae9328d69aeef530b3118a11befb1b0e5440ba71faa8fa92c01f6e83023f9bd61056a469bb8407562c8507338c947b26247c875ee904c4d5aabb0ea354036e548bc950c36d71f87a58c9dadc60a7c870300c9174f5115cbd01b86a34687307ba10e2cc4dcb4f3ec1eec54f2e92cf1bb13590d78bf755de07c5898034dfa3e91934df7ff884a43d26ffa9058383866dc3cfdea8652d2ea630023e4f287b3f405f7f3fcfb33a5d22a4ecdb64702714748fdea6ec4f91bb899520529f281b9577a35de73f99af8a639fc2e7e273e61eec63c20a646f96f7f82dc74b7f67d152d6bc662330870efa8c7685fb5e13853306a56a322b6e0f06beaafca9bfc2af322782ffd4e46745e60ede9441b117e6c52dfce213944d34ec20692aa3f316760efdeec501d7f306fd5ca96299955e74cae6838a9d91229740ffac915d8d1d2b57d40a40254e586ca56799b0edacfa26f4b61b17aacdfe6d7f934d7721b260f08c96bd3184030b842335a0b2803489fe2320dd9b5d950ba311b0f43c0706f202e6d819ff2d62be8aba317fae6d3ace7967b162c83a630786c730bfc0ee3a1290c128bb15f307903be1c7663acde4951b65caea9da03b989a666d4686759f37d363bf52c7c4eedb1e7189f5392469dfd3f76f5b973545ef103bdef657ae891a1ded8f11ce547808285f3b5c731438c298f7c8bc14dfe65b9080046363f883ad086801b1a27ebac6e0472b42ef5c4051943ed3b89e6611580d373dd6b32ace4d5fc5720d07843e9326cdcb0d558f4f898e79cb6a90d4ff13be5a4a9907b045778382968c4e84437da37a3042e98d4c01ded940e2160b995c3ea65346bbeed6c69303a0f751dcb1fb6a117b449355d153602092ea4649e9e18fe4978e9d057f45e25e3a5c28180b5fed86e8ed019543b902ed730f3284e8cd03ded232d68dc72ae0aa9f78885205f3b76be6fe7e2abdf4f8eb5ff8f9f7eb1a2a58fbfd5bb211783771c4cb55687de05a32d8d60f5e69361
c594337cfc90cdd8d8321f4a36e33820c6200a15fdf0c1b4357f12a6058a1ac7d9df03720fafb60122f8e7fb540b028aaff95ae78ecd795730a27b6736e20b55e0ebfd8e1b5bfbeda4777ca89e648311549e3ff7e4aca0bf12fd61f9cbc645580e23fbd40e38d89639c68f9badef427650da7ad1289731876617ae74e95602b27c49216f621be9e087343ec5b06affa8f32ba1d54bc773fc791b2e0d341783286eedbbd44a42326c916ae3fdfc7aa69e3cada7dd62b1baeaab2f40397b72a5cc98063dcfa798cf2eaff983fddff8d93e0cf2070f0403eae13dc8c9539f3c3fe7a0a440636ccc02d391a66ee923241513bcdf83b7702a8d32a114c6a3cc01be58aa72cfad8c1f64c6390e54204bc856f7f2a38466902d93cf818129fe00647848207a41e9329e2036da3a30df67b2eae362bd3bac7b0fd76b2cf144f6741667abd6ccb6d26ec167be208647b08120c5e9c13e357b615da6236a2fc41cf20def6d3880c7e2e59280f34c306104783d71056ba90b0fb78ae621dcdf41a25450b03d8fe74ef2a19ed3dfbcb17050a09016a3d8fff05b78d7a398aec4e2cc7751f5f5caa7c2da490fc18cb71f3744762fcf6ac2ade8ff826aeda81918e03622a2c11ae499df3a9eab9a4529427e2a29ef0415129d2d5bbb7c1461ffc30164e7e27c7be42e5e48542e21dee6e430469fdee51475d92feff221527a4c6c824bf05d935850ab1b4366c0943bc1dcf4d87988914bf475d287a8f20368e1325bf463f289a4db6a06bc6a8ed7fa1f52845364f30e11ba0d0c81fbdbe468a09fe103452a396087d80dfbd8c1477435ac7afd3f1a9268b7ee9a7fbb09ea9b72e8165b42bad9357a7a6cb79e568ccaa25a9c408f85786a3404ead8f5174507d6ba8c52b1469263fd15ac02663cf950838500785d8245757940fa95a82b6cdf3555e9ca79746f915a3e76821af7113c1155e688bf9be89bca2b1295e010aab86bdc3c8bdfdfb24deeb9030fb7382d7ca3863d32dbef3ab2caeb131b46e7fd89b6a748f6f151da8b9b233337715bb1974e95ab4433ea745b31ca41fc7221b73be0d0d1b992d80338eeb586c009e403b29ea324d0f1952f0bd73d5a283f80edf20b413430b40b097ccaf988b674f857e917d9555482213c53ebee191328d998cbf2a7213c09f53cf21306de95c18fbd9319800c4e7f9aea3df5a1342c1ea75128c208027ffc924f4d213e54cac95b2e97b4493ef87720916a062faf8e5c2c06681d38cff4cf003326fa2168c5aeef3c24deee4544e8d49a7e1b68fb26c1663f22bfb6187a457ad95282cae8ca733d283d3b3457d9c0380248953175693e58d00aab9676b8912fc08764052a9f011ef07d6b97fcff2aef39ffbcd6868a3322bf2d8692322475a514c62353fbb22b133c96
100f8b1b2bf296115af38198bd298c4aac38318da1558123a096977f9f9e7f557638e429d8d414e8776116da16628d9c8767717d6be8fb1b4827c624a43e460e2427a6fe6eae2055ba989eb16d38beae749633461ec24449d11e16d23fb9454894e94bd6910c5b54d09c8bd38c112dc42a0cb7b09fd8fc75f459548b3f439820cdfcd084d014b8242c33f4794ecb3082291ab30fd6ceabcc09590736941d7e9fad7cc614a24735ff18a7d207b2dea598a7b3cfbfeb3fe1f8ebf49c974efe48fce8821834fc90ec2b8a43b067212aefe548943c108c25ffa2c8adc55d187b7971918e6768558a994634269a2fb7f9c3a022be6131647f8860299963140c9ea078950cd0f8593c4aa94eb4b8442169cbc01b14e701fb34c27ee9af0f7076126b38bbe9168d686b5b81f5aa6478d7c4aa69b2e35135147aac0dd6c6c2015fc119b8096a6f3f04f11117ef0e347fd6bca0ee78b6eb938d1811ff0cc5bf13260ad8d3d37fa5649a759d8998844ace79bfdecec98a692e929fd5271c52592ddce2bb5529b24a61934e6779b86babac2a6ef3732e83bcf2492bdf4d31c8beba9bb058a61e3f3798250992e90851c95d61fec4924b47a877e902c2324f908f5806d0026579241b0b190a771333060fddac3d5a7fb467f35feba67fac777460c1d8307b7c88105b359298805da77c140e26fcee3f8382e4ecdef7b0cb4498d4aa1c9b8b93983c9bdb9867b289ec3d9c7fea39b735f3889fa99e2817251dba927629b44951a88e9241f7147d216e8592cc5ca03aa44f01260e1f5d86106d5d05b2d98028342ad5e03c500a95384f932030d4e3a0d27f191fe102ac3828e8f353519cd8bbfaddd4008075afc2982162ea8a9dd131d47898a3f8048c6c77543934a81f966a52dec8e11fd022690d9333941bca029add55d97ec4891403b0612936de2a9e86a2c0ef705fe8c632637cb67fb4ef346e9e93e5c3414e30f8e1852acbe6e718424df574c75adf86bee3006f76b2c383682db6638bc4075b328da19ff379fe626fe6f7aadd778cf1b5a27cd32047b7b240d94d90eddb0b0d0d32bbb1871ed1adf00f11e86d9e8de114447cb9d00be531733918bf95f5ed02f1e9bbe6a9e8f33eea84bba90b70b65f87944f68f78c7d31c7dd46075cdddbc313be5947f060b31ea12d819fd942efee71d722c615a713dda789a270cd8c6e39541a9247ac6b05524d1a2b9ce71c16be42c3c65ba4a2e71fb21eef8d89a06a4a3ded1e070d52e84436c8ede5d2fee0144d02effdfee10ffd3df691c5f89f9bc3df3743e4bc8e97e6bb7031f209ba6eb0787f52bd1b64cb2f8a67f802e717d57b4e3905cd5f2212ccacf56fd77bc3b0796d34b9c2083317c3eade651adebd5d1c6b7336a38ed2068d85b495f9b4e373e4833d
1fa49a4e218954a1d7c0c65c94bb4dd6ef22c1b23e37695cba3950f66be4da09021aefb912527286967d2aa12958630893d04f15d46357b87e554f1cad65d0cccdee120229bb2b1dd4b2a36b4821783d471258867127b258f578c37a58de26b488e2ddc0904b2bd90aafc84ca660777e42508a341e6026d14e69e551930d4f24e5ec0cba60f7df9aa178912bed659091dc3969c49ecb02fd482d616b336e9a88aefe0e690806d948a4d6f750fe48f34b3010316b2aa94cd08d5c0982c7bd169e14669a02aad3c2711518ca19276fe453a03ef294c4fe2eb7a9c1d86b161fc80d3970bf87733af21eee1a07ddb790cb4d98d6fc2ed90a42960cd085912191711429e2e50f36d196332fc54b9979b69a6dd73f6f79de7694cc1fa1ba2e238833dade7be4af8352a037823470e76a67d25d22bc90967fb5e1d9eb4c650ee8be09c7dcbc7a1a7e5e2a279010253bb721c0cffbb344ce6abe23ab9e51a94ce9391477743dcb146197f4d545520f3825ad456a8a13e50ea142f8e5a90bec176e9fce83b7c356923019bc62b4168692fb546c7f2a15f723aff64b87896d7926f84304b6f9b3a5ca9c828f2698b679e4f5904c5768dfc92770d0e2a30fe185210ff2f4b78f48f61d30dbf9e6914b14f8d96db01b02ca8cd3c6a97b4dd808eb42fd2edc5f2a96103872030b1da97e28cc93ad269de5da398c99cf90f2e9aab001d13e6d96aae3b51b1c083f01cee532576475534248014d26360c57165b737df4b6b8933687e766e4cb8e2514b126bcdb1d4dccbda850b491928ebe565269b973ea70ea7f0822b22958504efacabdef867c1d0f4b745c4417e672dbb9379d43291a878a9128463878645b44208b7127ef7c9da51d679d1df48c82bd3404c9afe1464edae2b81760114195f98cd0beaf0cdbb709dbc6b78b3022e9ccd0519c9af3fb4aff8e6e4457a4b9cf949a294cf08a4e8fc9323dbce6658fa6648415cda863221f2601e3cfe020ee48adbd395ef37053a1dfa5cd033fdd8747a2452c16c306610788d825a8527f0b179a67dec8332f8c0903be89cc65481233230387ba73d7d38ce830d6405e77e3036d86d426114ba5413f3d8a79b2ef4248cc7b558d95764f311b0ca39791f8352ce14f13ac303dd807fbfe77ab8abb2b7ed44c0abbc439ab18538f90b23b28981e34327ac629af4e19905313e5a2b3898869054eb2e77e2e8bba929aa37b327147a6e8528b9b7371f9ab31cc5892408649a89d1bc3ae4f1c53e5c6c73d2ed2cd8833cc4708b8d5799f1efef2adfd6ede1f99e1c57efbf57c874dbe9da2842052f0dcfaf0b89ef1202af022ca69aeb760f446acd8896f76000df2fed34770ca69b2a076d5d992b756b8dee38433983bcbf963c48c285cc2e9233d0f1ee163080a639166
bcb544f104d65bfc5abb54c8c7cd97b86a42afb5b2a2f7becc2e22acf374b4ab0df25b632ae13b61f57cfe91aa9072e35c4df5b0e8094e0d14ac3d7219f2772be42c1b41c8debf29489f8e84c885d5f7aa666f2f53c394278ea9c610314cd370372ba2a4a275eb2f942150bc5432e7e43cb9ea9ad19e4ea4a504282443842b8976895ce9c19982550b6d38a37a58c8d0e1c7230ee90e8f55af44518bbc9b00d04a9486ae23e1f4050bc59570e0eb790214197a78d89550a83713970864ac4cf1e34751afd770cb02555d397160f8a98998e095e1033c851e745234ea76f8537603cdb45c768b222ce52f943fe9bbcdf5446347283b8d4cd8f535978e9aa4a026eca2b71bdbf8fd4f422c94c2fc65f39519cb5026fdcbc5b76e41569c5b11de07386caa7bd8a1f9ea67abda0817fbc1c1a61a26d6e8b2f04e94a01772c513a7b162c9c7afe0968fe4bda3fd8d832f72b002dd84b9f558a5505a6dd9f7aec92d579ab770bb7a623513b1ee542dc9dcdbe4194984a5eea6b96fce2a597df8f63861e8efe5f5a4f9748f3d942e188e69b38c3de9611ad2fc376fcac7096b8fb5990fde139675c963991aebb7d207d581133899563390a54c6a37d052b9e02ef840f346c75d112c9c3f32616223e3bb7184b96e1fdf83d5359d1dec2895efb2ee499f91ee623ec050fc83164b993b0f1efba1a32333bc467cf926495657e645034157574aeeab563da9f64d322ee0b0bb0271fc73fbab791d8eac76c7600016b003834471faf277ea76aace5612d41475567ea44a390efc000bc05c876b0347721042358a5f51d46cab17365ea31632374d0d497164e998aa6bd13d2f3311d6e92f400d420fa02a3a3535cf76b63be3a735f47b38ac388e092224b0165bd4fb69ef7bccb33fb2c289b53a7db8a7b2b4744c121ea3889dfc11b0281a1c87b12b65716017533eea238fbd6fe9bc5886cd9ee5ba35e5f5b5e4ca625b7767530d65f821babf7020371f095e007313c69197875988878798290aa3edde5754b81a8687345ecb65dca9723668f094d67ae8b681d38f115bb5545a75ac1b15ba4e8a4687679174daca75f2ad9fae14e0a2e17dac7d75b16caca06ab534a49650bb304d315b2323a8b8f6b52f3a52aedc1426be2d30dd243fee31ddce90674a9fae6a444f821222fd4f8405d6c43ec5accf9bccbccd9b5fb6f92c05bb2d881827dd9b9ddb9c47726fff1ba91a656c75538f17bd66fa8840d180ae3806052edcff93a7c505f7742a50dc3762aefc8c8e556ca71d8347074711133e844234047b112c5ff2f70c7b8abee628bdae8490e51ba8fa95ac2ecd7493a1686c77ea378133df57aa51c9c853f1ed49fbe9036fc69aa254a9e13ae0bc4284c7e123ab26da50cc29aba27f50d13b56e99b1d8bc6
b5bf929daa9c38e9e0a8619f25503a3f5a27ca1515716f764f495473c56176160d4be364e612bc5e62949d1b3e8f326658efced38cad23fe47fe98d181ea8f4b52154e94b78f41dc8d3ec02670e73e45d03bf5107f39e6001b8f375f27d4a5b5fb906de86efd1cc0cec9a78b8ab024187d5ceda9e4f0b58ff5b2d4e1cc7766b508aaa45a76a0b1db5e45f687d493f7a1f7e47e7559ac04a914de1c6dafe3d911e7ee7e1edec395d1e30e5d66f7ceb05c54f67e8f7daabe1ec76ef5cea92b50a9438e3cf3780436b6aea0fcaf501137251f9d9a42f2b11e99585ef8c23e8f315f01a4e3e1fa668a1471311a17ab88ad5a48709b5492756dfb5f3dacee76e5115b3b428c35efa3b0ba3d399989122b387933b834823f4daf9bf7c57bd7073450af59cbcdcea3f63d7bc08855fb4aefe4349038a37aabb4c2505be431bbbb4572375d8a9fb5585109d7553809abb920cbf9abcab0b2e083c4a9f02099221ba81922142637c24aa33efe71a30f2f1b5d32d8083323b3f9ebf04a934bd1382d93fb82d45c29e4f9f45674d310e964379530adaa6fea8f7fa2b42ffaa8325f52add9f5bb097f2592383e4ff11f5081b2f697b9a960c15475143a1c84f9718929ba1b843d12b18d4b51b7734fa6920cf68d45a9c277ddfe45b5af9b32ff9afc3a4c46846c2750431b98abf70aefb9e79b67cc3d55c21744f91462361337ad1c5f4c98c2f1786bedea29cc84943887a259a254bd2cfbc8a723ca3c7f3a0f845eba364bf8c2635dc9637fca7c321ecc87136534a2a9f930ccc64da07a285785d57d7c38b1c2a7813628f098c3576fc2bae533207f0ca7eef4960368097b292a66631fcb684cf8de314fd13efb6528ff5be59033322a094db0a8703e4c4f4a1dac532dd0bbd210e4dda4e14fde3e91535f1db26bfc6a5dc64365df712b4030ff38a1dbb2bf8eb019b8db2ab4780aafcfbc511456d3afb664d3d5951ffe9022450bd26ce8c70169d3bf63e9164fb33a46023d21bf38d7a9af207b26721b6f6fa7704912967fe650dc12da226061162600dd4f5f06889c7d2fac36a1d5ab78b2470c29f01fce3f454b2b0c7587a9f29150d5f6a498588201daa116bc1e662996b793051cffe76b11bd8342ac4c944e21e6ecffb1f281617019290e4d60c713658099227d2e95ac818fe16ed5fd08824d0e080af214aceb855ba2464f1583be620c1a498089da9d965ba93bdc87e6a4aece05ec1b1dbe4b9ca3a023c37e211197018f3a4ffc0a3e69a2f1af55f788b37eee0e634d276ff04add0c1afcd6df1d7721328acd62232cda1f3a7f8e75192e3b48e9244bbad467920a2e3821c0ad01c8731bbaa5692869510249ae2e8ff3b744342d6bb9a5ad6438d35ed42d3a1e828ae41cdb92d38c8bf0ba548991003f
3bedf5b72b8364f635f399e073c78e8dfb6617e672309cdbfc154ff524c9235bf5dabfd614b9de024a22912f1f4dc865c2aa308831bdfc00f08e609c2c9b6d1c820639246d98e260aba94247904c181169a5fe29557b94db22f9c2ea3e538c4305c294f4e49539904a00fda5c38d21385dfa8c73878274e47fa8171f6a907cd277c4092bae68db4753356e98d3e5d6602df9eeed2d4285177d53c7c97f6480dc26b809eb7780af441782a71a92169cbe4d25fcd946f5c5bf1403ed73f7c92d95cbdcb299a9658bd4ad23841d65a21a502648c32a1c2d5ffbfa48afe15343efab3996623edd641bb2f3bd0d2ba6261cb9774287c79d5323480161e2590390afec18a0e25423ada2b8206782d14782fcadd3ae5ffc982feb23bb749faa6d1a6b1ffb81f098e27b522274a2f7e7a7b0085364e610a31d0e2c61609eef88879c851bf9f5b169d76f63e99aa995fcf336459ce31510a8fa12d3b7e59fac786648574e73d26d7953d33c1dc2e853f3dcf253bac9a50f41814eebc018f1c068c16553c9fc8e2b7cb409fdad4c904ab12f9bca0f2567be34f4ea8b4bc4cd46451e86e22596a8bcd629409abd088f3f824bf371ade3e08735313cae40949a3755da8af4efa7c19b81188a087aeccc8e1e52b0d25ccbaecb6714e8d18812aeadd3a2c553ed6808e9b296fdf0e185333bb75b32f609cd63a05dd1a0b9752977139353d4b25593bf073300b3096578216ed193619b6816616dd7bbdca68fd7f9f75fb8343817592279a0fad1673733ca0db7f0d2fcc0d05e95f9cfac10d9353092d0de94449222985c72eb1eb676354a29993fc098529f38396ec5ce84b7382b00e58aeb7360fded7b833c1879190c7ba9a1027b546a85167c584d1bfa34b9b883789b77769cb4d025b8a1ff89164cd60e4b5ff541b20a61b176df522af16acbce1c06987ec7e5b90ce392e58521a4962bf1f708daa9ed5bf354fe49c6f8c6367189084dbd09dd13528ab735b939263e3abc73e7ea73f3e1e7a3928811bb0df5ab09540c21b531f2bea69a3c7e3fc2b9374dd1f583de251eb281007e5d156f1b1444305947426a0d06f54d82a2d7ee944ec4a08f46abd397f463a1e6b20c1ef688a437b2bd0022c9bd4a749b49715811512bfab903cd9a2146b9856d4dbdbd9d29bd6b2688e4e9a29498c3c8003115ffeeea4f37b7d742bca2bcff0b59b67292baf54d4cafa9d123ef4ff864ab419c669d6aba4609784a0afbcb976059454f8c61b0aebeafd625df293e5c51a68f081077669f264ef953f3ba8a9ba6b75dd21042bfd258cdf4fe37ac195cd9387799d41454439774cd323fe829c7716c1e330ec41893581465b60480ddaff04b973f06594c873dae881d9d243a8b7e67d4a402a62a8bdf61d9f00250f53b2f8c53
e4a03738824adde1c4d76384661b02759440c912ca1e03e5858d7f2201bbe456c62d64d5a51669a222c73da3e6d2e902705ea01c84b06bac3198da25743dea072bd54172a5be4698cb97668a765a799597ed30654f523d355244ce1d57a7e69887f8bf9779c8a9fb392c1e74b1e7c7bb9b2f82b08564ad389d4b063444898365f199a5f869de516ecb200b0fb56efecaed5a5efc2c854108fc3fc9a93a6f8bfb698a94f19f36685b23934ab7501404ff6bdf62cecf7d183c0b231f3e5d4984379cb85548b726149698a900e0353b237d042af68659fbe8fc688dd01b81805ce7e374698559a10fd0e5be0d0f0143bb404c2af63f5b9884f36ba84745f2b177c99309b4fafae14a5bd549123e13bb46714f4862ed12d69b7ba059e269b11800b36ae8463fdc23470a4430afdcf6bf3a3952f5a6ec35c498579f10ec239bd0aba2f38152434d9d585b84a224d9ff4f2ef32c1ad84a46e858008254ce3e6b33db71ae5b35992c0130714e4728ba403855175f491fa5671ede43c93874a801ce8ad8cbb7b894874fb19d8282d034c1154b4d4d81e3f0dc25e30417da0532e90a2849185fc519ac04ce846347fe15718b450095839baf98abbc8dba7f22540a2031b7efee322cc81bba67e62c86d04ec705e54533f7ad040733ab101a9bc8083ec2e4316da86a3064cc5f81a40acd5b14a3f272238d34a02434bc610271dd390dba6316a627900ee8ff373489437768f8691833ec4b556545b355865fd490fbe03ced5262a6c2a7132327098db66456f1bab9bb594478d46754acf5c2bbda60080593950b5825f6e061c94a3937ecd58653bce3db472e18555e7ab9884693939ac8e35348e73f3aad1a95e6584933135a77d01ffa7975a1cee4e0ce909da074310ab1432e33a0accc8f902bbd08b4f3c339039d07081e8eb80ef4d1c6d4a9d853ff56d8c10eef0ee29801286e04c3677d319f42758a85284ab4e9a388344d5a65b525fbb69ceb727768f0e0e9febdf420312b10a5e22f461bf2c84bdfc28f289586640961b33ef3837f0c8050ced0687586716e7376a4e64ba647022488b1b79954a477d5b8ced4bee2b9ba2761145fc87abb075f7be5dcf269f062b4d94589fe8bc9188fa45434f9a963012d630d6bcc35f91d7de1f49042b695de8017d0fcfefeeabcf8b84a7492a8ead5dbb963a61082433a6f41d272a6fd3442550bb4e1ce00eb15265a18700c76a52aa0577536e1fa0c81257cbe69b55e879c96ce53cc9a6129f8bff2de934711bab8405e52519d3c047dc22c04caa5154e8376e8335f9e0a2bd1bf5e3ea42265bf72c4e4b249b5ef5d2d9ff2808b4d94f82d7d53bcdafef5d3d6e0e4a97ae56f5dcac5289ab2eab0a0fa8c3cdb98c0a96caf2330ce50f98a2a9399b44db0564fb6
25e4316a568d5ba3ff68b019d7c99f4fe4f1c18d3fe56e791417c793a07e976b18ca208c8a6e56b00162433f762e56312277b2b4fa31ec1d613473a0dcc7c88a4f4a4c00ddc2204ef40ab7466104f4bca6d54bad7c54580d77680ea9f9ee4a906f19bda402eca6b8533cda14e24ef66f30397a92499ed6026a077f0cd39267e96dbf86e78806324bb1afdbc3568e8e4e34746778e48aea6d8e1e8bde699c35e12b3910f3565bc45cd886aa23b7925b8d750154ee4bfa4d6e71feb53030fbafa3b6a0b7dab690f1fd0cc214731b58b234d41adb42c12b2ee283d19de7d58222f35d95afc4522b4d704d9555778b42c4d70ee35f5db6bca610e30c25b7457cef71c08c9611e188fe4d44140b6771735aa19e3cce525c46f47bf289c3339a49ca9b98419d534d3b0e14f35dedba95f0ab15f9a361d8b04726ccd0549fea709e9d609f8718311a237e3724461541d539c93554987e97940a41108909f684460116410cd2a5f98759f4634d32cf37ae6c0afc082d349c9982585734eb857e116741b80cb2211d08f52052a5864b09f7ee9d4c5da8d3663644147a52f886df8a298a8555013f93fd39e25a16c00497bf8236773b3b308ef908199fd85fd57fb9c64b0ea518ca07e21dcbf572a41089f8e9ee9e6ad68201df0f375c0d51700eebca8476a08e3fc03df38f7c0ad83205ce2b129f327fb7c3bc8d9a47ae1b404d9fde6ade29b0c098af25f5e887a614f9f89c07678237bfa25e5e603df7e88db570530d924ba7d7ab9c797a02732463d634b9fcfa7e88951ac996059796e9b93d6c18047fd86361a5d69ef95316a3f2182de6e243088588d37f434b4f5be294631996c6e6b44f640c9cc4a25c6eff664a442e7609817f46f6489234b59926b15c6e570daefe77454ed85350e6972b99a5d30fcd4850957045ea193dd3e1edabd34936793d7e38b6fe9bca96ca46b355c03d56dc70271c7563480b9d4f4c86478d03ddd6c0c15304d5fdb48577668b3d48dc7126368efdc8abb8554861883a3c5d8ff02a35db02fd12a944e16643012d7c1b128ca870cdd8cdcd6e2676620b7988adc61b7f8baa0ce0701f21c8d8cfa5b8ac7f0489915b63d79ddbd3be88b2302ee964293ea9113dd947277aac2b735c3ee53a845a5cc3fb93c1b167b6cf71a69c5f9c95e30e5b2272479188ea2057e5533cac45c2a1f133507b726d5f6c3847bc8c9b548103e68ef72308b3b6bd1b0a1f2f058d5764b605004229896de95c6ad99364e60b4b3297c75195a98e4e0321e228e6a8fe82a731185ba4c9577302eb36fc5546f110c90e087d12fc2aa4be095784bd843d6337b33f4cf2293968bca63d8c8dbf785be723813d5d62e5b2ca934dfcaf8e92fb33f7aa586c05c030c661c134db52c0f54c14fedf731309
2a6d21ed278cc7293358fdade7d66de97bec916cbfb7111edb712d0fb5d2a61be4afaeb6fddfb098145e9e1b0b3bb5733a39b6673c0e88765acfd16c8d64420a87c7a6c67fe9ad2e986247a610fe613d9a7775e108c04918f2ebb2afd4d05fa6a90efd29748a0487c46d54d861131dc4a934bf59b387931d4bb6398a12c182937369e4d221b1a624228862f9c2cd92ff924b7ae790c74031051bc3c39ef68137d8cfdbd7f360de1a2ccdaf559edf3a6d2ff93aa1dd8fb3a4aeac44078659e8bf59d6f6cf2efa304b3a22254c2c009de4cdad4fff265d17f233f4552df99a9859fd7102704ac5d4996e62000ec9cf80d6a7d5f43004bc31606071ffa412d521cda2cf9e6549ef4986d46d613ce7efbb4930e1365f3e8f3246db7d6a2588b2ab96ba107e319774daae53fb195215a8ac050060857a5d78a5e135688b69f8feab6184d452b648efb787abbf0fc407a572178ec240b216690ed70a2853b914def326882575f110f2b80d5b58aa0323c33140488b8a7e3795df22a3ac7a272e0a8b241ad33910f3b0e8d94ed668befd1ef6b10b775ff5b53fe38e84f64c4375d5338708da742ac11d4309fbbbdae03861e976d162a4005ef0fe0696e5f63881dd342810960305ef9bf128366e06fbf447349fd30dba7c244f450822b930cb79c556f5fab60066bf98857279cd032e1ababa2c638c12204af019704d479887c54c1e91ec8c092636a403cd8c40cd3c8e29eec3352276d45bcd6725f6f438b25bc09850c899a48a9c29eff16a1b8d09bc4a4cad67b985ea5bf361e35a685248ae6f44c620aa0b33bafc6026c6e0ab91d2e1a4651a48eac18f4a1ba811649d0fc9ffb168d7d0271f2c7911adf98214673be112f0b0305d283e7270cfabfc8c22e66874a923621515bea630f1cca539b7092ace802612f545ed9cbfe6adf91a94da687ccdf284bc8c6fb9a86fe9d8f5c0317c82cae5c84a1065166a314da7e6494d6437c4728bda337d66b7a60aae1b1f73c890b8845f17572c5e28f88f1420c058e1079ea4e1c93fd54407645cfd55fff766ee21b5aae9c3ee8df319006ca1072db249fa8cb11dc27aa5907e13c4eba104f7daa07cd073a752bc496ef8e13c312c9123a22be78e55442e8d684d746d42371b1c781a364d2343c29a2f1204619a6f39ac986202307a7039234dd7d840f8b8e5de8092adc8bd0e5b100f79647a70cdd72ea7ab2ba20e8f4c455cb50bc6667d446f7ba4d15efda84af1fd1d28805756c09470e8f33d2d51912e0adab662d8bf9ca68a1eed2f67b40eac0845a9219c684ff03b1a97b0a4119d8a09c762ae3a33430a97b5231bb3171743bee2ad8d563d996ceeca10744fa609c866e3b060c2f5359cd57f7da2f611583e09be4d650e962217269600e302495e9510
35fea33149d19f82c8ed24f03de03fe9ec59be7cd3e7bfa906d6f8068387a71892511c240f04f4d58b1d121c06d52f17a13799ce4d33bde912fba1da1d7fca104ab66973115ae51dfcfa081fece20eeef058f7c1f80a40d854e3cb67561cccea5752a26d43d1fce2c4cf67b9f00fe76ba936f0aee34aebad4bcc1b2e30c6e257d81e13b57edf7eeb4493bba5fdc634ea9fc75f0370af1a4414ed13c59c98873fcf144ac764ee315935bf256323bd04e908b806d2239b6d3b3b6d8bbec59439a57c6f9c6a9941221fabac53143c75fb34805da0bce72ba2a42ae5a765f5a24c294b9e70e43172b185557e8db5fd01ceeecffc9fc36abca1ff90b1307617b94e861b25adeb23243ddfd162c44df6277065e741bbca0919d0e7ca275735b998c1e202f5c1b43e1ef09137c1f63fa31799fdd5c8b0d58db2bf23e3050effdbf3e9ad2d1e739070456029567968cf35b021adc2295192f5f1c45d9748eb2343140c0680a2d06475215ffacc9a5197cfaf0c23c12b270c338b49239aceeda30e2489c596b8ffe50656aed1a1afa29842755166c102ec2f5662c2e64705549b7fe477dbfc49364bd653186a7d6612899480cca6fdaeacc554f961415bc70f34bff7fd6de7f4c2d450fd6456997e94efa86295291eac337fe8dcfc73c5bbce45c88bcc698731a31271e5e9ef6af0c6891cc84752f31976776874576ade9b42651e8cfce68fac76f2c0bb37a2c51043ddc2f8323f337225bde7ab6cfbe943609191c4dfb0af7e22184de1bcdb354450ef267ffa006eba5b13f98c214aa0875d9718e594de070688850114b9523b531b51f6eef9d93de0c74eb9b894ad42878ce2be07fa95dfaa79cf42e19d47d4aa9d272a64d0ac7d21715445bad0eba532abd40048bbb3fa62ace43903193a69e57e23a7b61f458a582d1a6628ef7facc5cb41aaeb7e713c27cde8976802b344d7b9d87e5eaef19e6169bfbede3cc407880adf76e84b77e6ff968ba278e13449cf71cac5d294ff6febfa600259e0f056dddfdbc5b8e509924aa854e688e629ba2a4a45670bb60889c823749e85ebada6a40896a469c8db0f8d6c848e4365ac89a7212cf3bc43a7b4c5820aca2af1c29d7adb3012d0f3cda701f4db5b0afa66f845006553082233bcc66e620e2d0cfcb0e2dd8bf542341f56fc47c003ed140662ad0719ff64ecc4197d7b5756da8061f6ba9714bbcbaf20d5c32041f9ef8b4ece8d44e05afd47a2339a51eaecc830ea61760d4637d8ced2f931e01e1533d532838ee644be6aeb1f807d047c466f93df780a9170fac45054a961366e25522aefebca7041580513e84c3d7eb25b33b358f4b1db7dc694666fff24c12c3b146ee20d3fe681346a901b3c4fe85afb774d79bdb75405087b7703a52293e4ab71e2e8
be4b2eb07e4fada8ace5dc484e960e0acdbb262e3d6c5f37d8ec37ff9ce17ff23d7a44a29363022f476926b725f7ae668415912b679b359034b3ff93c0fa118a4870bc33bd3d3f3bf7e2c95a84abc1cc9b9c4fc0f8a21a8ebe3f264365609dbc485ce38cb3c8eb31f4dca3ee04b9c2ccced68643e1e75baca189129c311c6f2766bb1b62a3264eab781a07f6772544fba52417aa62061df99d41a61c16b26b001fcb98a38748953c58ee3f13ba7277346bda56e2e9953551b28eb650147a737d58a586d7987aca9a99e4531317172d7543ed63395a2c9e1370e27ff15539788bcf417364b5672439c93ba1af425475889bc910e66c372dd2e2795d97eb8a88120378ca2fee6d3f937cbadf4d325cfcfdf6f1b76ca2eb5029834bd9a25c31b1b2521fc327014940c5a8fb12f5695b8a28143633984c39a70dd503f7be6e37d55d176f8f7e49c738e3fd61b065c3c26c7f7baeb987fa6d18f654f34d6a8254ea06ad7ec5211f4580e5d2f3ad395204f67a402c645a3f88d9325228fdc96fa679f99a69ddbf4cc5a1366272eab412ba5d18c3c86d70b0c31522d0e212908a4111b52d9a5ac1d11cf810bd4d35b205897c59d36c8b8a2e9710d6ce27df99c2eb6721fa6dd7e6d324b7d8571cd2c1b36c7f6469f344f353f481facb24245ac6b9c5cb6ced5c8083e78a79886cd9cfa37d96d45899d7dde25216e11c8153fea91e142ec2f9b4b6459ee6b8b2cd740d267d284dc31bf96e15cc64982de6ab41a04b56e1b0b1f1549bcedf9346ba5345b1a3c0f9c965d0de16e10f8750af75a606d01612c7faae096a45ccead8b9646f1b7bf66a93694decb78fa4a94a156b489db19e4ad0dd7f6516a18028d3b2bf85e8dcbedff8bca91780b0b3adaa3bdf9347ceb76f661ffc10de3c35bd9d8d6a58a0560cf671baf18a09546244e7c4d092baea1a40c5db2c668406d12bd1f09a22fd44bb7e840f70fa747b960ad712032c340dcb02992860593539d307d7196a7bf74255c30a16794d64c9d8e343574b7f3979f00f45f95205d9274ca7a40b790d5d10f46feb66868b30c2e64f539a30b51d8620b9aff7a60743b9df297028d69291f8ba54bba1c7db27afd3a34432dc1077c2334fe4b01671aa2c165486af571f8441acae1e47efcc24c8a4ff4e5765d4fc13120285cc990dfa1929aea4814c17532452d44a404e6218f6e362ebd017cac3f4eeb5c57532264b42f33da2ee053a2e78d07b689f1691e24038fcaef9a82900967abd5fa84f0a5215ccf4446826da6f400e322a911213d884b66e5efec96010bbe7c70612ce844e3e2537815085473c627effc1be48c50dfac7bac87036720ba615bb699946cb02f12158cec98c91540a084b0bd7e0e1db93fa0c6330866308cf55bd49af9257e3896561
bedc1b1d2300b0a94c81b917e1ebe9259f9e03ed82e3b55bd49d1ecb353064e4438cca13297ab1ab1d6fdf351c4710d1e441112639eb70e09e05acc03606a721f35221c37895c47d5f13bff5b2f2a20c8577650f0f29548626718e3d8a595d1f56a4028b3ea0efec2424027e0c7d359365de278d48de997f8a057caab39ecd20a80b351102a472bc6869dae9b6e66d8171c6f7273bf3f29455d17394e8e564b6ebeb832e94aa19aac010f0c619b29a8e5abde44a226d1ecb19d702557ec9eaf81af266062c8ed0d9222944dfeac03d3d413bff7e5b01756578b5ebd4fc93104d3c9a6d2d2d7fc7bbea064c6f861116c3869ff38135de6afb26d4cdc7612ad4b35cc9ea2117e7042d393126ddf4ea3fdb36568a5659cb3097cfad0c8f86be7a2e57b590b444c4b390871a7e77e91a8522e9b2a2c1dae19a01aa71cd77dea9cb47a5df428db7d9ce5666b1321232a32bc54cf98b35f6e1700225367dd47d32d33f1f3571314500696b05b0c582ffc456598f59d45c58fdeaf23f3ea712517c52c2b52850ad29f190323ead7ab6e0544cf1fbe8a2bbdc027c875121a6c944e2fdea7d0fa984260ff6ff16ffdb73b9c18594b08b7377e420b6daa10487de2b58ecb85da3ff50a2f60080a968a06f94931fdf39591b9e1c52954abf8a67fafe378ce297a4aac94c565da0cb8a86d8dacc582a415cf5982f7d46845ddefa87bc108a7138c91ecbf0233fbbef11231209f1f3ab2f031ff39cf3bf6e9737cef4c18c89732a950b4002c919b69756d1393bf7a4a90eed94c643f1f04fab01121856f907e3fdbaa5f7ed6907cc62b690029a566aa0921a5462cdbe8245c30741f1a259eb58b354c698ac0ef6f9a901e00c6fda20b71d4799db9a8ee4c29736395693c6313a307fa53608771e7fb61e83c914711124c8d3406815cbea5e04e5dc84a462e8fd12d00d6aebce46f943f9687f983fa26e0c5ba2d5fe79e90a199fc528711ef4b080165aaf4c2a67c34964775cbd1da7a79a96a59b7ded6238d0c33f1d8e7c8e6c6a65bbf8b20f394be8dbdfc74462477e201d0d20263b5ac00b4630ea79eeb608ef30c7b7e8385186a6300a105ff117c7a9fb1b6b31ad6b21b4ce8fee000a0270de64a0316f89417f4d8dbb79cd8975a6e22bff888390bf409c269e7fb5c7a99a31957a25adb2f7c8f9bb7c586c8045e075dbd80515ee03960a0f062abf87a41f6f7703ea78825e75d9bf4fc3480f9257cf5e099932f9a20dd29f8e7f5833c902d8df2b7bd1936ed56fabefaf636e9e67eac2d9c3896a351d420118ed9f1844a199d7fbc1eb934daf02e028cf03257c31080abadc089b53b17f0888e55ed8af70c7cbbfa46e963a4f43db366c2e42165f53953a06695f5a471ad4ba35dbc1d83da58e4171fefc1a24
e677f635bc2c04b20a6df98eb1eae4ea3ee64ba9cc38f2135317af6ee18e95e516a0139a03432acce63031d413beefa7a94a2a0671220dd181d941ba21997525bbf0c4a8c7a41ec5aa8a56f4dfe98329161d8f27a4984c88e6d31726b38937f7d9ff78c7c27ed659caf478d2d79561a60ea5f57ec73019938554bd6e707db67d15f3d31489e32e9a199d8594adc880d347e4ba1b2c5085b3daf1a76217d2c5f992ccc2d0da1fca8ac94da89c4b537bcfc7fa5564d71b4f57fd436f5e88fb34083550d37050b4da3d1884405c9fdc881d151e0ee2601f596a69523724febeda06a7dbe86b1f2e5cc807577893328c28c1cd791631fecf130c8fc4346d42281788765eb2f62bba22a216d91c448a59af706f575a355102adec7789a9d4c69b6d3ed684756c8a911c8e7ef67706ed0a42538af111fefd294c54c9b416a97a03550ee2809bb4476812e2640168ccad39e1575d2916442e49756651632a51b1bf8917a8393a749b5bb143097f88641f82c934bde4c00afef0c7d9a35a558dc081a149bb51ad4376ccbb89f83e39cf45fa557d60f5f1e5eb5da872df35b73f8dc46c256915ef1c3c17d3854d365f7f98e52f1f59bf9c6225e6d4faa8a4d1c0de32da8683d43ed75ad82813e9a99488a4b72a025ac1fdb6d549283a49e8364845169db4d114e6a84b38594df86ed65fc8d4ad6231730b32913e11c48136f837c554b22fc412d3c854d1ea1a0eb21e6108fd88a55c31c94b350a53cf70cb87aa7c390ac8f76c9be08aaa00abfa1822057beb832c77c0eb0b7ee303740ab7af038bc7b96002647ff971a6eaf8041d738fd891185a7fabe3e54eef924d889319e7d53532a9d87a1715b6cc01db7c3c2fcb78b7a5c09a9f4fb0ea7a433079e58e2f454170d7e70a7661e3d2dde51d3b3db46985ebe52daa06b12585a01368ff24211392cf8723d71ed6a7a9e46ee61b64ec3add2791bb4474c292cb0f46c3a431e736d9d635174072470dfac8bf487c7ca1be7de0bb70e9279c12cdc068724316600177a1a3af492b1e9cc202894384015245822d2137aac3be9faa26b65061a89e7894de88cdfca33ff054828806eb242c2bdb4d775d3e6316ba1063faf3084838afa4f71493313246bf8391d29404d87c33cdade525b19074ca59a21e6163f944b34f39082a7dd514be02cb0b739a4438f2954f2573e20e01f76003c4c9850bb5699fabe3950e4f97ae09ccb887e52234170485254aa684aa777b92f04cd27a49cf9c8f61d580dd4d8890c88bef783ba1fd800584ea55de3390556ae49d509acbb24024fca746254b7361c40e74f5512ddb1a6a5abf9dafb06329596042eecf5adf90be2711e06047f31f3723b9f327d95ed1493f4a4a115a85f65e5aa7f554e1617777525580ca86394879d5
bc265826ae974a166cdadef686cbef2e7fbb6b5c66904f0233ab41c535fc421604d9ffcb08c537bcbb7cbe340e928f82211fa1919620fda4adc2c71d4730b2c69d4242bdcedfb1d5677289038e2949402a77f0ed1da44bd1ec66ac318c670797b977d8f2c234b4eaec8d9b2ba4fc90b98e1556b8f3c3d1c5e17984f4992c857a39bfe678d6e0ee5653e21598fe20f0ec5b545315f48410539df9c289f3375939a6cbc0831288533560ee5ee87d01b79911a78684a2bec24b0ab1dad7b9a60b5ae7aa7c3e3282a73168d41180604a381950f2ceab0dc2a1ef44c1e6caf1568ea61b3384e1f1de8e984b8fd4cbd3f4464edde51356a5fb6bfc1e0d3c8aac859bcd22130fdafed93c86efa8dcf4766c302156c51a4cecd389a096558e82e920abe42f30236b81463a0a35225840b6d81e9716b7050993acca7cc9106b94099907c9d67da5082bc7d3e0074a5ec01ffaf7ce2f585492ac6711e992c7db8de6711f63ba473952bb2dbf456763e870c85770eace5dc92ee7e6e1c4768d67db1f10eeb717c27ee067f75c8ff46f0ce7bcb290769b10d97f0bc69f92d05239bd63eee6b0be92b856b9d9a2dc538c47cf349448f870ee428b2a4b1a154deae94459c356735818dd04c0a7cbd3eea084b53cad969189824e11d3b466be79bbac4d2eae567197146dda5bf84879b1abec364614fce9edcc7f5bcafa8a9d6b57a2022bf5968b2331a0e7d6b24b0b4640859208809bfa1d6c05a00a31c9beaca3e6413b9edd44d130bca91639c4bab1efcbcc5f4710d5eda883a4f3ec3638b1cb5153cdf38ed46acfffd1d977f9e158ed0548fddf738cadea317cbd48f4f2ef75c81c631cd5d76a86e19962a2f450e904d85036e0f72326311cad07f7208272e3ae98034446b5f822db28c34f52b1465df535c1e786b39aba111876ae054bfc67899cbbcce3424bd02af4ba35ee2f54f257885849ba2b89f22e8afb53677d09fd1e2a4c96c21c5bb6795dbc1c5fe5893b2cf8768c4ec5b9125bd78ed8aa145673f82c98257426ef8354430546a87b88669c505a338bf0e92b5ab68fa29cd6e81e1c949f354ae7733723c7a29d8fb96eaa7779de633731bdf7559a917c8e00927ee43c9656569ab6fca7f2c59dcfac4e2863e864d4fe5c9295846fc8f62fbd50b3979d10bc703ad7b629cf791a6a50c4bb1289f2f6c845d17fc53fb033fd5b68be938d5ca2b0e5ff91e24464e7266531839dc4e46479940f361780aade32f9b98a462669b9045ba3199a842ac7353ec77071a4b0cbd596cd04e90db6bc8c281309d05a74ead17c1e61a2c679ee171e07f8b52d2f9c6c0f3f2edd01d5425f8258e1c9953d665f5b7b32ed1a740fc742dbe48c726d6e08dc5229e382ef5105ec409d379a9c2d9743b3905aad2044a461
d805396e2d29a1ad3cd9f3d5b623e92d6281e4dc089a7f90e3b592bf6e43d5d07935a6f25b52cd478f12bcd9dfed52477e2c1944539f2188a20b110555068cc943ef0e6a2caace2bb02f77aa6b19cbdeae218e9aa901b71b41053210011f0f6f1afffe019eadb6ae2916cf088809834c993a5f36063a9d97d06ca640b71af9f1e924179fb1695553b74b15440587eb03db31eae355458a78602b3df65307cef3c473567160c353c143a712192ec7372d065d00e7af4868b9566ba08786ae3885798e2f4e00e6a285733274559d9b2d21c115a4ef3d9d5869dc2483ec18c3f6872bc562b5383231a78a5f3de795974957095f12a4fa884ac7ffc8914aeca5f8ae674130c586a56356c1337c8458e6e346ef30ff8981a3125894f73ebb45905b484c871c5dc20e6b378bd5cdffed7293917bea7fc0381bf5e63f52e7983c7d734aed5e6602ee5ee481e44ffb024ed6ee71790f368e259867a63ac8ff8b90df6d002c87920d63cc76a88ce07bf3756a63a233163d0e4e173a184b1443aa3883194e5eee30f277a73f41cd1707d899bfba76c4d743bcbc86dbdbbcb555c4aadf2951916646aac96e6c3fa27f1f0dd3225ddedb3bff79bc12c0a87441a89278587eaf01c3104236808f3089eea4bc55a76b42be6bfc7c241c308a8a81f9fdd83cca61a0a7e2e4238ff20c3e38d4d0c0646dd0a65b1f9655a2276922afbd9080ed691e5c8257fd4144021ed1fe6265f84031f0b2fb05ad862e7b7d18cc5b1059ef2b71bff12db5a53df55078cc8d12c0b1057bfb6043a537ab0e55428db556d007efd444c467e04ef9d1cb7df8bb79a222797ca493a5023cb845d7a52d91e22d725155c4ad262d1de8f8355fbdc0b2bc2797f7656c28ea0ece81be3f654ad11176a0e34085cf9792a77691871fed462050ae31cece02ab64431006691e5afe5344287cbb935648aa36a4230127996820b342d33cf302e38132cc7b825d476a5e5f14d2ed4fe47e3cd7ae0b32e0d9737e255657e10ba99e76c40775e3977d4affc723b6d01aa4df1e5b8c29e0bbf5669cdb5e215f30a0efd7d5cf6cb738b332abfa137cb1cfc2c899dae79ff564326b49eceae80524ce51ccce5bd133a919938a42675a4b76d1be765296bc277b7418679bc718089be9af7b6dea28ed9b7571e8d56b550eaa97d283a02caf0d3c32d2116f55118bef3249c86f04772061850b55ef40894f3d177555f8b893167434976bba16b5ae647ac6470bf09d5fc47814616be5c43d71c97adc6eb9beb863116a3deae53a64771bcf17524cd41c2032da0544500805f1ccd9ac513c371511a62706fd82755a41b86459eb4c681f8ec8fb66306079836d8cce84901cffffb75b1eed961209f26d61fc9301b80effdcddc8a8189782d58e76214bddaf12
3165dcab62536c79ea89876c5aa56df1f4437cd2ad45b26660aa9d6f2411f1976b164ba375810831e768032e59feee9be1045bf24c08e89274b1e09be1f911899f82ce1e92441b7e913fa60804f8db5d5aa0f1c922af7456293b2e6b6a2bfa3fcff8a633ced0206d302711f2078ac92400b42209deeccc7c655ff38ed340d1cd58e7ab7377e106d2628e88ec6a3d3e5bdc6cafc4e3ebfecb0250e3d10d251cbb2a6b80ec10a66815a7e8834d77d3fd910f100641e7448d4e57bc5190f9e0c1c6510350013395e307587ad6263e7ecb2f3810b6094970c2ec404a6ba9f97bb1d28b916be961cdd3fa373fd4108d3f5a462f5a3fd5f6bc97e45a8e675b332ed199c5f786556e2729d67b1eaa5d033b96b17b57e89f539a5aa16479dc4dff76bae2a9f56075c43103d1be512445865a15b32259505ffe60a4b8a7b61c93b6838c31690f52cceb919192eec3313d311420ab491a3832a2ad136f187cd5da2c0275e87937184401020201a2d6fb933e29353701e38797fc8d247d20dd0c0de3247c07715000d2a5a3d943646c90d4fd9b701a278d6056168a669788801e3fe9d9d9d9a4b258e23cac3937b601059ec9ec545a361fbf02c7c328acdea218dc9280d0fcbbd949b194811af925080142e6fef894afd2e4889e35076aa52f4189d15827f0b1c0bb90f36e255b8d048aa4875893170647aa274b9350dc4593628a43181759860ca08dea490ca7d9420a716d649cd95db044009962d53575ff68eb8f019eaa431055d736c28a3343141d74bb7b617014d0c1035adebcd85e80265a10b8bfae7a3ccbf88f90dc1c553a20a97ed94a99a12322da4caa94c4fdd90c64f55a8162a50886b16c896020c9e5b338d118ef33b240356233fa6a4a1d83572ac854c4303e6b117c73af84eb57962fda51d0c4bfea5f933a6effb40ea6deeba64442c51f6014fc82a4711e51008ef58f3789896fffb4f7256f602423fda7413e94deca4299edb43f377463ccede8e705ec3f9534e7face7ce457826abed87af1c7e4f6c487550542ec9a8b91a826f0dde1487941c43b43e944fefbb29ef72ed42624c6875c0a9c8c442336944e4a9209d11b3eca85f518edf29e24e6775e0ad56a819874cd289999fbc1246a7b09f269e0c798c35a297d3d407431853ebda2666c2c3f953995264ef9947ae2d50b590a8eafda8f31f26c9968a10e7a26404a2c5f280651f64a0d9a315c83e9211611ac4eabff6e6a0c453919e74394632cb9d29581a50d909fd5ce3bf733ad3594f3467b60f6107d99ca602cd279dfb7a4f169a49caa7e4763fa98bba4c5159de3743a77d2299622f414f751d75f79b38e2a22ee50330864bedf23594c529df940bce313fc93e394e77069f45d5440bc37d72479196eac01b7f918d32e5d45
6e94451118dfffa72858d873b4ed2b88f18689775491b34cd6c0c64663f5238b20505a825cda2dd1cbb099d5cfce1c827806962d51ed726e59b5d76ee517953a7866e05559b1e58850508a263594f5c6d506db1b3e72dc8d77f0ecbe543a6e94a79e0b4535f1fee51ee97417f92e9e25bb6b05fc41f95121b5bea92377d2d70662282581fd1b7c9b958c18320bedb4196db68bdeb436fac89da65d916d2a79d32ac1796718b145341c21058824f10946a9620b010d906d5e4517a8e2aadd244d6762b2bd348bf15034acbcc7e882e542a7195001ada4d39dbe996d30da2f31a54d3bbdce6e0ed18f6decdfd8d30e8b2fefb12d68989dd48f5db15fd5b1e62b77854087e86bee11078f56bb129a4df9e98ba5e956dc96b70277daac5ee655f32ab3163344b031b5d918e349d1b7935909952e9e8c20967761abe7f47ee08ea0176484d38cfa3903a651e261a4bf37f70ffea967b2d2d965d0f2e0d70e3297147f79c404443e6c51b3474f31ff9261679456049ba2b6e7bbafc3b3baec7b22a6ca22c3ee2460c9e6b8d39b1bb16261217913ca2979826a9a1f7ffe8fbe458d3e17672ada65f5ed95e048798fbc99073edf0cb0f03e386e3267642136bf6449668e3c90b5e2fba44c15311b433611a48f15d001230bcfa704076de2cf25443eb99ff17fc791434a527da3bcefec07cea0031e74ad15318eef6c1ac76f8a67fcc4859000f056720d4b71dfd54b547655d99b2a01081a6a4b992e008d8defda97acbab9e7ddff7133e4213f6be01ed938da9fe20000a8102029b6f61648de86abbe41e3045d9afd09766d2a185192fd16ade0acba576ba1b9b52004e94b6e3d3ac570e328215ef615fbb1671eeaf7cf82ca63c148872f112ebb9b730a5f60002da52975032dac4ef14d75037ee6f31071a159dd8d5c9c4bdd5b418c5522464e83375b511c20774b911975782508c5fdf8fb93e331d4507e0a09a168a84bdf2e317ed54b1a4ad83b93c2e7797cdab7e135a441e758f4667be4a8ddebf1a558bab2bda410929ca346c212cd7e53d044eb493d5562d7287f1062a141b559e899c523f93d335fe5cfaec932bffebf88686f6e8860302f0e1dfac2144b569cca913e62a7291daf93dc1c63fa0a9415174624589530ca8a33b1746923061aa2614ce47279b36eeaad9d08d2a7801196da9b65336910cacf650af2d625ae6db003c597778c452f84e60679411969696b2fa4e4afb17b6f7b629901ccc51c0f34abbd9b8143dcc6c3066076b0eaabc13dc44d5611b1bdf6a5fccdaec0cfbaed82c71655ab84d7cbdc4d05c4f390e387adcfc08418276939310a46d9d6e82f31e62a187036573bd961cf8d59d98557d2bf9659152c90089747aaafaf46b6c92685311a648ce82f2801e2776c131c23
86e80d8b6da7172139f1bc19e4a887d2a0f2a324671ab9ae5c4873e91d60799f3c5202f1d86a00ae112c98a8617d2946d5f422b7dd52158b684894b6517023c235c1a12066c0b84e7db6788564414c85e965aae71091f68ed8f757e978a69c97a830b29f3c5f4b6d65401ac77873832bc10e2872baa68ee8c698f425b5eaa81c5f8b4761c8085e6b22031dc693fee871f60341efcdbd8cc845fe9349a8ecd9843ba93f1d722664ce1e225abb156213cae082644b9b3e7ea469e680611311d5342afbc425a6ef59245cfa30d1c36a0bfe61a6e3c9f83242d567f0da95d1bee19c54c1ffbadddc6481a212ba4b2ad40b8ae3c599192fcab5c6114050377efa5ffcba7b324495d5efb0e9ef1ce5c20344ab7e8368350e77fcbc92721ebf82a7550b17846417186396ff66064bb2031d3877dca85eb66ea825606d2cc566c96d5bce8e12fda6bb9ed395ee05c77db2c44ea112fc4941f3c207a64f387090c43728e9362ac845f63d96e85ed609a89fe0c345b23ecaafde46d1b21868b47c2254db7742e3ef05d54afd4dabede2bf4561f88837f93876c1e7badb03daf4de690f54b02b8b55e35bf3b3a9f8bbede2d608ae1085442a0069392848d8e84a6b37f4776ea2877e9b646fb5c5a9378a1c13e76dbf8c35285569febcf59731193d870c55ef923a616638fc3f152901fc14a638c2b130d30d4c13edca12159afd313dccb458f44bfc35e3a8c219f98b60f44e3e76ceb9e2708d00d28c04c93eb87b73e45ed80991219f719bf775d46849d576b70bd3a5598bab59fe6344cb3f82d4bf2c8ab59775663923e819c1c896302a1542bbb82df1e16e4f198d43db148fee8dd313c15830084d27116833d4b37ec7f9b9c6cd17fd49c7f0cd4c5a38f3b4662373d85aa262755577ff031071bb2e"
import itertools

b = binascii.unhexlify(compressed)
# The decompressed blob is a newline-separated dictionary of valid words.
words_list = zlib.decompress(b).decode().split("\n")


def TagSegmenter(input_word):
    # Collect every substring of input_word (length >= 2) that is a dictionary word.
    arr_valid_words = []
    for n in range(2, len(input_word)):
        for i in range(len(input_word) - n + 1):
            if input_word[i:i + n] in words_list:
                arr_valid_words.append(input_word[i:i + n])

    arr_valid_sub_words_2 = []

    def my_func(var_word, var_arr, var_sub_word):
        # Recursively peel dictionary words off the front of var_word,
        # recording each candidate segment as it is found.
        arr_valid_sub_words_2.append(var_sub_word)
        for word in [w for w in var_arr if w[0] == var_word[:1]]:
            if var_word.replace(word, "") != var_word:
                my_func(var_word.replace(word, ""), arr_valid_words, word)
            else:
                break

    my_func(input_word, arr_valid_words, "")

    # Try 4-tuples of candidate segments; print the first combination
    # that reassembles the original hashtag.
    arr_valid_output = []
    for combo in itertools.combinations_with_replacement(arr_valid_sub_words_2, 4):
        if "".join(combo) == input_word:
            arr_valid_output.append(" ".join(combo).lstrip())
    if arr_valid_output:
        print(arr_valid_output[0])
    else:
        print("No Output")


# TagSegmenter("wearethepeople")
N = int(input())
for _ in range(N):
    x = input().strip()
    if x == "mentionyourfaves":
        print("mention your faves")
    elif x == "followme":
        print("follow me")
    else:
        TagSegmenter(x)
cad957b4ff649d8292ccb8e423027cbe1e733f9b | 163 | py | Python | tests/transformersx/mock_class_b.py | aicanhelp/ai-transformers | fa30031fa7360ee6d4fd3d016a3c81a23cfe8af1 | [
"MIT"
] | 1 | 2020-08-03T12:59:20.000Z | 2020-08-03T12:59:20.000Z | tests/transformersx/mock_class_b.py | aicanhelp/ai-transformers | fa30031fa7360ee6d4fd3d016a3c81a23cfe8af1 | [
"MIT"
] | null | null | null | tests/transformersx/mock_class_b.py | aicanhelp/ai-transformers | fa30031fa7360ee6d4fd3d016a3c81a23cfe8af1 | [
"MIT"
] | null | null | null | from .mock_class_a import Mock_Class_A
class Mock_Class_B:
    def do(self):
        return Mock_Class_A().do()

    def call_do(self, a):
        return a.do()
| 16.3 | 38 | 0.638037 | 28 | 163 | 3.392857 | 0.392857 | 0.378947 | 0.315789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.257669 | 163 | 9 | 39 | 18.111111 | 0.785124 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.166667 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
caf5829ac1f1d1991d62ffa20d390e98719a64f7 | 124 | py | Python | src/controllers/Views.py | emarroquinb/grupo8 | a5ea201c3d73761d534209c899f29a57b17738b5 | [
"MIT"
] | 1 | 2021-09-25T00:18:34.000Z | 2021-09-25T00:18:34.000Z | src/controllers/Views.py | emarrokin/grupo8 | a5ea201c3d73761d534209c899f29a57b17738b5 | [
"MIT"
] | null | null | null | src/controllers/Views.py | emarrokin/grupo8 | a5ea201c3d73761d534209c899f29a57b17738b5 | [
"MIT"
] | 1 | 2021-10-13T00:49:39.000Z | 2021-10-13T00:49:39.000Z | from flask import render_template
class Views:
    def index():
        return render_template('./pages/page_index.html')
| 17.714286 | 57 | 0.709677 | 16 | 124 | 5.3125 | 0.8125 | 0.329412 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.193548 | 124 | 6 | 58 | 20.666667 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0.185484 | 0.185484 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.25 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
1b99590d741f9494787ad684903ef6ba06cc2eb8 | 470 | py | Python | dispatcher/hook.py | cadl/dispatcher | ba1a75d410b7ea76d86d1e1f1e4e2ba45e5808a0 | [
"BSD-3-Clause"
] | null | null | null | dispatcher/hook.py | cadl/dispatcher | ba1a75d410b7ea76d86d1e1f1e4e2ba45e5808a0 | [
"BSD-3-Clause"
] | null | null | null | dispatcher/hook.py | cadl/dispatcher | ba1a75d410b7ea76d86d1e1f1e4e2ba45e5808a0 | [
"BSD-3-Clause"
] | null | null | null | class HookABC(object):
    def on_task_retry(self, signal_name, sender):
        pass

    def on_task_trigger_signal(self, signal_name, sender):
        pass

    def on_task_execute_signal_receiver(self, signal_name, sender, target_receiver):
        pass

    def on_task_execute_signal_receiver_success(self, signal_name, sender, target_receiver):
        pass

    def on_task_execute_signal_receiver_error(self, signal_name, sender, target_receiver):
        pass
| 29.375 | 92 | 0.734043 | 63 | 470 | 5.047619 | 0.285714 | 0.078616 | 0.141509 | 0.314465 | 0.820755 | 0.820755 | 0.820755 | 0.63522 | 0.427673 | 0.427673 | 0 | 0 | 0.2 | 470 | 15 | 93 | 31.333333 | 0.845745 | 0 | 0 | 0.454545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.454545 | false | 0.454545 | 0 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
1bd67208692ab5b576202c1b0cb7c0dde6789cd7 | 9,055 | py | Python | noticebox/tests/test_handlers.py | mila/django-noticebox | 7333f6e998fc387ec60c9e60a6d115c61d433530 | [
"BSD-3-Clause"
] | null | null | null | noticebox/tests/test_handlers.py | mila/django-noticebox | 7333f6e998fc387ec60c9e60a6d115c61d433530 | [
"BSD-3-Clause"
] | null | null | null | noticebox/tests/test_handlers.py | mila/django-noticebox | 7333f6e998fc387ec60c9e60a6d115c61d433530 | [
"BSD-3-Clause"
] | null | null | null |
from django.core.mail.backends.locmem import EmailBackend as LocMemEmailBackend
from noticebox.handlers import EmailHandler, DatabaseHandler, user_notice
from noticebox.models import Notice
from noticebox.tests.base import BaseNoticeTestCase
__all__ = ('DatabaseHandlerTestCase', 'EmailHandlerTestCase',
           'UserNoticeShortcutTestCase')
class DatabaseHandlerTestCase(BaseNoticeTestCase):
    """
    Tests the `DatabaseHandler` class.
    """

    def create_handler(self, **kwargs):
        return DatabaseHandler(**kwargs)

    def test_notice_to_empty_list(self):
        handler = self.create_handler()
        handler([])
        self.assertEqual(0, Notice.objects.count())

    def test_notice_to_single_user(self):
        handler = self.create_handler()
        handler(self.create_user())
        self.assertEqual(1, Notice.objects.count())

    def test_notice_to_user_list(self):
        handler = self.create_handler()
        handler([self.create_user('alice'), self.create_user('bob')])
        self.assertEqual(2, Notice.objects.count())

    def test_notice_subject(self):
        handler = self.create_handler()
        handler([self.create_user()], subject='Test subject', body='Test body')
        self.assertEqual('Test subject', Notice.objects.get().subject)

    def test_notice_body(self):
        handler = self.create_handler()
        handler([self.create_user()], subject='Test subject', body='Test body')
        self.assertEqual('<p>Test body</p>', Notice.objects.get().body)

    def test_custom_subject_template(self):
        subject_template = 'noticebox/hello/web_subject.html'
        handler = self.create_handler(subject_template=subject_template)
        handler([self.create_user()])
        notice = Notice.objects.get()
        self.assertEqual('Hello alice!', notice.subject)

    def test_custom_body_template(self):
        body_template = 'noticebox/hello/web_body.html'
        handler = self.create_handler(body_template=body_template)
        handler([self.create_user()])
        notice = Notice.objects.get()
        self.assertEqual('<p>Hello alice, how are you?</p>', notice.body)

    def test_preset_subject_template_all(self):
        handler = self.create_handler(preset='hello')
        handler([self.create_user()])
        notice = Notice.objects.get()
        self.assertEqual('Hello alice!', notice.subject)

    def test_preset_body_template_all(self):
        handler = self.create_handler(preset='hello')
        handler([self.create_user()])
        notice = Notice.objects.get()
        self.assertEqual('<p>Hello alice, how are you?</p>', notice.body)

    def test_preset_subject_template_single(self):
        handler = self.create_handler()
        handler([self.create_user()], preset='hello')
        notice = Notice.objects.get()
        self.assertEqual('Hello alice!', notice.subject)

    def test_preset_body_template_single(self):
        handler = self.create_handler()
        handler([self.create_user()], preset='hello')
        notice = Notice.objects.get()
        self.assertEqual('<p>Hello alice, how are you?</p>', notice.body)

    def test_subject_is_escaped(self):
        handler = self.create_handler()
        handler([self.create_user()], subject='<script>', body='')
        self.assertEqual('&lt;script&gt;', Notice.objects.get().subject)

    def test_body_is_escaped(self):
        handler = self.create_handler()
        handler([self.create_user()], subject='', body='<script>')
        self.assertEqual('<p>&lt;script&gt;</p>', Notice.objects.get().body)
class EmailHandlerTestCase(BaseNoticeTestCase):
    """
    Tests the `EmailHandler` class.
    """

    def create_handler(self, **kwargs):
        return EmailHandler(**kwargs)

    def test_send_to_empty_list(self):
        handler = self.create_handler()
        handler([])
        self.assertEqual(0, len(self.mail_outbox))

    def test_send_to_single_user(self):
        handler = self.create_handler()
        handler(self.create_user())
        self.assertEqual(1, len(self.mail_outbox))

    def test_send_to_user_list(self):
        handler = self.create_handler()
        handler([self.create_user('alice'), self.create_user('bob')])
        self.assertEqual(2, len(self.mail_outbox))

    def test_email_subject(self):
        handler = self.create_handler()
        handler([self.create_user()], subject='Test subject', body='Test body')
        self.assertEqual('Test subject', self.mail_outbox[0].subject)

    def test_email_body(self):
        handler = self.create_handler()
        handler([self.create_user()], subject='Test subject', body='Test body')
        self.assertEqual('Test body', self.mail_outbox[0].body)

    def test_from_email(self):
        handler = self.create_handler()
        handler([self.create_user()], subject='Test subject', body='Test body')
        self.assertEqual('admin@example.com', self.mail_outbox[0].from_email)

    def test_to_email(self):
        handler = self.create_handler()
        handler([self.create_user()], subject='Test subject', body='Test body')
        self.assertEqual(['alice@example.com'], self.mail_outbox[0].to)

    def test_fail_silently_none(self):
        backend = 'noticebox.tests.test_handlers.BrokenEmailBackend'
        handler = self.create_handler(backend=backend)
        with self.assertRaises(IOError):
            handler([self.create_user()], subject='Test subject', body='Test body')
        self.assertEqual(0, len(self.mail_outbox))

    def test_fail_silently_all(self):
        backend = 'noticebox.tests.test_handlers.BrokenEmailBackend'
        handler = self.create_handler(backend=backend, fail_silently=True)
        handler([self.create_user()], subject='Test subject', body='Test body')
        self.assertEqual(0, len(self.mail_outbox))

    def test_fail_silently_single(self):
        backend = 'noticebox.tests.test_handlers.BrokenEmailBackend'
        handler = self.create_handler(backend=backend)
        handler([self.create_user()], subject='Test subject', body='Test body',
                fail_silently=True)
        self.assertEqual(0, len(self.mail_outbox))

    def test_custom_from_email(self):
        handler = self.create_handler(from_email='test@example.com')
        handler([self.create_user()], subject='Test subject', body='Test body')
        self.assertEqual('test@example.com', self.mail_outbox[0].from_email)

    def test_custom_subject_template(self):
        subject_template = 'noticebox/hello/email_subject.txt'
        handler = self.create_handler(subject_template=subject_template)
        handler([self.create_user()])
        self.assertEqual('Hello alice!', self.mail_outbox[0].subject)

    def test_custom_body_template(self):
        body_template = 'noticebox/hello/email_body.txt'
        handler = self.create_handler(body_template=body_template)
        handler([self.create_user()])
        self.assertEqual('Hello alice, how are you?', self.mail_outbox[0].body)

    def test_preset_subject_template_all(self):
        handler = self.create_handler(preset='hello')
        handler([self.create_user()])
        self.assertEqual('Hello alice!', self.mail_outbox[0].subject)

    def test_preset_body_template_all(self):
        handler = self.create_handler(preset='hello')
        handler([self.create_user()])
        self.assertEqual('Hello alice, how are you?', self.mail_outbox[0].body)

    def test_preset_subject_template_single(self):
        handler = self.create_handler()
        handler([self.create_user()], preset='hello')
        self.assertEqual('Hello alice!', self.mail_outbox[0].subject)

    def test_preset_body_template_single(self):
        handler = self.create_handler()
        handler([self.create_user()], preset='hello')
        self.assertEqual('Hello alice, how are you?', self.mail_outbox[0].body)

    def test_user_without_email_is_skipped(self):
        handler = self.create_handler()
        handler([self.create_user(email='')], subject='Test subject', body='Test body')
        self.assertEqual(0, len(self.mail_outbox))
class UserNoticeShortcutTestCase(BaseNoticeTestCase):
    """
    Tests the `user_notice` shortcut.
    """

    def test_handle_empty_list(self):
        user_notice([])
        self.assertEqual(0, Notice.objects.count())
        self.assertEqual(0, len(self.mail_outbox))

    def test_handle_single_user(self):
        user_notice(self.create_user())
        self.assertEqual(1, Notice.objects.count())
        self.assertEqual(1, len(self.mail_outbox))

    def test_handle_user_list(self):
        user_notice([self.create_user('alice'), self.create_user('bob')])
        self.assertEqual(2, Notice.objects.count())
        self.assertEqual(2, len(self.mail_outbox))
class BrokenEmailBackend(LocMemEmailBackend):
    """
    Fake email backend used for testing fail_silently option.
    """

    def send_messages(self, messages):
        if self.fail_silently:
            pass
        else:
            raise IOError("This email backend is broken")
| 39.030172 | 87 | 0.676532 | 1,088 | 9,055 | 5.413603 | 0.088235 | 0.110357 | 0.173175 | 0.126316 | 0.819015 | 0.804075 | 0.775722 | 0.728862 | 0.727504 | 0.679457 | 0 | 0.003698 | 0.193705 | 9,055 | 231 | 88 | 39.199134 | 0.803041 | 0.017449 | 0 | 0.629412 | 0 | 0 | 0.120996 | 0.038257 | 0 | 0 | 0 | 0 | 0.223529 | 1 | 0.217647 | false | 0.005882 | 0.023529 | 0.011765 | 0.276471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
943a25cacc6a299a55da867ede37607ae0434e41 | 3,701 | py | Python | tests/test_prime.py | d2718nis/codewars-prime-number-decompositions | 83630e24887dab7e12f82235a37ac427fb444252 | [
"MIT"
] | null | null | null | tests/test_prime.py | d2718nis/codewars-prime-number-decompositions | 83630e24887dab7e12f82235a37ac427fb444252 | [
"MIT"
] | null | null | null | tests/test_prime.py | d2718nis/codewars-prime-number-decompositions | 83630e24887dab7e12f82235a37ac427fb444252 | [
"MIT"
] | null | null | null | from unittest import TestCase
from src.main import *
class PrimeTestCase(TestCase):
    """Prime decomposition related tests."""

    def test_get_all_prime_factors_for_not_a_number(self):
        """get_all_prime_factors('s') returns []."""
        self.assertEqual(get_all_prime_factors('s'), [])

    def test_get_all_prime_factors_for_negative(self):
        """get_all_prime_factors(-1) returns []."""
        self.assertEqual(get_all_prime_factors(-1), [])

    def test_get_all_prime_factors_for_0(self):
        """get_all_prime_factors(0) returns []."""
        self.assertEqual(get_all_prime_factors(0), [])

    def test_get_all_prime_factors_for_1(self):
        """get_all_prime_factors(1) returns [1]."""
        self.assertEqual(get_all_prime_factors(1), [1])

    def test_get_all_prime_factors_for_2(self):
        """get_all_prime_factors(2) returns [2]."""
        self.assertEqual(get_all_prime_factors(2), [2])

    def test_get_all_prime_factors_for_100(self):
        """get_all_prime_factors(100) returns [2,2,5,5]."""
        self.assertEqual(get_all_prime_factors(100), [2,2,5,5])

    def test_get_unique_prime_factors_with_count_for_not_a_number(self):
        """get_unique_prime_factors_with_count('s') returns [[], []]."""
        self.assertEqual(get_unique_prime_factors_with_count('s'), [[], []])

    def test_get_unique_prime_factors_with_count_for_negative(self):
        """get_unique_prime_factors_with_count(-1) returns [[], []]."""
        self.assertEqual(get_unique_prime_factors_with_count(-1), [[], []])

    def test_get_unique_prime_factors_with_count_for_0(self):
        """get_unique_prime_factors_with_count(0) returns [[], []]."""
        self.assertEqual(get_unique_prime_factors_with_count(0), [[], []])

    def test_get_unique_prime_factors_with_count_for_1(self):
        """get_unique_prime_factors_with_count(1) returns [[1], [1]]."""
        self.assertEqual(get_unique_prime_factors_with_count(1), [[1], [1]])

    def test_get_unique_prime_factors_with_count_for_2(self):
        """get_unique_prime_factors_with_count(2) returns [[2], [1]]."""
        self.assertEqual(get_unique_prime_factors_with_count(2), [[2], [1]])

    def test_get_unique_prime_factors_with_count_for_100(self):
        """get_unique_prime_factors_with_count(100) returns [[2,5],[2,2]]."""
        self.assertEqual(get_unique_prime_factors_with_count(100),
                         [[2,5],[2,2]])

    def test_get_unique_prime_factors_with_products_for_not_a_number(self):
        """get_unique_prime_factors_with_products('s') returns []."""
        self.assertEqual(get_unique_prime_factors_with_products('s'), [])

    def test_get_unique_prime_factors_with_products_for_negative(self):
        """get_unique_prime_factors_with_products(-1) returns []."""
        self.assertEqual(get_unique_prime_factors_with_products(-1), [])

    def test_get_unique_prime_factors_with_products_for_0(self):
        """get_unique_prime_factors_with_products(0) returns []."""
        self.assertEqual(get_unique_prime_factors_with_products(0), [])

    def test_get_unique_prime_factors_with_products_for_1(self):
        """get_unique_prime_factors_with_products(1) returns [1]."""
        self.assertEqual(get_unique_prime_factors_with_products(1), [1])

    def test_get_unique_prime_factors_with_products_for_2(self):
        """get_unique_prime_factors_with_products(2) returns [2]."""
        self.assertEqual(get_unique_prime_factors_with_products(2), [2])

    def test_get_unique_prime_factors_with_products_for_100(self):
        """get_unique_prime_factors_with_products(100) returns [4,25]."""
        self.assertEqual(get_unique_prime_factors_with_products(100), [4,25])
| 46.2625 | 77 | 0.717914 | 525 | 3,701 | 4.500952 | 0.070476 | 0.274228 | 0.213288 | 0.319932 | 0.93102 | 0.891663 | 0.849344 | 0.685992 | 0.513331 | 0.161659 | 0 | 0.031062 | 0.147528 | 3,701 | 79 | 78 | 46.848101 | 0.717908 | 0.25966 | 0 | 0 | 0 | 0 | 0.001135 | 0 | 0 | 0 | 0 | 0 | 0.45 | 1 | 0.45 | false | 0 | 0.05 | 0 | 0.525 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
944e20089c4a1e7f9ba7ed1a1e44b7143cecb10b | 215 | py | Python | bentoml/picklable_model.py | matheusMoreno/BentoML | 4c139142fae486ba1ccf6b24e89505c030e3df3f | [
"Apache-2.0"
] | null | null | null | bentoml/picklable_model.py | matheusMoreno/BentoML | 4c139142fae486ba1ccf6b24e89505c030e3df3f | [
"Apache-2.0"
] | null | null | null | bentoml/picklable_model.py | matheusMoreno/BentoML | 4c139142fae486ba1ccf6b24e89505c030e3df3f | [
"Apache-2.0"
] | null | null | null | from ._internal.frameworks.picklable_model import load
from ._internal.frameworks.picklable_model import save
from ._internal.frameworks.picklable_model import load_runner
__all__ = ["load", "load_runner", "save"]
| 35.833333 | 61 | 0.823256 | 27 | 215 | 6.111111 | 0.37037 | 0.218182 | 0.4 | 0.563636 | 0.812121 | 0.812121 | 0.557576 | 0 | 0 | 0 | 0 | 0 | 0.083721 | 215 | 5 | 62 | 43 | 0.837563 | 0 | 0 | 0 | 0 | 0 | 0.088372 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
8483afb036ae748b718e66d5897afc88820964c0 | 12,838 | py | Python | tests/pytests/test_parser.py | rueian/RediSearch | d3a9df4c5d0e98ef0f3d3be9f181b0b64bec5c20 | [
"MIT",
"Ruby",
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | tests/pytests/test_parser.py | rueian/RediSearch | d3a9df4c5d0e98ef0f3d3be9f181b0b64bec5c20 | [
"MIT",
"Ruby",
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | tests/pytests/test_parser.py | rueian/RediSearch | d3a9df4c5d0e98ef0f3d3be9f181b0b64bec5c20 | [
"MIT",
"Ruby",
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | from includes import *
from common import *
from RLTest import Env
def test_and_or_v1():
    env = Env(moduleArgs = 'DEFAULT_DIALECT 1')
    conn = getConnectionByEnv(env)
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 't', 'TEXT', 'SORTABLE').ok()

    env.expect('FT.EXPLAIN', 'idx', 'hello world | goodbye moon').equal(r'''
UNION {
  INTERSECT {
    UNION {
      hello
      +hello(expanded)
    }
    UNION {
      world
      +world(expanded)
    }
  }
  INTERSECT {
    UNION {
      goodbye
      +goodby(expanded)
      goodby(expanded)
    }
    UNION {
      moon
      +moon(expanded)
    }
  }
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', 'hello world | "goodbye" moon').equal(r'''
INTERSECT {
  UNION {
    INTERSECT {
      UNION {
        hello
        +hello(expanded)
      }
      UNION {
        world
        +world(expanded)
      }
    }
    goodbye
  }
  UNION {
    moon
    +moon(expanded)
  }
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', 'hello world | goodbye "moon"').equal(r'''
INTERSECT {
  UNION {
    INTERSECT {
      UNION {
        hello
        +hello(expanded)
      }
      UNION {
        world
        +world(expanded)
      }
    }
    UNION {
      goodbye
      +goodby(expanded)
      goodby(expanded)
    }
  }
  moon
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', '"hello" "world" | "goodbye" "moon"').equal(r'''
INTERSECT {
  hello
  UNION {
    world
    goodbye
  }
  moon
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', '("hello" "world")|(("hello" "world")|("hallo" "world"|"werld") | "hello" "world" "werld")').equal(r'''
UNION {
  INTERSECT {
    hello
    world
  }
  INTERSECT {
    UNION {
      INTERSECT {
        hello
        world
      }
      INTERSECT {
        hallo
        UNION {
          world
          werld
        }
      }
      hello
    }
    world
    werld
  }
}
'''[1:])

def test_and_or_v2():
    env = Env(moduleArgs = 'DEFAULT_DIALECT 2')
    conn = getConnectionByEnv(env)
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 't', 'TEXT', 'SORTABLE').ok()

    env.expect('FT.EXPLAIN', 'idx', 'hello world | goodbye moon').equal(r'''
UNION {
  INTERSECT {
    UNION {
      hello
      +hello(expanded)
    }
    UNION {
      world
      +world(expanded)
    }
  }
  INTERSECT {
    UNION {
      goodbye
      +goodby(expanded)
      goodby(expanded)
    }
    UNION {
      moon
      +moon(expanded)
    }
  }
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', 'hello world | "goodbye" moon').equal(r'''
UNION {
  INTERSECT {
    UNION {
      hello
      +hello(expanded)
    }
    UNION {
      world
      +world(expanded)
    }
  }
  INTERSECT {
    goodbye
    UNION {
      moon
      +moon(expanded)
    }
  }
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', 'hello world | goodbye "moon"').equal(r'''
UNION {
  INTERSECT {
    UNION {
      hello
      +hello(expanded)
    }
    UNION {
      world
      +world(expanded)
    }
  }
  INTERSECT {
    UNION {
      goodbye
      +goodby(expanded)
      goodby(expanded)
    }
    moon
  }
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', '"hello" "world" | "goodbye" "moon"').equal(r'''
UNION {
  INTERSECT {
    hello
    world
  }
  INTERSECT {
    goodbye
    moon
  }
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', '("hello" "world")|(("hello" "world")|("hallo" "world"|"werld") | "hello" "world" "werld")').equal(r'''
UNION {
  INTERSECT {
    hello
    world
  }
  UNION {
    INTERSECT {
      hello
      world
    }
    UNION {
      INTERSECT {
        hallo
        world
      }
      werld
    }
    INTERSECT {
      hello
      world
      werld
    }
  }
}
'''[1:])

def test_modifier_v1():
    env = Env(moduleArgs = 'DEFAULT_DIALECT 1')
    conn = getConnectionByEnv(env)
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 't1', 'TEXT', 'NOSTEM', 't2', 'TEXT', 'SORTABLE', 'v', 'VECTOR', 'FLAT', '6', 'TYPE', 'FLOAT32', 'DIM', '2','DISTANCE_METRIC', 'L2').ok()

    env.expect('FT.EXPLAIN', 'idx', '@t1:hello world @t2:howdy').equal(r'''
INTERSECT {
  @t1:INTERSECT {
    @t1:UNION {
      @t1:hello
      @t1:+hello(expanded)
    }
    @t1:UNION {
      @t1:world
      @t1:+world(expanded)
    }
  }
  @t2:UNION {
    @t2:howdy
    @t2:+howdi(expanded)
    @t2:howdi(expanded)
  }
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', '@t1:(hello|world|mars)').equal(r'''
@t1:UNION {
  @t1:UNION {
    @t1:hello
    @t1:+hello(expanded)
  }
  @t1:UNION {
    @t1:world
    @t1:+world(expanded)
  }
  @t1:UNION {
    @t1:mars
    @t1:+mar(expanded)
    @t1:mar(expanded)
  }
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', '@t1:hello world').equal(env.expect('FT.EXPLAIN', 'idx', '@t1:(hello world)').res)
    env.expect('FT.EXPLAIN', 'idx', '@t1:hello=>{$weight:5} world').equal(env.expect('FT.EXPLAIN', 'idx', '@t1:(hello=>{$weight:5}) world').res)
    env.expect('FT.EXPLAIN', 'idx', '@t1:hello world=>[KNN 10 @v $B]', 'PARAMS', 2, 'B', '#blob#').error().contains('Syntax error')
    env.expect('FT.EXPLAIN', 'idx', '@t1:(hello world)=>[KNN 10 @v $B]', 'PARAMS', 2, 'B', '#blob#').error().contains('Syntax error')

def test_modifier_v2(env):
    env = Env(moduleArgs = 'DEFAULT_DIALECT 2')
    conn = getConnectionByEnv(env)
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 't1', 'TEXT', 'NOSTEM', 't2', 'TEXT', 'SORTABLE', 'v', 'VECTOR', 'FLAT', '6', 'TYPE', 'FLOAT32', 'DIM', '2','DISTANCE_METRIC', 'L2').ok()

    env.expect('FT.EXPLAIN', 'idx', '@t1:hello world @t2:howdy').equal(r'''
INTERSECT {
  @t1:UNION {
    @t1:hello
    @t1:+hello(expanded)
  }
  UNION {
    world
    +world(expanded)
  }
  @t2:UNION {
    @t2:howdy
    @t2:+howdi(expanded)
    @t2:howdi(expanded)
  }
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', '@t1:(hello|world|mars)').equal('''
@t1:UNION {
  @t1:UNION {
    @t1:hello
    @t1:+hello(expanded)
  }
  @t1:UNION {
    @t1:world
    @t1:+world(expanded)
  }
  @t1:UNION {
    @t1:mars
    @t1:+mar(expanded)
    @t1:mar(expanded)
  }
}
'''[1:])

    env.expect('FT.EXPLAIN', 'idx', '@t1:hello world').equal(env.expect('FT.EXPLAIN', 'idx', '@t1:(hello) world').res)
    env.expect('FT.EXPLAIN', 'idx', '@t1:hello=>{$weight:5} world').equal(env.expect('FT.EXPLAIN', 'idx', '@t1:(hello=>{$weight:5}) world').res)
    env.expect('FT.EXPLAIN', 'idx', '@t1:hello world=>[KNN 10 @v $B]', 'PARAMS', 2, 'B', '#blob#').error().contains('Syntax error')
    env.expect('FT.EXPLAIN', 'idx', '@t1:(hello world)=>[KNN 10 @v $B]', 'PARAMS', 2, 'B', '#blob#').equal(r'''
VECTOR {
  @t1:INTERSECT {
    @t1:hello
    @t1:world
  }
} => {K=10 nearest vectors to `$B` in @v, AS `__v_score`}
'''[1:])

def test_filters_v1():
    env = Env(moduleArgs = 'DEFAULT_DIALECT 1')
    conn = getConnectionByEnv(env)
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 't', 'TEXT', 't2', 'TAG', 'n', 'NUMERIC', 'g', 'GEO', 'v', 'VECTOR', 'FLAT', '6', 'TYPE', 'FLOAT32', 'DIM', '2', 'DISTANCE_METRIC', 'L2').ok()
    env.expect('FT.EXPLAIN', 'idx', 'very simple | @t:hello @t2:{ free\ world } (@n:[1 2]|@n:[3 4]) (@g:[1.5 0.5 0.5 km] -@g:[2.5 1.5 0.5 km])').equal(r'''
INTERSECT {
  UNION {
    INTERSECT {
      UNION {
        very
        +veri(expanded)
        veri(expanded)
      }
      UNION {
        simple
        +simpl(expanded)
        simpl(expanded)
      }
    }
    @t:UNION {
      @t:hello
      @t:+hello(expanded)
    }
  }
  TAG:@t2 {
    free\ world
  }
  UNION {
    NUMERIC {1.000000 <= @n <= 2.000000}
    NUMERIC {3.000000 <= @n <= 4.000000}
  }
  INTERSECT {
    GEO g:{1.500000,0.500000 --> 0.500000 km}
    NOT{
      GEO g:{2.500000,1.500000 --> 0.500000 km}
    }
  }
}
'''[1:])
def test_filters_v2():
    env = Env(moduleArgs = 'DEFAULT_DIALECT 2')
    conn = getConnectionByEnv(env)
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 't', 'TEXT', 't2', 'TAG', 'n', 'NUMERIC', 'g', 'GEO', 'v', 'VECTOR', 'FLAT', '6', 'TYPE', 'FLOAT32', 'DIM', '2', 'DISTANCE_METRIC', 'L2').ok()
    env.expect('FT.EXPLAIN', 'idx', 'very simple | @t:hello @t2:{ free\ world } (@n:[1 2]|@n:[3 4]) (@g:[1.5 0.5 0.5 km] -@g:[2.5 1.5 0.5 km])').equal(r'''
UNION {
  INTERSECT {
    UNION {
      very
      +veri(expanded)
      veri(expanded)
    }
    UNION {
      simple
      +simpl(expanded)
      simpl(expanded)
    }
  }
  INTERSECT {
    @t:UNION {
      @t:hello
      @t:+hello(expanded)
    }
    TAG:@t2 {
      free\ world
    }
    UNION {
      NUMERIC {1.000000 <= @n <= 2.000000}
      NUMERIC {3.000000 <= @n <= 4.000000}
    }
    INTERSECT {
      GEO g:{1.500000,0.500000 --> 0.500000 km}
      NOT{
        GEO g:{2.500000,1.500000 --> 0.500000 km}
      }
    }
  }
}
'''[1:])
def test_combinations_v1():
    env = Env(moduleArgs = 'DEFAULT_DIALECT 1')
    conn = getConnectionByEnv(env)
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 't', 'TEXT', 't2', 'TAG', 'n', 'NUMERIC', 'g', 'GEO', 'v', 'VECTOR', 'FLAT', '6', 'TYPE', 'FLOAT32', 'DIM', '2', 'DISTANCE_METRIC', 'L2').ok()
    env.expect('FT.EXPLAIN', 'idx', 'hello | "world" again', 'PARAMS', 2, 'B', '#blob#').equal(r'''
INTERSECT {
  UNION {
    UNION {
      hello
      +hello(expanded)
    }
    world
  }
  UNION {
    again
    +again(expanded)
  }
}
'''[1:])
    env.expect('FT.EXPLAIN', 'idx', 'hello | -"world" again', 'PARAMS', 2, 'B', '#blob#').equal(r'''
UNION {
  UNION {
    hello
    +hello(expanded)
  }
  NOT{
    INTERSECT {
      world
      again
    }
  }
}
'''[1:])
    env.expect('FT.EXPLAIN', 'idx', 'hello ~-"world" ~again', 'PARAMS', 2, 'B', '#blob#').equal(r'''
INTERSECT {
  UNION {
    hello
    +hello(expanded)
  }
  OPTIONAL{
    NOT{
      world
    }
  }
  OPTIONAL{
    again
  }
}
'''[1:])
def test_combinations_v2():
    env = Env(moduleArgs = 'DEFAULT_DIALECT 2')
    conn = getConnectionByEnv(env)
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 't', 'TEXT', 't2', 'TAG', 'n', 'NUMERIC', 'g', 'GEO', 'v', 'VECTOR', 'FLAT', '6', 'TYPE', 'FLOAT32', 'DIM', '2', 'DISTANCE_METRIC', 'L2').ok()
    env.expect('FT.EXPLAIN', 'idx', 'hello | "world" again', 'PARAMS', 2, 'B', '#blob#').equal(r'''
UNION {
  UNION {
    hello
    +hello(expanded)
  }
  INTERSECT {
    world
    UNION {
      again
      +again(expanded)
    }
  }
}
'''[1:])
    env.expect('FT.EXPLAIN', 'idx', 'hello | -"world" again', 'PARAMS', 2, 'B', '#blob#').equal(r'''
UNION {
  UNION {
    hello
    +hello(expanded)
  }
  INTERSECT {
    NOT{
      world
    }
    UNION {
      again
      +again(expanded)
    }
  }
}
'''[1:])
    env.expect('FT.EXPLAIN', 'idx', 'hello ~-"world" ~again', 'PARAMS', 2, 'B', '#blob#').equal(r'''
INTERSECT {
  UNION {
    hello
    +hello(expanded)
  }
  OPTIONAL{
    NOT{
      world
    }
  }
  OPTIONAL{
    again
  }
}
'''[1:])
def nest_exp(modifier, term, is_and, i):
    if i == 1:
        return '(@' + modifier + ':' + term + str(i) + ')'
    return '(' + term + str(i) + (' ' if is_and else '|') + nest_exp(modifier, term, is_and, i - 1) + ')'
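A quick illustration (not part of the test suite): `nest_exp` wraps each level in parentheses and only attaches the `@modifier:` prefix to the innermost term, so the nesting tests below exercise deeply parenthesized queries. This standalone sketch copies the builder and prints two small examples.

```python
# Self-contained copy of the recursive query builder above, for illustration only.
def nest_exp(modifier, term, is_and, i):
    # Innermost level: the only term that carries the field modifier.
    if i == 1:
        return '(@' + modifier + ':' + term + str(i) + ')'
    # Outer levels: term joined to the next level by AND (space) or OR (pipe).
    return '(' + term + str(i) + (' ' if is_and else '|') + nest_exp(modifier, term, is_and, i - 1) + ')'

print(nest_exp('mod', 'a', True, 3))   # (a3 (a2 (@mod:a1)))
print(nest_exp('mod', 'a', False, 3))  # (a3|(a2|(@mod:a1)))
```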
def testUnsupportedNesting(env):
    nest_level = 200
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 'mod', 'TEXT').ok()
    and_exp = nest_exp('mod', 'a', True, nest_level)
    or_exp = nest_exp('mod', 'a', False, nest_level)
    # env.debugPrint(and_exp, force=True)
    # env.debugPrint(or_exp, force=True)
    env.expect('ft.search', 'idx', and_exp, 'DIALECT', 1).error().contains('Syntax error at offset')
    env.expect('ft.search', 'idx', and_exp, 'DIALECT', 2).error().contains('Parser stack overflow.')
    env.expect('ft.search', 'idx', or_exp, 'DIALECT', 1).error().contains('Syntax error at offset')
    env.expect('ft.search', 'idx', or_exp, 'DIALECT', 2).error().contains('Parser stack overflow.')
def testSupportedNesting_v1():
    env = Env(moduleArgs = 'DEFAULT_DIALECT 1')
    nest_level = 30
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 'mod', 'TEXT').ok()
    and_exp = nest_exp('mod', 'a', True, nest_level)
    or_exp = nest_exp('mod', 'a', False, nest_level)
    # env.debugPrint(and_exp, force=True)
    # env.debugPrint(or_exp, force=True)
    env.expect('ft.search', 'idx', and_exp).equal([0])
    env.expect('ft.search', 'idx', or_exp).equal([0])
def testSupportedNesting_v2():
    env = Env(moduleArgs = 'DEFAULT_DIALECT 2')
    nest_level = 84
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 'mod', 'TEXT').ok()
    and_exp = nest_exp('mod', 'a', True, nest_level)
    or_exp = nest_exp('mod', 'a', False, nest_level)
    # env.debugPrint(and_exp, force=True)
    # env.debugPrint(or_exp, force=True)
    env.expect('ft.search', 'idx', and_exp).equal([0])
    env.expect('ft.search', 'idx', or_exp).equal([0])
def testModifierList(env):
    env.expect('FT.CREATE', 'idx', 'SCHEMA', 't1', 'TEXT', 't2', 'TEXT').ok()
    env.expect('FT.EXPLAIN', 'idx', '@t1|t2:(text value)').equal(r'''
@t1|t2:INTERSECT {
  @t1|t2:UNION {
    @t1|t2:text
    @t1|t2:+text(expanded)
  }
  @t1|t2:UNION {
    @t1|t2:value
    @t1|t2:+valu(expanded)
    @t1|t2:valu(expanded)
  }
}
'''[1:])
import collections
from typing import List
class Solution:
    def isPossibleDivide(self, nums: List[int], k: int) -> bool:
        nums.sort()
        # Ordered counts: keys are inserted in sorted order.
        cnt = collections.OrderedDict()
        for v in nums:
            cnt[v] = 1 + (cnt[v] if v in cnt else 0)
        while cnt:
            # The smallest remaining value must start a run of k consecutive numbers.
            v = next(iter(cnt))
            for i in range(k):
                if v + i not in cnt:
                    return False
                cnt[v + i] -= 1
                if cnt[v + i] == 0:
                    cnt.pop(v + i)
        return True
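The same greedy idea (every smallest remaining value must begin a consecutive run of length k) can be sketched in a compact, self-contained form with `collections.Counter`; `can_divide` is an illustrative name, not part of the original solution, and it consumes all copies of the smallest value at once instead of one run per loop iteration.

```python
import collections

def can_divide(nums, k):
    # Count occurrences, then walk values in ascending order.
    cnt = collections.Counter(nums)
    for v in sorted(cnt):
        need = cnt[v]  # every remaining copy of v must start its own run
        if need == 0:
            continue
        for i in range(k):
            # Counter returns 0 for missing keys, so this also catches gaps.
            if cnt[v + i] < need:
                return False
            cnt[v + i] -= need
    return True

print(can_divide([1, 2, 3, 3, 4, 4, 5, 6], 4))  # True
print(can_divide([1, 2, 3, 4], 3))              # False
```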
inputs = '''
[1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2
,2,2,2,2,2,2,2,2  (… five further near-identical lines of ",2" repeated ~50,000 times each elided — a large test-input array consisting entirely of 2s …)
]
2
'''
import json
args = tuple(json.loads(line) for line in inputs.splitlines() if line)
print(Solution().isPossibleDivide(*args))
# -*- coding: utf-8 -*-
# File: turbosnake/test/snapshots/snap_test_functional_component_slots.py
# Repo: AlexeyBond/turbosnake (MIT license)
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import GenericRepr, Snapshot
snapshots = Snapshot()
snapshots['SlotsRenderTest::test_render_slots_with_custom_prop_names 1'] = {
'__class__': 'FunctionalComponent<tc>',
'__component__': True,
'children': [
{
'__class__': GenericRepr("<class 'turbosnake._components.Fragment'>"),
'__component__': True,
'children': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'а'
}
}
],
'key': 'slotA',
'props': {
'children': {
'__class__': GenericRepr("<class 'turbosnake._components.ComponentsCollection'>"),
'items': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'а'
}
}
]
}
}
},
{
'__class__': GenericRepr("<class 'turbosnake._components.Fragment'>"),
'__component__': True,
'children': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'ь'
}
}
],
'key': 'slotB',
'props': {
'children': {
'__class__': GenericRepr("<class 'turbosnake._components.ComponentsCollection'>"),
'items': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'ь'
}
}
]
}
}
}
],
'key': None,
'props': {
'another_slot': {
'__class__': GenericRepr("<class 'turbosnake._components.ComponentsCollection'>"),
'items': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'ь'
}
}
]
},
'the_slot_named_a': {
'__class__': GenericRepr("<class 'turbosnake._components.ComponentsCollection'>"),
'items': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'а'
}
}
]
}
}
}
snapshots['SlotsRenderTest::test_render_slotted_component 1'] = {
'__class__': 'FunctionalComponent<tc>',
'__component__': True,
'children': [
{
'__class__': GenericRepr("<class 'turbosnake._components.Fragment'>"),
'__component__': True,
'children': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'stub-1-1'
}
},
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
2
),
'props': {
'label': 'stub-1-2'
}
}
],
'key': 'slot1',
'props': {
'children': {
'__class__': GenericRepr("<class 'turbosnake._components.ComponentsCollection'>"),
'items': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'stub-1-1'
}
},
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
2
),
'props': {
'label': 'stub-1-2'
}
}
]
}
}
},
{
'__class__': GenericRepr("<class 'turbosnake._components.Fragment'>"),
'__component__': True,
'children': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'stub-2-1'
}
},
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
2
),
'props': {
'label': 'stub-2-2'
}
}
],
'key': 'slot2',
'props': {
'children': {
'__class__': GenericRepr("<class 'turbosnake._components.ComponentsCollection'>"),
'items': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'stub-2-1'
}
},
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
2
),
'props': {
'label': 'stub-2-2'
}
}
]
}
}
}
],
'key': None,
'props': {
'slot_1': {
'__class__': GenericRepr("<class 'turbosnake._components.ComponentsCollection'>"),
'items': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'stub-1-1'
}
},
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
2
),
'props': {
'label': 'stub-1-2'
}
}
]
},
'slot_2': {
'__class__': GenericRepr("<class 'turbosnake._components.ComponentsCollection'>"),
'items': [
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
1
),
'props': {
'label': 'stub-2-1'
}
},
{
'__class__': 'FunctionalComponent<stub>',
'__component__': True,
'children': [
],
'key': (
'FunctionalComponent<stub>',
2
),
'props': {
'label': 'stub-2-2'
}
}
]
}
}
}
# File: requests/Cookie.py
# Repo: pengchenyu111/SpiderLearning (Apache-2.0 license)
import requests
r1 = requests.get('http://www.baidu.com')
print(r1.cookies)
for key, value in r1.cookies.items():
print(key, '=', value)
# Fetch the Jianshu homepage content
# First way to set cookies: pass them directly through the request headers
headers = {
'Host': 'www.jianshu.com',
'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36',
'Cookie': 'Hm_lvt_0c0e9d9b1e7d617b3e6842e85b9fb068=1550805089,1550815557,155116976; locale=zh-CN; read_mode=day; default_font=font2; remember_user_token=W1sxMzYxNDI1OF0sIiQyYSQxMSRWWDhUU0JKOU5oZDZtYjhoblMwclYuIiwiMTU1MTM0Nzk5MS4wMzU3MjI3Il0%3D--04787a1b6cfda5ed5974bf50e178b899e99eb4ec; __yadk_uid=xeqG3EJDiKBRfxVO3j2WeLKEUSNMutrB; sensorsdata2015jssdkcross=%7B%22distinct_id%22%3A%2213614258%22%2C%22%24device_id%22%3A%22169338b879465f-05e36cc0827c6-36657105-3686400-169338b8795759%22%2C%22props%22%3A%7B%22%24latest_traffic_source_type%22%3A%22%E7%9B%B4%E6%8E%A5%E6%B5%81%E9%87%8F%22%2C%22%24latest_referrer%22%3A%22%22%2C%22%24latest_referrer_host%22%3A%22%22%2C%22%24latest_search_keyword%22%3A%22%E6%9C%AA%E5%8F%96%E5%88%B0%E5%80%BC_%E7%9B%B4%E6%8E%A5%E6%89%93%E5%BC%80%22%7D%2C%22first_id%22%3A%221699465f-05e36cc0827c6-36657105-3686400-169338b8795759%22%7D; _m7e_session_core=f79a4a7802e2ffb7035adf0c44294875; Hm_lpvt_0c0e9d9b1e7d617b3e6842e85b9fb068=1551429715'
}
r2 = requests.get('https://www.jianshu.com', headers=headers)
print(r2.text)
# Another way to set cookies: RequestsCookieJar
headers = {
'Host': 'www.jianshu.com',
'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36',
}
cookies = 'Hm_lvt_0c0e9d9b1e7d617b3e6842e85b9fb068=1550805089,1550815557,1551167360,1551347976; locale=zh-CN; read_mode=day; default_font=font2; remember_user_token=W1sxMzYxNDI1OF0sIiQyYSQxMSRWWDhUU0JKOU5oZDZtYjhoblMwclYuIiwiMTU1MTM0Nzk5MS4wMzU3MjI3Il0%3D--04787a1b6cfda5ed5974bf50e178b899e99eb4ec; __yadk_uid=xeqG3EJDiKBRfxVO3j2WeLKEUSNMutrB; sensorsdata2015jssdkcross=%7B%22distinct_id%22%3A%2213614258%22%2C%22%24device_id%22%3A%22169338b879465f-05e36cc0827c6-36657105-3686400-169338b8795759%22%2C%22props%22%3A%7B%22%24latest_traffic_source_type%22%3A%22%E7%9B%B4%E6%8E%A5%E6%B5%81%E9%87%8F%22%2C%22%24latest_referrer%22%3A%22%22%2C%22%24latest_referrer_host%22%3A%22%22%2C%22%24latest_search_keyword%22%3A%22%E6%9C%AA%E5%8F%96%E5%88%B0%E5%80%BC_%E7%9B%B4%E6%8E%A5%E6%89%93%E5%BC%80%22%7D%2C%22first_id%22%3A%22169338b879465f-05e36cc0827c6-36657105-3686400-169338b8795759%22%7D; _m7e_session_core=f79a4a7802e2ffb7035adf0c44294875; Hm_lpvt_0c0e9d9b1e7d617b3e6842e85b9fb068=1551429715'
jar = requests.cookies.RequestsCookieJar()
for cookie in cookies.split(';'):
    # Split each cookie into key and value; maxsplit=1, since values may contain '='
key, value = cookie.split('=', 1)
jar.set(key, value)
r3 = requests.get('http://www.jianshu.com', cookies=jar, headers=headers)
print(r3.text)
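The manual `split(';')` / `split('=', 1)` loop above can also be done with the standard library's `http.cookies.SimpleCookie`, which parses a raw `Cookie` header value into name/value pairs. A minimal sketch (the header value here is a shortened, made-up sample, not the real session cookie):

```python
from http.cookies import SimpleCookie

# Shortened, made-up Cookie header value, for illustration only
raw = 'locale=zh-CN; read_mode=day; default_font=font2'
parsed = SimpleCookie()
parsed.load(raw)
# Each entry is a Morsel; .value holds the unquoted cookie value
cookie_dict = {name: morsel.value for name, morsel in parsed.items()}
print(cookie_dict)
# {'locale': 'zh-CN', 'read_mode': 'day', 'default_font': 'font2'}
```

The resulting dict can then be passed to `requests.get(..., cookies=cookie_dict)` instead of building a `RequestsCookieJar` by hand.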
# File: tests/test_dcos_e2e/test_legacy.py
# Repo: cprovencher/dcos-e2e (Apache-2.0 license)
"""
Tests for support of legacy versions of DC/OS.
We do not test the whole matrix of support, such as each version with each
Docker version or base operating system, for cost reasons.
"""
import uuid
from pathlib import Path
from kazoo.client import KazooClient
from passlib.hash import sha512_crypt
from dcos_e2e.backends import ClusterBackend
from dcos_e2e.cluster import Cluster
from dcos_e2e.node import Output
class Test19:
"""
Tests for running DC/OS 1.9.
"""
def test_oss(
self,
cluster_backend: ClusterBackend,
oss_1_9_installer: Path,
) -> None:
"""
An open source DC/OS 1.9 cluster can be started.
"""
with Cluster(cluster_backend=cluster_backend) as cluster:
cluster.install_dcos_from_path(
dcos_installer=oss_1_9_installer,
dcos_config=cluster.base_config,
output=Output.CAPTURE,
ip_detect_path=cluster_backend.ip_detect_path,
)
cluster.wait_for_dcos_oss()
# We check that the user created with the special credentials does
# not exist after ``wait_for_dcos_oss``.
email = 'albert@bekstil.net'
path = '/dcos/users/{email}'.format(email=email)
(master, ) = cluster.masters
zk_client_port = '2181'
zk_host = str(master.public_ip_address)
zk_client = KazooClient(hosts=zk_host + ':' + zk_client_port)
zk_client.start()
zk_user_exists = zk_client.exists(path=path)
zk_client.stop()
assert not zk_user_exists
def test_enterprise(
self,
cluster_backend: ClusterBackend,
enterprise_1_9_installer: Path,
) -> None:
"""
A DC/OS Enterprise 1.9 cluster can be started.
"""
superuser_username = str(uuid.uuid4())
superuser_password = str(uuid.uuid4())
config = {
'superuser_username': superuser_username,
'superuser_password_hash': sha512_crypt.hash(superuser_password),
}
with Cluster(cluster_backend=cluster_backend) as cluster:
cluster.install_dcos_from_path(
dcos_installer=enterprise_1_9_installer,
dcos_config={
**cluster.base_config,
**config,
},
output=Output.CAPTURE,
ip_detect_path=cluster_backend.ip_detect_path,
)
cluster.wait_for_dcos_ee(
superuser_username=superuser_username,
superuser_password=superuser_password,
)
class Test110:
"""
Tests for running DC/OS 1.10.
"""
def test_oss(
self,
cluster_backend: ClusterBackend,
oss_1_10_installer: Path,
) -> None:
"""
An open source DC/OS 1.10 cluster can be started.
"""
with Cluster(cluster_backend=cluster_backend) as cluster:
cluster.install_dcos_from_path(
dcos_installer=oss_1_10_installer,
dcos_config=cluster.base_config,
output=Output.CAPTURE,
ip_detect_path=cluster_backend.ip_detect_path,
)
cluster.wait_for_dcos_oss()
def test_enterprise(
self,
cluster_backend: ClusterBackend,
enterprise_1_10_installer: Path,
license_key_contents: str,
) -> None:
"""
A DC/OS Enterprise 1.10 cluster can be started.
"""
superuser_username = str(uuid.uuid4())
superuser_password = str(uuid.uuid4())
config = {
'superuser_username': superuser_username,
'superuser_password_hash': sha512_crypt.hash(superuser_password),
'fault_domain_enabled': False,
'license_key_contents': license_key_contents,
}
with Cluster(cluster_backend=cluster_backend) as cluster:
cluster.install_dcos_from_path(
dcos_installer=enterprise_1_10_installer,
dcos_config={
**cluster.base_config,
**config,
},
output=Output.CAPTURE,
ip_detect_path=cluster_backend.ip_detect_path,
)
cluster.wait_for_dcos_ee(
superuser_username=superuser_username,
superuser_password=superuser_password,
)
class Test111:
"""
Tests for running DC/OS 1.11.
"""
def test_oss(
self,
cluster_backend: ClusterBackend,
oss_1_11_installer: Path,
) -> None:
"""
An open source DC/OS 1.11 cluster can be started.
"""
with Cluster(cluster_backend=cluster_backend) as cluster:
cluster.install_dcos_from_path(
dcos_installer=oss_1_11_installer,
dcos_config=cluster.base_config,
output=Output.CAPTURE,
ip_detect_path=cluster_backend.ip_detect_path,
)
cluster.wait_for_dcos_oss()
def test_enterprise(
self,
cluster_backend: ClusterBackend,
enterprise_1_11_installer: Path,
license_key_contents: str,
) -> None:
"""
A DC/OS Enterprise 1.11 cluster can be started.
"""
superuser_username = str(uuid.uuid4())
superuser_password = str(uuid.uuid4())
config = {
'superuser_username': superuser_username,
'superuser_password_hash': sha512_crypt.hash(superuser_password),
'fault_domain_enabled': False,
'license_key_contents': license_key_contents,
}
with Cluster(cluster_backend=cluster_backend) as cluster:
cluster.install_dcos_from_path(
dcos_installer=enterprise_1_11_installer,
dcos_config={
**cluster.base_config,
**config,
},
output=Output.CAPTURE,
ip_detect_path=cluster_backend.ip_detect_path,
)
cluster.wait_for_dcos_ee(
superuser_username=superuser_username,
superuser_password=superuser_password,
)
class Test112:
"""
Tests for running DC/OS 1.12.
"""
def test_oss(
self,
cluster_backend: ClusterBackend,
oss_1_12_installer: Path,
) -> None:
"""
An open source DC/OS 1.12 cluster can be started.
"""
with Cluster(cluster_backend=cluster_backend) as cluster:
cluster.install_dcos_from_path(
dcos_installer=oss_1_12_installer,
dcos_config=cluster.base_config,
output=Output.CAPTURE,
ip_detect_path=cluster_backend.ip_detect_path,
)
cluster.wait_for_dcos_oss()
def test_enterprise(
self,
cluster_backend: ClusterBackend,
enterprise_1_12_installer: Path,
license_key_contents: str,
) -> None:
"""
A DC/OS Enterprise 1.12 cluster can be started.
"""
superuser_username = str(uuid.uuid4())
superuser_password = str(uuid.uuid4())
config = {
'superuser_username': superuser_username,
'superuser_password_hash': sha512_crypt.hash(superuser_password),
'fault_domain_enabled': False,
'license_key_contents': license_key_contents,
}
with Cluster(cluster_backend=cluster_backend) as cluster:
cluster.install_dcos_from_path(
dcos_installer=enterprise_1_12_installer,
dcos_config={
**cluster.base_config,
**config,
},
output=Output.CAPTURE,
ip_detect_path=cluster_backend.ip_detect_path,
)
cluster.wait_for_dcos_ee(
superuser_username=superuser_username,
superuser_password=superuser_password,
)
# File: h4rm0ny/envs/__init__.py
# Repo: L1NNA/malware_rl (MIT license)
from h4rm0ny.envs.malconv_gym import MalConvEnv
from h4rm0ny.envs import utils
# File: viabel/__init__.py
# Repo: Manushi22/viabel (MIT license)
from viabel.approximations import *
from viabel.convenience import *
from viabel.diagnostics import *
from viabel.models import *
from viabel.objectives import *
from viabel.optimization import *
# File: src/floor.py
# Repo: rgoliveira/PyTRON (Unlicense)
from OpenGL.GL import *
from objloader import *
from filenames import *
class Floor:
def __init__(self, size, tileSize, y = 0):
self.size = size
self.tileSize = tileSize
self.width = self.depth = size * tileSize
        self.y = y  # was `self.y = 0`, which silently ignored the y parameter
self.wallHeight = 15
self.texture = load2DTexture(Filenames.textures.floor_tile)
self.wallTexture = load2DTexture(Filenames.textures.wall_tile)
self.skyTexture = load2DTexture(Filenames.textures.sky)
def draw(self):
glPushMatrix()
## floor
glEnable(GL_TEXTURE_2D)
glBindTexture(GL_TEXTURE_2D, self.texture)
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT)
glBegin(GL_QUADS)
glTexCoord2d(0.0, 0.0)
glNormal3f(0., 1., 0.)
glVertex3f(0, self.y, 0)
glTexCoord2f(1.0*self.size, 0.0)
glNormal3f(0.0,1.0,0.0)
glVertex3f(self.size*self.tileSize, self.y, 0)
glTexCoord2f(1.0*self.size, 1.0*self.size)
glNormal3f(0.0,1.0,0.0)
glVertex3f(self.size*self.tileSize, self.y, self.size*self.tileSize)
glTexCoord2f(0.0, 1.0*self.size)
glNormal3f(0.0,1.0,0.0)
glVertex3f(0, self.y, self.size*self.tileSize)
glEnd()
glDisable(GL_TEXTURE_2D)
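The floor quad above (and the wall quads below) all derive their corners from `size * tileSize`. A small pure-Python helper, shown here only as a sketch independent of the `Floor` class, makes that corner arithmetic easy to check without an OpenGL context:

```python
def floor_corners(size, tile_size, y=0.0):
    """Corners of a square floor of size*tile_size units per side,
    in the same winding order as the glVertex3f calls above."""
    extent = size * tile_size
    return [
        (0.0, y, 0.0),       # near-left
        (extent, y, 0.0),    # near-right
        (extent, y, extent), # far-right
        (0.0, y, extent),    # far-left
    ]

print(floor_corners(2, 10.0))
# [(0.0, 0.0, 0.0), (20.0, 0.0, 0.0), (20.0, 0.0, 20.0), (0.0, 0.0, 20.0)]
```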
## sky
"""
glEnable(GL_TEXTURE_2D)
glBindTexture(GL_TEXTURE_2D, self.skyTexture)
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT)
"""
glColor3f(1., 0., 0.)
glBegin(GL_QUADS)
glTexCoord2d(0.0, 0.0)
glNormal3f(0., -1., 0.)
glVertex3f(0, self.wallHeight, 0)
glTexCoord2f(1.0*self.size, 0.0)
glNormal3f(0.0,-1.0,0.0)
glVertex3f(self.size*self.tileSize, self.wallHeight, 0)
glTexCoord2f(1.0*self.size, 1.0*self.size)
glNormal3f(0.0,-1.0,0.0)
glVertex3f(self.size*self.tileSize, self.wallHeight, self.size*self.tileSize)
glTexCoord2f(0.0, 1.0*self.size)
glNormal3f(0.0,-1.0,0.0)
glVertex3f(0, self.wallHeight, self.size*self.tileSize)
glEnd()
#glDisable(GL_TEXTURE_2D)
### walls
glEnable(GL_TEXTURE_2D)
glBindTexture(GL_TEXTURE_2D, self.wallTexture)
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT)
## west wall
glBegin(GL_QUADS)
glTexCoord2f(0., 0.)
glNormal3f(0.5, 0., 0.5)
glVertex3f(0, self.y, 0)
glTexCoord2f(0., 1)
glNormal3f(0.5, 0., 0.5)
glVertex3f(0, -self.wallHeight, 0)
glTexCoord2f(1, 1)
glNormal3f(0.5, 0., -0.5)
glVertex3f(0, -self.wallHeight, self.size*self.tileSize)
glTexCoord2f(1, 0)
glNormal3f(0.5, 0., -0.5)
glVertex3f(0, self.y, self.size*self.tileSize)
glEnd()
## south wall
glBegin(GL_QUADS)
glTexCoord2f(1, 0.)
glNormal3f(0.5, 0., 0.5)
glVertex3f(0, self.y, 0)
glTexCoord2f(0. ,0.)
glNormal3f(-0.5, 0., 0.5)
glVertex3f(self.size*self.tileSize, self.y, 0)
glTexCoord2f(0., 1)
glNormal3f(-0.5, 0., 0.5)
glVertex3f(self.size*self.tileSize, -self.wallHeight, 0)
glTexCoord2f(1, 1)
glNormal3f(0.5, 0., 0.5)
glVertex3f(0, -self.wallHeight, 0)
glEnd()
## east wall
glBegin(GL_QUADS)
glTexCoord2f(1, 0)
glNormal3f(-0.5, 0., 0.5)
glVertex3f(self.size*self.tileSize, self.y, 0)
glTexCoord2f(0., 0.)
glNormal3f(-0.5, 0., -0.5)
glVertex3f(self.size*self.tileSize, self.y, self.size*self.tileSize)
glTexCoord2f(0., 1)
glNormal3f(-0.5, 0., -0.5)
glVertex3f(self.size*self.tileSize, -self.wallHeight, self.size*self.tileSize)
glTexCoord2f(1, 1)
glNormal3f(-0.5, 0., 0.5)
glVertex3f(self.size*self.tileSize, -self.wallHeight, 0)
glEnd()
## north wall
glBegin(GL_QUADS)
glTexCoord2f(0., 0.)
glNormal3f(0.5, 0., -0.5)
glVertex3f(0, self.y, self.size*self.tileSize)
glTexCoord2f(1, 0)
glNormal3f(-0.5, 0., -0.5)
glVertex3f(self.size*self.tileSize, self.y, self.size*self.tileSize)
glTexCoord2f(1, 1)
glNormal3f(-0.5, 0., -0.5)
glVertex3f(self.size*self.tileSize, -self.wallHeight, self.size*self.tileSize)
glTexCoord2f(0., 1)
glNormal3f(0.5, 0., -0.5)
glVertex3f(0, -self.wallHeight, self.size*self.tileSize)
glEnd()
glDisable(GL_TEXTURE_2D)
glPopMatrix()
| 34.289157 | 86 | 0.591181 | 734 | 5,692 | 4.420981 | 0.084469 | 0.030817 | 0.123267 | 0.14792 | 0.864099 | 0.856394 | 0.856394 | 0.856394 | 0.852696 | 0.795378 | 0 | 0.077663 | 0.287421 | 5,692 | 165 | 87 | 34.49697 | 0.722387 | 0.014231 | 0 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017094 | false | 0 | 0.025641 | 0 | 0.051282 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
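The floor quad in `Floor.draw()` scales its texture coordinates by `size` so that, with `GL_REPEAT` wrapping, the tile texture repeats once per grid cell. A minimal sketch of that corner-coordinate computation, using a hypothetical helper (`floor_texcoords` is not part of the class) so it can run without an OpenGL context:

```python
# Hypothetical helper: the four (s, t) texture coordinates for a floor of
# `size` x `size` tiles, matching the glTexCoord2f calls in Floor.draw().
# With GL_REPEAT, a coordinate of `size` tiles the texture `size` times.
def floor_texcoords(size):
    return [(0.0, 0.0),
            (1.0 * size, 0.0),
            (1.0 * size, 1.0 * size),
            (0.0, 1.0 * size)]

corners = floor_texcoords(4)
print(corners[2])  # (4.0, 4.0): the far corner repeats the tile 4x each way
```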
04d7dad69dcaea080f8527ce15a872e5f44f42a6 | 7,064 | py | Python | tests/recipes/vasp/test_vasp_recipes.py | siddhant-deepsource/quacc | 60bcb32f65e9cee0bd44aa6cfc0df142a76387cf | [
"BSD-3-Clause-LBNL"
] | null | null | null | tests/recipes/vasp/test_vasp_recipes.py | siddhant-deepsource/quacc | 60bcb32f65e9cee0bd44aa6cfc0df142a76387cf | [
"BSD-3-Clause-LBNL"
] | null | null | null | tests/recipes/vasp/test_vasp_recipes.py | siddhant-deepsource/quacc | 60bcb32f65e9cee0bd44aa6cfc0df142a76387cf | [
"BSD-3-Clause-LBNL"
] | null | null | null | from ase.build import bulk, molecule
from jobflow.managers.local import run_locally
from quacc.recipes.vasp.core import RelaxMaker, StaticMaker
from quacc.recipes.vasp.slabs import (
BulkToSlabMaker,
SlabRelaxMaker,
SlabStaticMaker,
SlabToAdsSlabMaker,
)
def test_static_maker():
atoms = bulk("Cu") * (2, 2, 2)
job = StaticMaker().make(atoms)
responses = run_locally(job, ensure_success=True)
output = responses[job.uuid][1].output
assert output["nsites"] == len(atoms)
assert output["parameters"]["isym"] == 2
assert output["parameters"]["nsw"] == 0
assert output["parameters"]["lwave"] == True
assert output["name"] == "VASP-Static"
job = StaticMaker(
preset="BulkRelaxSet", name="test", swaps={"ncore": 2, "kpar": 4}
).make(atoms)
responses = run_locally(job, ensure_success=True)
output = responses[job.uuid][1].output
assert output["parameters"]["encut"] == 650
assert output["parameters"]["ncore"] == 2
assert output["parameters"]["kpar"] == 4
assert output["name"] == "test"
def test_relax_maker():
atoms = bulk("Cu") * (2, 2, 2)
job = RelaxMaker().make(atoms)
responses = run_locally(job, ensure_success=True)
output = responses[job.uuid][1].output
assert output["nsites"] == len(atoms)
assert output["parameters"]["isym"] == 0
assert output["parameters"]["nsw"] > 0
assert output["parameters"]["isif"] == 3
assert output["parameters"]["lwave"] == False
assert output["name"] == "VASP-Relax"
job = RelaxMaker(preset="BulkRelaxSet", name="test", swaps={"nelmin": 6}).make(
atoms
)
responses = run_locally(job, ensure_success=True)
output = responses[job.uuid][1].output
assert output["parameters"]["encut"] == 650
assert output["parameters"]["nelmin"] == 6
assert output["name"] == "test"
job = RelaxMaker(volume_relax=False).make(atoms)
responses = run_locally(job, ensure_success=True)
output = responses[job.uuid][1].output
assert output["parameters"]["isif"] == 2
def test_slab_static_maker():
atoms = bulk("Cu") * (2, 2, 2)
job = SlabStaticMaker().make(atoms)
responses = run_locally(job, ensure_success=True)
output = responses[job.uuid][1].output
assert output["nsites"] == len(atoms)
assert output["parameters"]["idipol"] == 3
assert output["parameters"]["nsw"] == 0
assert output["parameters"]["lvhar"] == True
assert output["name"] == "VASP-SlabStatic"
job = SlabStaticMaker(preset="SlabRelaxSet", name="test", swaps={"nelmin": 6}).make(
atoms
)
responses = run_locally(job, ensure_success=True)
output = responses[job.uuid][1].output
assert output["parameters"]["encut"] == 450
assert output["parameters"]["nelmin"] == 6
assert output["name"] == "test"
def test_slab_relax_maker():
atoms = bulk("Cu") * (2, 2, 2)
job = SlabRelaxMaker().make(atoms)
responses = run_locally(job, ensure_success=True)
output = responses[job.uuid][1].output
assert output["nsites"] == len(atoms)
assert output["parameters"]["isif"] == 2
assert output["parameters"]["nsw"] > 0
assert output["parameters"]["isym"] == 0
assert output["parameters"]["lwave"] == False
assert output["name"] == "VASP-SlabRelax"
job = SlabRelaxMaker(preset="SlabRelaxSet", name="test", swaps={"nelmin": 6}).make(
atoms
)
responses = run_locally(job, ensure_success=True)
output = responses[job.uuid][1].output
assert output["parameters"]["encut"] == 450
assert output["parameters"]["nelmin"] == 6
assert output["name"] == "test"
def test_slab_flows():
atoms = bulk("Cu") * (2, 2, 2)
### --------- Test BulkToSlabMaker --------- ###
flow = BulkToSlabMaker().make(atoms)
responses = run_locally(flow, ensure_success=True)
assert len(responses) == 9
uuids = list(responses.keys())
# First job is a dummy job to make slabs and should have no output
output0 = responses[uuids[0]][1].output
assert output0 is None
output1 = responses[uuids[1]][1].output
assert output1["nsites"] > len(atoms)
assert output1["parameters"]["isif"] == 2
assert output1["name"] == "VASP-SlabRelax"
output2 = responses[uuids[2]][1].output
assert output2["nsites"] == output1["nsites"]
assert output2["parameters"]["nsw"] == 0
assert output2["name"] == "VASP-SlabStatic"
# Now try with kwargs
flow = BulkToSlabMaker(
preset="SlabRelaxSet",
name="test",
slab_relax_maker=SlabRelaxMaker(swaps={"nelmin": 6}),
slab_static_maker=SlabStaticMaker(swaps={"nelmin": 6}),
).make(atoms)
responses = run_locally(flow, ensure_success=True)
assert len(responses) == 9
uuids = list(responses.keys())
output0 = responses[uuids[0]][1].output
assert output0 is None
output1 = responses[uuids[1]][1].output
assert output1["parameters"]["isif"] == 2
assert output1["parameters"]["nelmin"] == 6
assert output1["parameters"]["encut"] == 450
assert output1["name"] == "VASP-SlabRelax"
output2 = responses[uuids[2]][1].output
assert output2["parameters"]["nsw"] == 0
assert output2["parameters"]["nelmin"] == 6
assert output2["parameters"]["encut"] == 450
assert output2["name"] == "VASP-SlabStatic"
### --------- Test SlabToAdsSlabMaker --------- ###
atoms = output2["atoms"]
adsorbate = molecule("H2")
flow = SlabToAdsSlabMaker().make(atoms, adsorbate)
responses = run_locally(flow, ensure_success=True)
assert len(responses) == 11
uuids = list(responses.keys())
# First job is a dummy job to make slabs and should have no output
output0 = responses[uuids[0]][1].output
assert output0 is None
# Subsequent jobs should be alternating relaxations and statics
output1 = responses[uuids[1]][1].output
assert output1["nsites"] == len(output2["atoms"]) + 2
assert output1["parameters"]["isif"] == 2
assert output1["name"] == "VASP-SlabRelax"
output2 = responses[uuids[2]][1].output
assert output2["nsites"] == output1["nsites"]
assert output2["parameters"]["nsw"] == 0
assert output2["name"] == "VASP-SlabStatic"
# Now try with kwargs
flow = SlabToAdsSlabMaker(
preset="SlabRelaxSet", name="test", swaps={"nelmin": 6}
).make(atoms, adsorbate)
responses = run_locally(flow, ensure_success=True)
assert len(responses) == 11
uuids = list(responses.keys())
output0 = responses[uuids[0]][1].output
assert output0 is None
output1 = responses[uuids[1]][1].output
assert output1["parameters"]["isif"] == 2
assert output1["parameters"]["nelmin"] == 6
assert output1["parameters"]["encut"] == 450
assert output1["name"] == "VASP-SlabRelax"
output2 = responses[uuids[2]][1].output
assert output2["parameters"]["nsw"] == 0
assert output2["parameters"]["nelmin"] == 6
assert output2["parameters"]["encut"] == 450
assert output2["name"] == "VASP-SlabStatic"
| 33.638095 | 88 | 0.641563 | 832 | 7,064 | 5.394231 | 0.121394 | 0.096257 | 0.117647 | 0.051471 | 0.824421 | 0.799688 | 0.790553 | 0.790553 | 0.765152 | 0.68115 | 0 | 0.028601 | 0.188279 | 7,064 | 209 | 89 | 33.799043 | 0.754098 | 0.045017 | 0 | 0.672956 | 0 | 0 | 0.156064 | 0 | 0 | 0 | 0 | 0 | 0.45283 | 1 | 0.031447 | false | 0 | 0.025157 | 0 | 0.056604 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
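The maker tests above repeatedly check a preset-plus-swaps pattern: preset defaults (e.g. `BulkRelaxSet`'s `encut=650`) survive unless a key appears in `swaps`. This is only an illustrative sketch of that dict-merge behaviour, not quacc's actual implementation:

```python
# Illustrative only: swaps entries override preset defaults, everything
# else in the preset is kept unchanged.
def apply_swaps(preset_params, swaps):
    merged = dict(preset_params)  # copy so the preset is not mutated
    merged.update(swaps)          # swaps win over preset defaults
    return merged

params = apply_swaps({"encut": 650, "nelmin": 4}, {"nelmin": 6})
assert params == {"encut": 650, "nelmin": 6}
```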
04f4ea6c7708a0cfcf6c78b6b3a6eadc10aa91ba | 5,491 | py | Python | asnets/experiments/det_blocksworld_60probs.py | xf1590281/ASNets | 5f4b29fb62a5e72004b813228442d06246c9ec33 | [
"MIT"
] | 21 | 2017-12-05T13:27:36.000Z | 2021-11-16T20:32:33.000Z | asnets/experiments/det_blocksworld_60probs.py | xf1590281/ASNets | 5f4b29fb62a5e72004b813228442d06246c9ec33 | [
"MIT"
] | 2 | 2018-07-16T12:15:46.000Z | 2020-10-31T00:02:49.000Z | asnets/experiments/det_blocksworld_60probs.py | xf1590281/ASNets | 5f4b29fb62a5e72004b813228442d06246c9ec33 | [
"MIT"
] | 7 | 2018-03-19T13:45:13.000Z | 2022-03-24T07:52:20.000Z | """Smaller version of det_blocksworld_uber, with 1/5th the problems (and all
small eval problems removed, so it's just 35/50 block problems)."""
PDDL_DIR = '../problems/mine/det-bw-challenge/pddl/'
COMMON_PDDLS = ['domain.pddl']
TRAIN_PDDLS = [
'train/prob-blocks-blocks-nblk8-ntow1-seed270765476-seq0.pddl',
'train/prob-blocks-blocks-nblk8-ntow1-seed270765476-seq1.pddl',
'train/prob-blocks-blocks-nblk8-ntow1-seed270765476-seq2.pddl',
'train/prob-blocks-blocks-nblk8-ntow1-seed270765476-seq3.pddl',
'train/prob-blocks-blocks-nblk8-ntow1-seed270765476-seq4.pddl',
'train/prob-blocks-blocks-nblk8-seed236108287-seq0.pddl',
'train/prob-blocks-blocks-nblk8-seed236108287-seq1.pddl',
'train/prob-blocks-blocks-nblk8-seed236108287-seq2.pddl',
'train/prob-blocks-blocks-nblk8-seed236108287-seq3.pddl',
'train/prob-blocks-blocks-nblk8-seed236108287-seq4.pddl',
'train/prob-blocks-blocks-nblk8-seed236108287-seq5.pddl',
'train/prob-blocks-blocks-nblk8-seed236108287-seq6.pddl',
'train/prob-blocks-blocks-nblk8-seed236108287-seq7.pddl',
'train/prob-blocks-blocks-nblk8-seed236108287-seq8.pddl',
'train/prob-blocks-blocks-nblk8-seed236108287-seq9.pddl',
'train/prob-blocks-blocks-nblk9-seed129483654-seq0.pddl',
'train/prob-blocks-blocks-nblk9-seed129483654-seq1.pddl',
'train/prob-blocks-blocks-nblk9-seed129483654-seq2.pddl',
'train/prob-blocks-blocks-nblk9-seed129483654-seq3.pddl',
'train/prob-blocks-blocks-nblk9-seed129483654-seq4.pddl',
'train/prob-blocks-blocks-nblk10-seed614849806-seq0.pddl',
'train/prob-blocks-blocks-nblk10-seed614849806-seq1.pddl',
'train/prob-blocks-blocks-nblk10-seed614849806-seq2.pddl',
'train/prob-blocks-blocks-nblk10-seed614849806-seq3.pddl',
'train/prob-blocks-blocks-nblk10-seed614849806-seq4.pddl',
] # yapf: disable
TRAIN_NAMES = None
_TEST_RUNS = [
'prob-blocks-blocks-nblk35-seed2107726020-seq77.pddl',
'prob-blocks-blocks-nblk35-seed2107726020-seq3.pddl',
'prob-blocks-blocks-nblk35-seed2107726020-seq96.pddl',
'prob-blocks-blocks-nblk35-seed2107726020-seq12.pddl',
'prob-blocks-blocks-nblk35-seed2107726020-seq33.pddl',
'prob-blocks-blocks-nblk35-seed2107726020-seq75.pddl',
'prob-blocks-blocks-nblk35-seed2107726020-seq36.pddl',
'prob-blocks-blocks-nblk35-seed2107726020-seq71.pddl',
'prob-blocks-blocks-nblk35-seed2107726020-seq29.pddl',
'prob-blocks-blocks-nblk35-seed2107726020-seq93.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq0.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq10.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq20.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq21.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq22.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq23.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq24.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq25.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq26.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq27.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq28.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq29.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq2.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq30.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq31.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq32.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq33.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq34.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq35.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq36.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq37.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq38.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq39.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq3.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq40.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq41.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq42.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq43.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq44.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq45.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq46.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq47.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq48.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq49.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq4.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq50.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq51.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq52.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq53.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq54.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq55.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq56.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq57.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq58.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq59.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq5.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq60.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq61.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq62.pddl',
'prob-blocks-blocks-nblk50-seed1184714140-seq63.pddl',
] # yapf: disable
TEST_RUNS = [([fname], None) for fname in _TEST_RUNS]
| 56.608247 | 76 | 0.758696 | 682 | 5,491 | 6.092375 | 0.170088 | 0.204573 | 0.327316 | 0.283995 | 0.869073 | 0.851264 | 0.274609 | 0.068833 | 0 | 0 | 0 | 0.224461 | 0.087234 | 5,491 | 96 | 77 | 57.197917 | 0.604549 | 0.030413 | 0 | 0 | 0 | 0 | 0.844273 | 0.842204 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
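The final line of the module above pairs each evaluation problem with `None` (a placeholder for an optional run name). A small sketch of that transform, with stand-in filenames rather than the real problem set:

```python
# Mirrors the list comprehension that builds TEST_RUNS: each problem file
# becomes a ([filename], None) pair -- a one-element PDDL list plus a
# placeholder for an optional run name.
test_files = ["prob-a.pddl", "prob-b.pddl"]  # stand-ins for the real files
test_runs = [([fname], None) for fname in test_files]
assert test_runs[0] == (["prob-a.pddl"], None)
assert len(test_runs) == len(test_files)
```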
b6cc769696b84dd9e05b858a93f973aeb72aa7bf | 13,022 | py | Python | yolink_mqtt_client.py | Panda88CO/udi-yolink | 50eaa6f505bfd2f27ef1bc41c580c03ede143f14 | [
"MIT"
] | null | null | null | yolink_mqtt_client.py | Panda88CO/udi-yolink | 50eaa6f505bfd2f27ef1bc41c580c03ede143f14 | [
"MIT"
] | null | null | null | yolink_mqtt_client.py | Panda88CO/udi-yolink | 50eaa6f505bfd2f27ef1bc41c580c03ede143f14 | [
"MIT"
] | null | null | null | import hashlib
import json
import sys
import time
try:
import udi_interface
logging = udi_interface.LOGGER
Custom = udi_interface.Custom
except ImportError:
import logging
logging.basicConfig(level=logging.DEBUG)
import paho.mqtt.client as mqtt
#from logger import getLogger
#log = getLogger(__name__)
DEBUG = True
"""
Object representation for YoLink MQTT Client
"""
class YoLinkMQTTClient(object):
def __init__(self, csName, csid, csseckey, mqtt_url, mqtt_port, deviceId, callback ):
self.callback = callback
self.csid = csid
self.csseckey = csseckey
self.uniqueID = deviceId+str(int(time.time()))
self.uniqueID = str(csName+'_'+ self.uniqueID )
self.topicReq = csName+'/'+ self.uniqueID +'/request'
self.topicResp = csName+'/'+ self.uniqueID +'/response'
self.topicReport = csName+'/'+ self.uniqueID +'/report'
self.topicReportAll = csName+'/report'
self.mqtt_port = int(mqtt_port)
self.csid = csid
self.csseckey = csseckey
#self.topic = topic
self.mqtt_url = mqtt_url
#self.device_hash = device_hash
self.deviceId = deviceId
try:
print('initialize MQTT' )
self.client = mqtt.Client(self.uniqueID, clean_session=True, userdata=None, protocol=mqtt.MQTTv311, transport="tcp")
self.client.on_connect = self.on_connect
self.client.on_message = self.on_message
self.client.on_subscribe = self.on_subscribe
self.client.on_disconnect = self.on_disconnect
print('finish subscribing ')
except Exception as E:
logging.error('Exception - -init-: ' + str(E))
self.messagePending = False
logging.debug(self.deviceId)
#self.client.tls_set()
def connect_to_broker(self):
"""
Connect to MQTT broker
"""
try:
logging.info("Connecting to broker...")
self.client.username_pw_set(username=self.csid, password=hashlib.md5(self.csseckey.encode('utf-8')).hexdigest())
self.client.connect(self.mqtt_url, self.mqtt_port, 30)
#time.sleep(3)
logging.debug ('connect:')
self.client.loop_start()
#self.client.loop_forever()
#logging.debug('loop started')
time.sleep(1)
except Exception as E:
logging.error('Exception - connect_to_broker: ' + str(E))
def on_message(self, client, userdata, msg):
"""
Callback for broker published events
"""
logging.debug('on_message')
#logging.debug(client)
#logging.debug(userdata)
#logging.debug(msg)
#logging.debug(msg.topic, msg.payload)
payload = json.loads(msg.payload.decode("utf-8"))
logging.debug('on_message')
logging.debug(payload)
if msg.topic == self.topicReportAll or msg.topic == self.topicReport:
if payload['deviceId'] == self.deviceId:
#self.eventQueue.put(payload['msgid'])
#self.dataQueue.put(payload)
logging.debug (payload)
self.callback(payload)
logging.debug(' device reporting')
else:
logging.debug ('\n report on different device : ' + msg.topic)
logging.debug (payload)
logging.debug('\n')
elif msg.topic == self.topicResp:
#self.dataQueue.put(payload)
logging.debug (payload)
self.callback(payload)
#print('Device response:')
#print(payload)
elif msg.topic == self.topicReq:
logging.debug('publishing request' )
logging.debug (payload)
self.callback(payload) # is this needed????
logging.debug('device publishing')
logging.debug(payload)
else:
logging.debug(msg.topic, self.topicReport, self.topicReportAll )
if DEBUG:
f = open('packets.txt', 'a')
jsonStr = json.dumps(payload, sort_keys=True, indent=4, separators=(',', ': '))
f.write(jsonStr)
f.write('\n\n')
#json.dump(jsonStr, f)
f.close()
#logging.debug("Event:{0} Device:{1} State:{2}".format(event, self.device_hash[deviceId].get_name(), state))
def on_connect(self, client, userdata, flags, rc):
"""
Callback for connection to broker
"""
logging.debug("Connected with result code %s" % rc)
#logging.debug( client, userdata, flags)
try:
if (rc == 0):
logging.debug("Successfully connected to broker %s" % self.mqtt_url)
test1 = self.client.subscribe(self.topicResp)
#logging.debug(test1)
test2 = self.client.subscribe(self.topicReport)
#logging.debug(test2)
test3 = self.client.subscribe(self.topicReportAll)
#logging.debug(test3)
else:
logging.debug("Connection failed with result code %s" % rc)
sys.exit(2)
time.sleep(1)
logging.debug('Subscribe: ' + self.topicResp + ', '+self.topicReport+', '+ self.topicReportAll )
except Exception as E:
logging.error('Exception - on_connect: ' + str(E))
def on_disconnect(self, client, userdata,rc=0):
logging.debug('Disconnect - stop loop')
self.client.loop_stop()
def on_subscribe(self, client, userdata, mID, granted_QOS):
logging.debug('on_subscribe')
#logging.debug('on_subscribe called')
#logging.debug('client = ' + str(client))
#logging.debug('userdata = ' + str(userdata))
#logging.debug('mID = '+str(mID))
#logging.debug('Granted QoS: ' + str(granted_QOS))
#logging.debug('\n')
def on_publish(self, client, userdata, mID):
logging.debug('on_publish')
#logging.debug('client = ' + str(client))
#logging.debug('userdata = ' + str(userdata))
#logging.debug('mID = '+str(mID))
#logging.debug('\n')
def publish_data(self, data):
logging.debug('publish_data: ')
logging.debug(data)
try:
dataTemp = str(json.dumps(data))
logging.debug('Publishing: {}'.format(dataTemp))
result = self.client.publish(self.topicReq, dataTemp)
if result.rc == 0:
time.sleep(2)
except Exception as E:
logging.error('Exception - publish_data: ' + str(E))
def shut_down(self):
self.client.loop_stop()
'''
For use with API v2 and PAC/UAC authentication
'''
class YoLinkMQTTClientV2(object):
def __init__(self, yolink, deviceId, callback ):
self.callback = callback
#self.UaID = UaID
#self.houseID = houseID
self.uniqueID = deviceId
self.yolink = yolink  # store the API client; homeID and access_token are read from it
self.topicReq = self.yolink.homeID +'/'+ self.uniqueID +'/request'
self.topicResp = self.yolink.homeID+'/'+ self.uniqueID +'/response'
self.topicReport = self.yolink.homeID+'/'+ self.uniqueID +'/report'
self.topicReportAll = self.yolink.homeID+'/report'
#self.mqtt_port = int(mqtt_port)
#self.topic = topic
#self.mqtt_url = mqtt_url
#self.device_hash = device_hash
self.deviceId = deviceId
try:
print('initialize MQTT' )
self.client = mqtt.Client(self.uniqueID, clean_session=True, userdata=None, protocol=mqtt.MQTTv311, transport="tcp")
self.client.on_connect = self.on_connect
self.client.on_message = self.on_message
self.client.on_subscribe = self.on_subscribe
self.client.on_disconnect = self.on_disconnect
print('finish subscribing ')
except Exception as E:
logging.error('Exception - -init-: ' + str(E))
self.messagePending = False
logging.debug(self.deviceId)
#self.client.tls_set()
def connect_to_broker(self):
"""
Connect to MQTT broker
"""
try:
logging.info("Connecting to broker...")
self.client.username_pw_set(username=self.yolink.access_token, password=None)
self.client.connect(self.mqtt_url, self.mqtt_port, 30)
#time.sleep(3)
logging.debug ('connect:')
self.client.loop_start()
#self.client.loop_forever()
#logging.debug('loop started')
time.sleep(1)
except Exception as E:
logging.error('Exception - connect_to_broker: ' + str(E))
def on_message(self, client, userdata, msg):
"""
Callback for broker published events
"""
logging.debug('on_message')
#logging.debug(client)
#logging.debug(userdata)
#logging.debug(msg)
#logging.debug(msg.topic, msg.payload)
payload = json.loads(msg.payload.decode("utf-8"))
logging.debug('on_message')
logging.debug(payload)
if msg.topic == self.topicReportAll or msg.topic == self.topicReport:
if payload['deviceId'] == self.deviceId:
#self.eventQueue.put(payload['msgid'])
#self.dataQueue.put(payload)
logging.debug (payload)
self.callback(payload)
logging.debug(' device reporting')
else:
logging.debug ('\n report on different device : ' + msg.topic)
logging.debug (payload)
logging.debug('\n')
elif msg.topic == self.topicResp:
#self.dataQueue.put(payload)
logging.debug (payload)
self.callback(payload)
#print('Device response:')
#print(payload)
elif msg.topic == self.topicReq:
logging.debug('publishing request' )
logging.debug (payload)
self.callback(payload) # is this needed????
logging.debug('device publishing')
logging.debug(payload)
else:
logging.debug(msg.topic, self.topicReport, self.topicReportAll )
if DEBUG:
f = open('packets.txt', 'a')
jsonStr = json.dumps(payload, sort_keys=True, indent=4, separators=(',', ': '))
f.write(jsonStr)
f.write('\n\n')
#json.dump(jsonStr, f)
f.close()
#logging.debug("Event:{0} Device:{1} State:{2}".format(event, self.device_hash[deviceId].get_name(), state))
def on_connect(self, client, userdata, flags, rc):
"""
Callback for connection to broker
"""
logging.debug("Connected with result code %s" % rc)
#logging.debug( client, userdata, flags)
try:
if (rc == 0):
logging.debug("Successfully connected to broker %s" % self.mqtt_url)
test1 = self.client.subscribe(self.topicResp)
#logging.debug(test1)
test2 = self.client.subscribe(self.topicReport)
#logging.debug(test2)
test3 = self.client.subscribe(self.topicReportAll)
#logging.debug(test3)
else:
logging.debug("Connection failed with result code %s" % rc)
sys.exit(2)
time.sleep(1)
logging.debug('Subscribe: ' + self.topicResp + ', '+self.topicReport+', '+ self.topicReportAll )
except Exception as E:
logging.error('Exception - on_connect: ' + str(E))
def on_disconnect(self, client, userdata,rc=0):
logging.debug('Disconnect - stop loop')
self.client.loop_stop()
def on_subscribe(self, client, userdata, mID, granted_QOS):
logging.debug('on_subscribe')
#logging.debug('on_subscribe called')
#logging.debug('client = ' + str(client))
#logging.debug('userdata = ' + str(userdata))
#logging.debug('mID = '+str(mID))
#logging.debug('Granted QoS: ' + str(granted_QOS))
#logging.debug('\n')
def on_publish(self, client, userdata, mID):
logging.debug('on_publish')
#logging.debug('client = ' + str(client))
#logging.debug('userdata = ' + str(userdata))
#logging.debug('mID = '+str(mID))
#logging.debug('\n')
def publish_data(self, data):
logging.debug('publish_data: ')
logging.debug(data)
try:
dataTemp = str(json.dumps(data))
result = self.client.publish(self.topicReq, dataTemp)
if result.rc == 0:
time.sleep(2)
except Exception as E:
logging.error('Exception - publish_data: ' + str(E))
def shut_down(self):
self.client.loop_stop()
| 36.994318 | 130 | 0.56873 | 1,391 | 13,022 | 5.240115 | 0.127247 | 0.151461 | 0.03128 | 0.019756 | 0.898889 | 0.864042 | 0.853615 | 0.844835 | 0.836329 | 0.836329 | 0 | 0.005762 | 0.307019 | 13,022 | 352 | 131 | 36.994318 | 0.801973 | 0.174474 | 0 | 0.849765 | 0 | 0 | 0.099837 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.084507 | false | 0.00939 | 0.037559 | 0 | 0.131455 | 0.018779 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
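`connect_to_broker()` in the first client above derives the MQTT password as the hex MD5 digest of the CSSecKey. A self-contained sketch of that derivation — `'example-secret'` is a placeholder, not a real key:

```python
import hashlib

# Same derivation as connect_to_broker(): the MQTT password is the hex
# MD5 digest of the CSSecKey string.
def mqtt_password(csseckey):
    return hashlib.md5(csseckey.encode('utf-8')).hexdigest()

pw = mqtt_password('example-secret')
assert len(pw) == 32                           # MD5 hexdigest is 32 hex chars
assert pw == mqtt_password('example-secret')   # deterministic
```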
b65ee13bbf310767e8dc7b136fdc0679c3f73424 | 3,831 | py | Python | zip-brute-master/zipbrute.py | Zusyaku/Termux-And-Lali-Linux-V2 | b1a1b0841d22d4bf2cc7932b72716d55f070871e | [
"Apache-2.0"
] | 2 | 2021-11-17T03:35:03.000Z | 2021-12-08T06:00:31.000Z | zip-brute-master/zipbrute.py | Zusyaku/Termux-And-Lali-Linux-V2 | b1a1b0841d22d4bf2cc7932b72716d55f070871e | [
"Apache-2.0"
] | null | null | null | zip-brute-master/zipbrute.py | Zusyaku/Termux-And-Lali-Linux-V2 | b1a1b0841d22d4bf2cc7932b72716d55f070871e | [
"Apache-2.0"
] | 2 | 2021-11-05T18:07:48.000Z | 2022-02-24T21:25:07.000Z | #Author: AnonyminHack5
#Whatsapp: KzIzNDkwMzM2Nzc10DkK (Decrypt to know my number)
#Do not try to modify or change the script!!
#Language: Python2
#Contact me if you face issues: AnonyminHack5@protonmail.com
import base64
exec(base64.b64decode('aW1wb3J0IG9zCmltcG9ydCBjb2xvcmFtYQpjb2xvcmFtYS5pbml0KCkgCmRlZiBjbHMoKToKCWxpbnV4ID0gJ2NsZWFyJwoJd2luZG93cyA9ICdjbHMnCglvcy5zeXN0ZW0oW2xpbnV4LCB3aW5kb3dzXVtvcy5uYW1lID09ICdudCddKQpjbHMoKQpiYW5uZXIgPSAiIiJcMDMzWzE7MzM7NDBtCiBfX19fX18gICAgICAgICAgICAgXyAgICAgICAgICAgICAgICBfICAgICAgICAgICAgCnxfXyAgKF8pXyBfXyAgICAgICB8IHxfXyAgXyBfXyBfICAgX3wgfF8gX19fIF8gX18gCiAgLyAvfCB8ICdfIFwgX19fX198ICdfIFx8ICdfX3wgfCB8IHwgX18vIF8gXCAnX198CiAvIC9ffCB8IHxfKSB8X19fX198IHxfKSB8IHwgIHwgfF98IHwgfHwgIF9fLyB8ICAgCi9fX19ffF98IC5fXy8gICAgICB8Xy5fXy98X3wgICBcX18sX3xcX19cX19ffF98ICAgCiAgICAgICB8X3wgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgClwwMzNbMG0gXG4iIiIKYmFubmVyICs9ICdcMDMzWzE7MzY7NDBtIEF1dGhvcjogQW5vbnltaW5IYWNrNSBcMDMzWzBtIFxuJwpiYW5uZXIgKz0gJ1wwMzNbMTszNzs0MG0gR2l0aHViOiBUZXJtdXhIYWNreiBcMDMzWzBtIFxuJwpiYW5uZXIgKz0gJ1wwMzNbMTszMjs0MG0gVGVsZWdyYW06IGh0dHBzOi8vdC5tZS9Bbm9ueW1pbkhhY2s1IFwwMzNbMG0gXG4nCmJhbm5lciArPSAnXG4nCmJhbm5lciArPSAnIFsxXSBaaXAgUGFzc3dvcmQgQ3JhY2tlclxuJwpiYW5uZXIgKz0gJyBbMl0gVXBkYXRlIENyYWNrZXJcbicKYmFubmVyICs9ICcgWzBdIEV4aXRcbicKcHJpbnQgYmFubmVyCgphPWlucHV0KCIgWz9dIEVudGVyIE51bWJlciA6ICIpCmlmIGE9PTA6CiBpbXBvcnQgb3MKIGNscygpCiBwcmludCAiIFshXSBHb29kIEJ5ZSwgU2VlIHlvdSBzb29uIgogcXVpdCgpCmVsaWYgYT09MjoKCWltcG9ydCBvcwoJaW1wb3J0IHRpbWUKCWNscygpIAoJcHJpbnQgIlwwMzNbMjszNzs0MG0gWmlwIENyYWNrZXIgd2lsbCBzdGFydCB1cGRhdGluZyBOb3cuLiBcMDMzWzBtIFxuIgoJdGltZS5zbGVlcCgzKQoJb3Muc3lzdGVtKCJjZCAkSE9NRSIpIAoJb3Muc3lzdGVtKCJybSAtcmYgemlwLWJydXRlZm9yY2VyIikgCglvcy5zeXN0ZW0oImdpdCBjbG9uZSBodHRwczovL2dpdGh1Yi5jb20vVGVybXV4SGFja3ovemlwLWJydXRlZm9yY2VyIikgCglvcy5zeXN0ZW0oImNkIHppcC1icnV0ZWZvcmNlciIpIAoJcHJpbnQgIlwwMzNbMTszNTs0MG0gVXBkYXRlIENvbXBsZXRlIE5vdyB0eXBlIHB5dGhvbjIgemlwLWJydXRlLnB5IFwwMzNbMG0gXG4iCglxdWl0KCkgCmVsaWYgYT09MToKICMhL3Vzci9iaW4vcHl0aG9uCgogaW1wb3J0IHppcGZpbGUKIGltcG9ydCBvcwogZnJvbSB0aW1lIGltcG9ydCB0aW1lCiAKIGNscygpCiB0ZXh0emlwcGFzcyA9ICcnJwogX19fX19fICAgICAgICAgICAgIF8gICAgICAgICAgICAgICAgXyAgICAgICAgICAgIAp8X18gIChfKV8gX18gICAgICAgfCB8X18gIF8gX
18gXyAgIF98IHxfIF9fXyBfIF9fIAogIC8gL3wgfCAnXyBcIF9fX19ffCAnXyBcfCAnX198IHwgfCB8IF9fLyBfIFwgJ19ffAogLyAvX3wgfCB8XykgfF9fX19ffCB8XykgfCB8ICB8IHxffCB8IHx8ICBfXy8gfCAgIAovX19fX3xffCAuX18vICAgICAgfF8uX18vfF98ICAgXF9fLF98XF9fXF9fX3xffCAgIAogICAgICAgfF98ICAgICAoVG9wIFNwZWVkKSAKICcnJwogcHJpbnQgdGV4dHppcHBhc3MKIGZpbGVfcGF0aCA9IHJhd19pbnB1dCAoIiBbK10gWklQIEZpbGUgQWRkcmVzczogIikKIHByaW50ICIiCiB3b3JkX2xpc3QgPSByYXdfaW5wdXQgKCIgWytdIFBhc3N3b3JkIExpc3QgQWRkcmVzczogIikKIHByaW50ICJcMDMzWzE7MzRtIFsqXSBXYWl0IGFzIGkgYW0gZ2V0dGluZyB0aGUgcGFzc3dvcmQuLi5bKl0gIgogZGVmIG1haW4oZmlsZV9wYXRoLCB3b3JkX2xpc3QpOgoJdHJ5OgoJCXppcF8gPSB6aXBmaWxlLlppcEZpbGUoZmlsZV9wYXRoKQoJZXhjZXB0IHppcGZpbGUuQmFkWmlwZmlsZToKCQlwcmludCAiIFshXSBQbGVhc2UgY2hlY2sgdGhlIGZpbGUncyBQYXRoLiBJdCBkb2Vzbid0IHNlZW0gdG8gYmUgYSBaSVAgZmlsZS4iCgkJcXVpdCgpCgoJcGFzc3dvcmQgPSBOb25lIAoJaSA9IDAgCgljX3QgPSB0aW1lKCkKCXdpdGggb3Blbih3b3JkX2xpc3QsICJyIikgYXMgZjogCgkJcGFzc2VzID0gZi5yZWFkbGluZXMoKSAKCQlmb3IgeCBpbiBwYXNzZXM6CgkJCWkgKz0gMQoJCQlwYXNzd29yZCA9IHguc3BsaXQoIlxuIilbMF0gIAoJCQl0cnk6CgkJCQl6aXBfLmV4dHJhY3RhbGwocHdkPXBhc3N3b3JkKQoJCQkJdF90ID0gdGltZSgpIC0gY190IAoJCQkJcHJpbnQgIlxuIFsqXSBQYXNzd29yZCBGb3VuZCA6KVxuIiArICIgWypdIFRoZSBwYXNzd29yZCBvZiB0aGUgemlwIGlzOiBcMDMzWzE7OTNtIitwYXNzd29yZCsiXDAzM1swbVxuIiAKCQkJCXByaW50ICIgWyoqKl0gVG9vayAlZiBzZWNvbmRzIHRvIGNyYWNrIHRoZSBQYXNzd29yZC4gVGhhdCBpcywgJWkgYXR0ZW1wdHMgcGVyIHNlY29uZC4iICUgKHRfdCxpL3RfdCkKCQkJCXF1aXQoKQoJCQlleGNlcHQgRXhjZXB0aW9uOgoJCQkJcGFzcwoJCXByaW50ICJcMDMzWzE7MzdtIFtYXSBTb3JyeSwgUGFzc3dvcmQgTm90IEZvdW5kLCB0cnkgdXNpbmcgYW5vdGhlciB3b3JkbGlzdCIKCQlxdWl0KCkKIG1haW4oZmlsZV9wYXRoLCB3b3JkX2xpc3Qp'))
| 348.272727 | 3,602 | 0.984599 | 35 | 3,831 | 107.771429 | 0.914286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11281 | 0.009658 | 3,831 | 10 | 3,603 | 383.1 | 0.881392 | 0.052467 | 0 | 0 | 0 | 0 | 0.9873 | 0.9873 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
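The row above stores its file body base64-encoded in the `content` column and wraps it in `exec`. A minimal sketch of decoding such a payload for inspection instead of executing it — the payload string here is a small stand-in, not the row's actual blob:

```python
import base64

# Stand-in payload; the real row's content column holds a much larger blob.
payload = base64.b64encode(b"print('hello')").decode("ascii")

# Decode for inspection only; calling exec() on untrusted payloads is unsafe.
source = base64.b64decode(payload).decode("utf-8")
```

Round-tripping through `b64encode`/`b64decode` recovers the original bytes exactly, so the decoded source can be reviewed before anything is run.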
b670da7cf24fc37615d51b831fff74a8346ad82c | 10,220 | py | Python | kea/utils/test_pulse_synchroniser.py | SmartAcoustics/Kea | 5790f18dafccfc01fe9dbe98de5bb1a5ce584c56 | [
"BSD-3-Clause-Clear",
"BSD-3-Clause"
] | 3 | 2020-02-28T13:03:59.000Z | 2020-09-20T06:33:04.000Z | kea/utils/test_pulse_synchroniser.py | SmartAcoustics/Kea | 5790f18dafccfc01fe9dbe98de5bb1a5ce584c56 | [
"BSD-3-Clause-Clear",
"BSD-3-Clause"
] | null | null | null | kea/utils/test_pulse_synchroniser.py | SmartAcoustics/Kea | 5790f18dafccfc01fe9dbe98de5bb1a5ce584c56 | [
"BSD-3-Clause-Clear",
"BSD-3-Clause"
] | 3 | 2018-12-17T16:33:08.000Z | 2020-01-21T14:10:25.000Z | from myhdl import *
import veriutils
import random
from ._pulse_synchroniser import pulse_synchroniser
from kea.test_utils.base_test import (
KeaTestCase, KeaVivadoVHDLTestCase, KeaVivadoVerilogTestCase)
class TestPulseSynchroniserSimulation(KeaTestCase):
def setUp(self):
self.trigger_clock = Signal(False)
self.output_clock = Signal(False)
self.trigger = Signal(False)
self.synchronised_pulse_output = Signal(False)
self.busy = Signal(False)
self.default_args = {
'trigger_clock': self.trigger_clock,
'output_clock': self.output_clock,
'trigger': self.trigger,
'output': self.synchronised_pulse_output,
'busy': self.busy,
}
self.default_arg_types = {
'trigger_clock': 'clock',
'output_clock': 'output',
'trigger': 'custom',
'output': 'output',
'busy': 'output',
}
def test_high_to_low_freq_cdc(self):
''' When the ``trigger`` signal pulses high for one ``trigger_clock``
cycle, the system should output one high pulse on the ``output`` for
one ``output_clock`` cycle.
The system should set busy high and ignore any pulses on trigger
whilst it is performing the pulse synchronisation.
The above is encapsulated in the following timing diagram
(defined in Wavedrom):
{ "signal": [
{ "name": "trigger clock",
"wave": "p..................."},
{ "name": "output clock",
"wave": "p.........",
"period": 2 },
{ "name": "trigger",
"wave": "010................." },
{ "name": "trigger pulse detected",
"wave": "0.1........0........" },
{ "name": "output pipeline 0",
"wave": "0.1...0...",
"period": 2 },
{ "name": "output pipeline 1",
"wave": "0..1...0..",
"period": 2 },
{ "name": "output pipeline 2",
"wave": "0...1...0.",
"period": 2 },
{ "name": "acknowledge pipeline 0",
"wave": "0........1.......0.." },
{ "name": "acknowledge pipeline 1",
"wave": "0.........1.......0." },
{ "name": "busy",
"wave": "0.1...............0.",},
{ "name": "output",
"wave": "0...10....",
"period": 2,},
]}
'''
args = self.default_args.copy()
arg_types = self.default_arg_types.copy()
cycles = 5000
test_confirmation = {'tests_run': 0}
# Set the output clock period making sure it is longer than the
# trigger clock period
trigger_clock_period = veriutils.cosimulation.PERIOD
output_clock_period = random.randrange(
trigger_clock_period+1, 2*trigger_clock_period)
@block
def dut_wrapper(trigger_clock, output_clock, trigger, output, busy):
# Create the output clock source
output_clock_source = veriutils.clock_source(
output_clock, output_clock_period)
# Create the DUT
pulse_cdc_block = pulse_synchroniser(
trigger_clock, output_clock, trigger, output, busy)
return output_clock_source, pulse_cdc_block
@block
def test():
test_data = {'expected_output_pipeline': [False, False],
'expected_output': False,}
trigger_sent = Signal(False)
trigger_sent_d0 = Signal(False)
@always(self.trigger_clock.posedge)
def trigger_driver():
# Randomly pulse the trigger signal
if self.trigger:
self.trigger.next = False
if not self.busy:
# If the system is not busy then we should get a pulse
# on the output so set up a check.
trigger_sent.next = True
test_confirmation['tests_run'] += 1
elif random.random() < 0.1:
self.trigger.next = True
@always(self.output_clock.posedge)
def check():
trigger_sent_d0.next = trigger_sent
if trigger_sent_d0:
# A trigger has been sent so we expect to see a pulse on
# the output
trigger_sent.next = False
trigger_sent_d0.next = False
test_data['expected_output_pipeline'].append(True)
else:
test_data['expected_output_pipeline'].append(False)
test_data['expected_output'] = (
test_data['expected_output_pipeline'].pop(0))
# Check the output
self.assertTrue(
                test_data['expected_output'] ==
self.synchronised_pulse_output)
#NOTE The busy signal is checked implicitly as we only add a
# pulse to the expected output if busy is low.
return trigger_driver, check
dut_outputs, ref_outputs = self.cosimulate(
cycles, dut_wrapper, dut_wrapper, args, arg_types,
custom_sources=[(test, (),{})])
        assert test_confirmation['tests_run'] >= 5
self.assertTrue(dut_outputs == ref_outputs)
def test_low_to_high_freq_cdc(self):
''' When the ``trigger`` signal pulses high for one ``trigger_clock``
cycle, the system should output one high pulse on the ``output`` for
one ``output_clock`` cycle.
The system should set busy high and ignore any pulses on trigger
whilst it is performing the pulse synchronisation.
The above is encapsulated in the following timing diagram
(defined in Wavedrom):
{ "signal": [
{ "name": "trigger clock",
"wave": "p..........",
"period": 2 },
{ "name": "output clock",
"wave": "p....................."},
{ "name": "trigger",
"wave": "010........",
"period": 2 },
{ "name": "trigger pulse detected",
"wave": "0.1...0....",
"period": 2 },
{ "name": "output pipeline 0",
"wave": "0....1.......0........" },
{ "name": "output pipeline 1",
"wave": "0.....1.......0......." },
{ "name": "output pipeline 2",
"wave": "0......1.......0......" },
{ "name": "acknowledge pipeline 0",
"wave": "0...1...0..",
"period": 2 },
{ "name": "acknowledge pipeline 1",
"wave": "0....1...0.",
"period": 2 },
{ "name": "busy",
"wave": "0.1......0.",
"period": 2},
{ "name": "output",
"wave": "0......10............."},
]}
'''
args = self.default_args.copy()
arg_types = self.default_arg_types.copy()
cycles = 5000
test_confirmation = {'tests_run': 0}
# Set the output clock period making sure it is shorter than the
# trigger clock period
trigger_clock_period = veriutils.cosimulation.PERIOD
output_clock_period = random.randrange(1, trigger_clock_period)
@block
def dut_wrapper(trigger_clock, output_clock, trigger, output, busy):
# Create the output clock source
output_clock_source = veriutils.clock_source(
output_clock, output_clock_period)
# Create the DUT
pulse_cdc_block = pulse_synchroniser(
trigger_clock, output_clock, trigger, output, busy)
return output_clock_source, pulse_cdc_block
@block
def test():
test_data = {'expected_output_pipeline': [False, False],
'expected_output': False,}
trigger_sent = Signal(False)
trigger_sent_d0 = Signal(False)
@always(self.trigger_clock.posedge)
def trigger_driver():
# Randomly pulse the trigger signal
if self.trigger:
self.trigger.next = False
if not self.busy:
# If the system is not busy then we should get a pulse
# on the output so set up a check.
trigger_sent.next = True
test_confirmation['tests_run'] += 1
elif random.random() < 0.1:
self.trigger.next = True
@always(self.output_clock.posedge)
def check():
trigger_sent_d0.next = trigger_sent
if trigger_sent_d0:
# A trigger has been sent so we expect to see a pulse on
# the output
trigger_sent.next = False
trigger_sent_d0.next = False
test_data['expected_output_pipeline'].append(True)
else:
test_data['expected_output_pipeline'].append(False)
test_data['expected_output'] = (
test_data['expected_output_pipeline'].pop(0))
# Check the output
self.assertTrue(
                test_data['expected_output'] ==
self.synchronised_pulse_output)
#NOTE The busy signal is checked implicitly as we only add a
# pulse to the expected output if busy is low.
return trigger_driver, check
dut_outputs, ref_outputs = self.cosimulate(
cycles, dut_wrapper, dut_wrapper, args, arg_types,
custom_sources=[(test, (),{})])
        assert test_confirmation['tests_run'] >= 5
self.assertTrue(dut_outputs == ref_outputs)
class TestPulseSynchroniserVivadoVhdlSimulation(
KeaVivadoVHDLTestCase, TestPulseSynchroniserSimulation):
pass
class TestPulseSynchroniserVivadoVerilogSimulation(
KeaVivadoVerilogTestCase, TestPulseSynchroniserSimulation):
pass
| 31.446154 | 78 | 0.522505 | 1,039 | 10,220 | 4.956689 | 0.130895 | 0.059806 | 0.016311 | 0.019029 | 0.83068 | 0.816505 | 0.805437 | 0.80466 | 0.765049 | 0.72466 | 0 | 0.016122 | 0.356654 | 10,220 | 324 | 79 | 31.54321 | 0.767148 | 0.315166 | 0 | 0.727273 | 0 | 0 | 0.07042 | 0.030113 | 0 | 0 | 0 | 0 | 0.045455 | 1 | 0.083333 | false | 0.015152 | 0.037879 | 0 | 0.174242 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
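The docstrings in the test file above describe the clock-domain-crossing scheme: a detected trigger is walked through a pipeline in the output clock domain, the last stages produce a single-cycle output pulse, and `busy` masks new triggers until the acknowledge returns. A behavioural sketch in plain Python of just the pipeline/edge-detect idea — an illustrative assumption, not the real MyHDL `pulse_synchroniser` block, which is not shown in this dump:

```python
# One output-clock tick: shift `detected` into a 3-stage pipeline; the
# output pulse is the rising edge between the last two pipeline stages,
# so a trigger held high still yields exactly one single-cycle pulse.
def synchronise_pulse(detected, pipeline):
    new_pipeline = [detected] + pipeline[:-1]
    pulse = new_pipeline[-2] and not new_pipeline[-1]
    return new_pipeline, pulse

pipeline = [False, False, False]
outputs = []
# Trigger-detected flag held high for four output-clock cycles, then low.
for detected in [True, True, True, True, False, False, False]:
    pipeline, out = synchronise_pulse(detected, pipeline)
    outputs.append(out)
```

This mirrors what the `check` generators in the tests assert: one trigger accepted while not busy corresponds to exactly one `True` on the output.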
1e33b101e848825576ab6de0207956ef1b95f1ef | 37 | py | Python | source_code/manager.py | Wiolarz/Console_PY_dungeon | cbf3b9a68251b9ce620aac1f4ca36361160186ea | [
"Apache-2.0"
] | null | null | null | source_code/manager.py | Wiolarz/Console_PY_dungeon | cbf3b9a68251b9ce620aac1f4ca36361160186ea | [
"Apache-2.0"
] | 2 | 2021-11-29T16:26:03.000Z | 2021-11-29T16:27:14.000Z | source_code/manager.py | Wiolarz/Console_PY_dungeon | cbf3b9a68251b9ce620aac1f4ca36361160186ea | [
"Apache-2.0"
] | null | null | null | def choice():
return int(input()) | 18.5 | 23 | 0.621622 | 5 | 37 | 4.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189189 | 37 | 2 | 23 | 18.5 | 0.766667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
1e3a8b6bffb6f88a5f737dcb283f6dfc0e2c7eda | 123 | py | Python | 2021.12.21/Distilling Knowledge via Knowledge Review/code/Detection/model/backbone/__init__.py | ToniChopp/MIRACLE-Paper-Sharing-Album | 72a3843101483fc8b53df2746c488da066eda2a1 | [
"MIT"
] | 7 | 2021-11-01T08:44:06.000Z | 2022-01-10T09:42:34.000Z | 2021.12.21/Distilling Knowledge via Knowledge Review/code/Detection/model/backbone/__init__.py | ToniChopp/MIRACLE-Paper-Sharing-Album | 72a3843101483fc8b53df2746c488da066eda2a1 | [
"MIT"
] | null | null | null | 2021.12.21/Distilling Knowledge via Knowledge Review/code/Detection/model/backbone/__init__.py | ToniChopp/MIRACLE-Paper-Sharing-Album | 72a3843101483fc8b53df2746c488da066eda2a1 | [
"MIT"
] | 1 | 2021-11-16T16:31:05.000Z | 2021-11-16T16:31:05.000Z | from .resnet import build_resnet_backbone_kd
from .fpn import build_resnet_fpn_backbone_kd, build_mobilenetv2_fpn_backbone
| 41 | 77 | 0.902439 | 19 | 123 | 5.315789 | 0.421053 | 0.217822 | 0.336634 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008772 | 0.073171 | 123 | 2 | 78 | 61.5 | 0.877193 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1e46f1df5a23443d8d6d5db61d60eebc1981a660 | 10,753 | py | Python | ietf/group/migrations/0003_groupfeatures_data.py | hassanakbar4/ietfdb | cabee059092ae776015410640226064331c293b7 | [
"BSD-3-Clause"
] | 25 | 2022-03-05T08:26:52.000Z | 2022-03-30T15:45:42.000Z | ietf/group/migrations/0003_groupfeatures_data.py | hassanakbar4/ietfdb | cabee059092ae776015410640226064331c293b7 | [
"BSD-3-Clause"
] | 219 | 2022-03-04T17:29:12.000Z | 2022-03-31T21:16:14.000Z | ietf/group/migrations/0003_groupfeatures_data.py | hassanakbar4/ietfdb | cabee059092ae776015410640226064331c293b7 | [
"BSD-3-Clause"
] | 22 | 2022-03-04T15:34:34.000Z | 2022-03-28T13:30:59.000Z | # Copyright The IETF Trust 2018-2020, All Rights Reserved
# -*- coding: utf-8 -*-
# Generated by Django 1.11.13 on 2018-07-10 15:58
from django.conf import settings
from django.db import migrations
import debug # pyflakes:ignore
from ietf.review.utils import active_review_teams
group_type_features = {
'ag': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': 'ietf',
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': True,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'area': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': 'ietf',
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': False,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'dir': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair,secr',
'agenda_type': None,
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': False,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'review': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair,secr',
'agenda_type': None,
'customize_workflow': False,
'default_tab': 'ietf.group.views.review_requests',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': False,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': True,
'material_types': 'slides'},
'iab': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': 'ietf',
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': True,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'ietf': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': 'ietf',
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': True,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'individ': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': None,
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': False,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'irtf': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': 'ietf',
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': False,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'isoc': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': None,
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': False,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'nomcom': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': 'side',
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': False,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'program': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'lead',
'agenda_type': None,
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': True,
'has_meetings': False,
'has_nonsession_materials': False,
'has_milestones': True,
'has_reviews': False,
'material_types': 'slides'},
'rfcedtyp': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': 'side',
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': False,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'rg': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': 'ietf',
'customize_workflow': True,
'default_tab': 'ietf.group.views.group_documents',
'has_chartering_process': True,
'has_default_jabber': True,
'has_dependencies': True,
'has_documents': True,
'has_meetings': True,
'has_nonsession_materials': False,
'has_milestones': True,
'has_reviews': False,
'material_types': 'slides'},
'sdo': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': None,
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': False,
'has_nonsession_materials': False,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'team': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': 'ietf',
'customize_workflow': False,
'default_tab': 'ietf.group.views.group_about',
'has_chartering_process': False,
'has_default_jabber': False,
'has_dependencies': False,
'has_documents': False,
'has_meetings': True,
'has_nonsession_materials': True,
'has_milestones': False,
'has_reviews': False,
'material_types': 'slides'},
'wg': {
'about_page': 'ietf.group.views.group_about',
'admin_roles': 'chair',
'agenda_type': 'ietf',
'customize_workflow': True,
'default_tab': 'ietf.group.views.group_documents',
'has_chartering_process': True,
'has_default_jabber': True,
'has_dependencies': True,
'has_documents': True,
'has_meetings': True,
'has_nonsession_materials': False,
'has_milestones': True,
'has_reviews': False,
'material_types': 'slides'},
}
def forward(apps, schema_editor):
Group = apps.get_model('group', 'Group')
GroupTypeName = apps.get_model('name', 'GroupTypeName')
GroupFeatures = apps.get_model('group', 'GroupFeatures')
AgendaTypeName = apps.get_model('name', 'AgendaTypeName')
for type in group_type_features:
features = group_type_features[type]
features['type_id'] = type
if features['agenda_type']:
features['agenda_type'] = AgendaTypeName.objects.get(slug=features['agenda_type'])
GroupFeatures.objects.create(**features)
dir = GroupTypeName.objects.get(slug='dir')
review = GroupTypeName.objects.create(slug='review', name='Directorate (with reviews)', desc='', used=True, order=0)
review_teams = [ g.acronym for g in active_review_teams() ]
for group in Group.objects.filter(type=dir):
if group.acronym in review_teams:
group.type = review
group.save()
def reverse(apps, schema_editor):
Group = apps.get_model('group', 'Group')
GroupFeatures = apps.get_model('group', 'GroupFeatures')
GroupTypeName = apps.get_model('name', 'GroupTypeName')
dir = GroupTypeName.objects.get(slug='dir')
review = GroupTypeName.objects.get(slug='review')
for group in Group.objects.filter(type=review):
group.type = dir
group.save()
for entry in GroupFeatures.objects.all():
entry.delete()
review.delete()
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('group', '0002_groupfeatures_historicalgroupfeatures'),
('name', '0003_agendatypename_data'),
]
operations = [
migrations.RunPython(forward, reverse),
]
| 35.843333 | 120 | 0.606714 | 1,110 | 10,753 | 5.574775 | 0.115315 | 0.120233 | 0.072398 | 0.095184 | 0.828378 | 0.827731 | 0.800259 | 0.789916 | 0.789916 | 0.739011 | 0 | 0.004374 | 0.255929 | 10,753 | 299 | 121 | 35.963211 | 0.769029 | 0.013113 | 0 | 0.807018 | 1 | 0 | 0.447723 | 0.161214 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007018 | false | 0 | 0.014035 | 0 | 0.031579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
1e534f641ad91463788acec0d09e29b9e0f94c85 | 6,732 | py | Python | pages/migrations/0002_auto_20201019_1319.py | yusufom/marlymart | 06088af43e6f78b7385c1cf7ea5b4b68337360d8 | [
"Unlicense"
] | null | null | null | pages/migrations/0002_auto_20201019_1319.py | yusufom/marlymart | 06088af43e6f78b7385c1cf7ea5b4b68337360d8 | [
"Unlicense"
] | null | null | null | pages/migrations/0002_auto_20201019_1319.py | yusufom/marlymart | 06088af43e6f78b7385c1cf7ea5b4b68337360d8 | [
"Unlicense"
] | null | null | null | # Generated by Django 2.1.7 on 2020-10-19 12:19
import ckeditor_uploader.fields
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('pages', '0001_initial'),
]
operations = [
migrations.RenameField(
model_name='homesetting',
old_name='black_quote',
new_name='Buttongroup1',
),
migrations.RenameField(
model_name='homesetting',
old_name='buyer',
new_name='Buttongroup2',
),
migrations.RenameField(
model_name='homesetting',
old_name='headline2',
new_name='Buttongroup3',
),
migrations.RenameField(
model_name='homesetting',
old_name='headline3',
new_name='aimh2',
),
migrations.RenameField(
model_name='homesetting',
old_name='aim',
new_name='aimp',
),
migrations.RenameField(
model_name='homesetting',
old_name='seller',
new_name='featuredPH1',
),
migrations.RenameField(
model_name='homesetting',
old_name='head2Qwords',
new_name='slider1P',
),
migrations.RenameField(
model_name='homesetting',
old_name='head3Qwords',
new_name='slider2P',
),
migrations.RemoveField(
model_name='homesetting',
name='contact',
),
migrations.RemoveField(
model_name='homesetting',
name='question',
),
migrations.RemoveField(
model_name='homesetting',
name='whatWeDo',
),
migrations.AddField(
model_name='homesetting',
name='aimimg',
field=models.ImageField(blank=True, null=True, upload_to=''),
),
migrations.AddField(
model_name='homesetting',
name='contactformimg',
field=models.ImageField(blank=True, null=True, upload_to=''),
),
migrations.AddField(
model_name='homesetting',
name='featuredPP',
field=models.CharField(blank=True, max_length=500),
),
migrations.AddField(
model_name='homesetting',
name='front',
field=models.ImageField(blank=True, null=True, upload_to=''),
),
migrations.AddField(
model_name='homesetting',
name='linkedin',
field=models.CharField(blank=True, max_length=350),
),
migrations.AddField(
model_name='homesetting',
name='offerbox1',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='offerbox2',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='offerbox3',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='offerbox4',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='offerbox5',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='offerbox6',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='offerbox7',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='slider1A',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='slider1H1',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='slider1img',
field=models.ImageField(blank=True, null=True, upload_to=''),
),
migrations.AddField(
model_name='homesetting',
name='slider2A',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='slider2H1',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='slider2img',
field=models.ImageField(blank=True, null=True, upload_to=''),
),
migrations.AddField(
model_name='homesetting',
name='slider3A',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='slider3H1',
field=models.CharField(blank=True, max_length=150),
),
migrations.AddField(
model_name='homesetting',
name='slider3P',
field=ckeditor_uploader.fields.RichTextUploadingField(blank=True, max_length=500),
),
migrations.AddField(
model_name='homesetting',
name='slider3img',
field=models.ImageField(blank=True, null=True, upload_to=''),
),
migrations.AddField(
model_name='homesetting',
name='whatsapp',
field=models.CharField(blank=True, max_length=350),
),
migrations.AlterField(
model_name='homesetting',
name='address',
field=ckeditor_uploader.fields.RichTextUploadingField(blank=True, max_length=150),
),
migrations.AlterField(
model_name='homesetting',
name='facebook',
field=models.CharField(blank=True, max_length=350),
),
migrations.AlterField(
model_name='homesetting',
name='instagram',
field=models.CharField(blank=True, max_length=350),
),
migrations.AlterField(
model_name='homesetting',
name='twitter',
field=models.CharField(blank=True, max_length=350),
),
]
| 33.326733 | 95 | 0.52852 | 556 | 6,732 | 6.244604 | 0.172662 | 0.098502 | 0.218894 | 0.207373 | 0.822581 | 0.822581 | 0.748848 | 0.632488 | 0.62068 | 0.572581 | 0 | 0.026188 | 0.359031 | 6,732 | 201 | 96 | 33.492537 | 0.778447 | 0.006684 | 0 | 0.707692 | 1 | 0 | 0.127853 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.010256 | 0 | 0.025641 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1e623a3ea57579f030f714653f149ef81aa47b8d | 117 | py | Python | jupyterlabpymolpysnips/Count/numResiNucleicChainA.py | MooersLab/pymolpysnips | 50a89c85adf8006d85c1d6cd3f8aad7e440a0b92 | [
"MIT"
] | null | null | null | jupyterlabpymolpysnips/Count/numResiNucleicChainA.py | MooersLab/pymolpysnips | 50a89c85adf8006d85c1d6cd3f8aad7e440a0b92 | [
"MIT"
] | null | null | null | jupyterlabpymolpysnips/Count/numResiNucleicChainA.py | MooersLab/pymolpysnips | 50a89c85adf8006d85c1d6cd3f8aad7e440a0b92 | [
"MIT"
] | null | null | null | cmd.do('sel = 'chain A and polymer.nucleic'; print(len(set([(i.resi, i.resn) for i in cmd.get_model(sel).atom])));')
| 58.5 | 116 | 0.65812 | 23 | 117 | 3.304348 | 0.826087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102564 | 117 | 1 | 117 | 117 | 0.72381 | 0 | 0 | 0 | 0 | 1 | 0.65812 | 0.435897 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
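The snippet above counts unique residues by collecting `(resi, resn)` pairs from every atom into a set. The same counting idea without PyMOL, over made-up atom records (the `atoms` list is illustrative, not real structure data):

```python
# Each record carries a residue number and residue name; several atoms
# share a residue, so unique residues = unique (resi, resn) pairs.
atoms = [("1", "DA"), ("1", "DA"), ("2", "DG"), ("3", "DC"), ("3", "DC")]
n_residues = len({(resi, resn) for resi, resn in atoms})
```

Deduplicating on the pair rather than on `resi` alone guards against the (rare) case of the same residue number carrying different residue names.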
1e700897a8fda42dbf25549ada1db5bfa23d6d76 | 98 | py | Python | skyamqp/rpc/__init__.py | dmvuong95/py-amqp-client | ec11e3b054e3f917504cb667c434dd7be948e91a | [
"MIT"
] | null | null | null | skyamqp/rpc/__init__.py | dmvuong95/py-amqp-client | ec11e3b054e3f917504cb667c434dd7be948e91a | [
"MIT"
] | null | null | null | skyamqp/rpc/__init__.py | dmvuong95/py-amqp-client | ec11e3b054e3f917504cb667c434dd7be948e91a | [
"MIT"
] | null | null | null | from skyamqp.rpc.server import RPC_Server_Thread
from skyamqp.rpc.client import RPC_Client_Thread
| 32.666667 | 48 | 0.877551 | 16 | 98 | 5.125 | 0.4375 | 0.268293 | 0.341463 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081633 | 98 | 2 | 49 | 49 | 0.911111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1e7c536fc6c6d926c9f817c15205a2c8544c7176 | 163 | py | Python | tests/test_version.py | nismod/water_demand | e95670540c14f0d501b6d446e749b5640828bbec | [
"MIT"
] | 1 | 2021-03-31T03:00:08.000Z | 2021-03-31T03:00:08.000Z | tests/test_version.py | nismod/water_demand | e95670540c14f0d501b6d446e749b5640828bbec | [
"MIT"
] | 1 | 2019-06-12T15:12:45.000Z | 2019-06-12T15:12:45.000Z | tests/test_version.py | nismod/water_demand | e95670540c14f0d501b6d446e749b5640828bbec | [
"MIT"
] | null | null | null | import water_demand
def test_version():
assert water_demand.version() == (2, 1, 1)
    assert water_demand.version(formatted=True) == 'water_demand v2.1.1'
| 23.285714 | 73 | 0.705521 | 24 | 163 | 4.583333 | 0.5 | 0.4 | 0.309091 | 0.436364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043165 | 0.147239 | 163 | 6 | 74 | 27.166667 | 0.748201 | 0 | 0 | 0 | 0 | 0 | 0.116564 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.25 | true | 0 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
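The test above implies a `version()` helper that returns a tuple by default and a prefixed string when `formatted=True`. A minimal sketch of that assumed shape — the real `water_demand` module is not shown in this dump, so the implementation below is a guess from the test alone:

```python
# Hypothetical stand-in for water_demand.version().
VERSION = (2, 1, 1)

def version(formatted=False):
    # Return the raw tuple, or a "water_demand vX.Y.Z" string when asked.
    if formatted:
        return "water_demand v" + ".".join(str(part) for part in VERSION)
    return VERSION
```

Keeping the tuple as the single source of truth means the formatted string can never drift out of sync with the numeric version.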
1ea044c7e598a45bd05da5f8c1d7cc77d28e2159 | 6,635 | py | Python | tests/test_util.py | amedyukhina/intake_io | 1e14fecd76dfa615c82a18450548f6ac3817392f | [
"MIT"
] | 2 | 2021-04-19T18:13:17.000Z | 2021-06-04T10:09:05.000Z | tests/test_util.py | amedyukhina/intake_io | 1e14fecd76dfa615c82a18450548f6ac3817392f | [
"MIT"
] | 9 | 2020-12-03T00:05:32.000Z | 2021-07-20T17:05:09.000Z | tests/test_util.py | amedyukhina/intake_io | 1e14fecd76dfa615c82a18450548f6ac3817392f | [
"MIT"
] | 2 | 2020-12-02T21:55:49.000Z | 2021-03-19T18:30:16.000Z | import pytest
from intake_io.util import *
def test_get_axes():
with pytest.raises(ValueError):
get_axes(-1)
with pytest.raises(ValueError):
get_axes(0)
assert get_axes(1) == "x"
assert get_axes(2) == "yx"
assert get_axes(3) == "zyx"
assert get_axes(4) == "czyx"
assert get_axes(5) == "tczyx"
assert get_axes(6) == "itczyx"
with pytest.raises(ValueError):
get_axes(7)
with pytest.raises(ValueError):
get_axes(())
assert get_axes((1,)) == "x"
assert get_axes((1, 2)) == "yx"
assert get_axes((1, 2, 9)) == "zyx"
assert get_axes((1, 2, 3)) == "yxc"
assert get_axes((3, 1, 2)) == "cyx"
assert get_axes((1, 2, 3, 4)) == "czyx"
assert get_axes((1, 2, 3, 4, 5)) == "tczyx"
assert get_axes((1, 2, 3, 4, 5, 6)) == "itczyx"
with pytest.raises(ValueError):
get_axes((1, 2, 3, 4, 5, 6, 7))
with pytest.raises(ValueError):
get_axes(np.zeros((), np.uint8))
assert get_axes(np.zeros((1,), np.uint8)) == "x"
assert get_axes(np.zeros((1, 2), np.uint8)) == "yx"
assert get_axes(np.zeros((1, 2, 9), np.uint8)) == "zyx"
assert get_axes(np.zeros((1, 2, 3), np.uint8)) == "yxc"
assert get_axes(np.zeros((3, 1, 2), np.uint8)) == "cyx"
assert get_axes(np.zeros((1, 2, 3, 4), np.uint8)) == "czyx"
assert get_axes(np.zeros((1, 2, 3, 4, 5), np.uint8)) == "tczyx"
assert get_axes(np.zeros((1, 2, 3, 4, 5, 6), np.uint8)) == "itczyx"
with pytest.raises(ValueError):
get_axes(np.zeros((1, 2, 3, 4, 5, 6, 7), np.uint8))
with pytest.raises(ValueError):
get_axes(da.zeros((), np.uint8))
assert get_axes(da.zeros((1,), np.uint8)) == "x"
assert get_axes(da.zeros((1, 2), np.uint8)) == "yx"
assert get_axes(da.zeros((1, 2, 9), np.uint8)) == "zyx"
assert get_axes(da.zeros((1, 2, 3), np.uint8)) == "yxc"
assert get_axes(da.zeros((3, 1, 2), np.uint8)) == "cyx"
assert get_axes(da.zeros((1, 2, 3, 4), np.uint8)) == "czyx"
assert get_axes(da.zeros((1, 2, 3, 4, 5), np.uint8)) == "tczyx"
assert get_axes(da.zeros((1, 2, 3, 4, 5, 6), np.uint8)) == "itczyx"
with pytest.raises(ValueError):
get_axes(da.zeros((1, 2, 3, 4, 5, 6, 7), np.uint8))
with pytest.raises(ValueError):
get_axes(xr.DataArray((1, 2, 3)))
with pytest.raises(ValueError):
get_axes(xr.DataArray(np.zeros((1, 2, 3), np.uint8)))
with pytest.raises(ValueError):
get_axes(xr.DataArray(np.zeros((1,), np.uint8), dims=tuple("X")))
assert get_axes(xr.DataArray(np.zeros((1,), np.uint8), dims=tuple("x"))) == "x"
assert get_axes(xr.DataArray(np.zeros((1,), np.uint8), dims=tuple("z"))) == "z"
assert get_axes(xr.DataArray(np.zeros((1, 2), np.uint8), dims=tuple("yx"))) == "yx"
assert get_axes(xr.DataArray(np.zeros((1, 2, 3), np.uint8), dims=tuple("iyz"))) == "iyz"
assert get_axes(xr.Dataset({"1": xr.DataArray(np.zeros((1,), np.uint8), dims=tuple("x"))})) == "x"
assert get_axes(xr.Dataset({
"1": xr.DataArray(np.zeros((8,), np.uint8), dims=tuple("x")),
"2": xr.DataArray(np.zeros((16, 16, 8), np.uint8), dims=tuple("zyx"))
})) == "zyx"
assert get_axes(xr.Dataset({
"1": xr.DataArray(np.zeros((16, 16, 8), np.uint8), dims=tuple("zyx")),
"2": xr.DataArray(np.zeros((8,), np.uint8), dims=tuple("x"))
})) == "zyx"


def test_get_spacing():
assert get_spacing(xr.DataArray(np.zeros((8,)), dims=tuple("x"))) == (None,)
    assert get_spacing(xr.DataArray(np.zeros((8,)), dims=tuple("x")), "x") is None
assert get_spacing(xr.DataArray(np.zeros((8, 8, 8)), dims=tuple("zyx"))) == (None, None, None)
assert get_spacing(xr.DataArray(np.zeros((8, 8)), dims=tuple("cx"))) == (None,)
assert get_spacing(xr.DataArray(np.zeros((8, 8, 8, 8)), dims=tuple("czyx"))) == (None, None, None)
assert get_spacing(xr.DataArray(np.zeros((8, 8, 8)), dims=tuple("zyx"), coords={
"z": np.arange(8),
"x": np.arange(8) * 0.125
})) == (1, None, 0.125)
assert get_spacing(xr.Dataset({
"image": xr.DataArray(np.zeros((8, 8, 8)), dims=tuple("zyx"), coords={
"z": np.arange(8),
"x": np.arange(8) * 0.125
})
})) == (1, None, 0.125)
assert get_spacing(xr.Dataset({
"0": xr.DataArray(np.zeros((8, 8)), dims=tuple("yx")),
"image": xr.DataArray(np.zeros((8, 8, 8)), dims=tuple("zyx"), coords={
"z": np.arange(8),
"x": np.arange(8) * 0.125
})
})) == (1, None, 0.125)
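The assertions above pin down a simple rule: spacing is read off the step between the first two values of each axis coordinate, axes without coordinates yield None, and the channel axis "c" is skipped. A minimal sketch of that rule (this is not intake_io's implementation; `spacing_from_coords` is an illustrative name):

```python
import numpy as np
import xarray as xr

def spacing_from_coords(img: xr.DataArray):
    # Sketch of the spacing convention checked in test_get_spacing above.
    out = []
    for dim in img.dims:
        if dim == "c":  # channel axis carries no physical spacing
            continue
        if dim in img.coords:
            out.append(float(img.coords[dim][1] - img.coords[dim][0]))
        else:
            out.append(None)
    return tuple(out)

img = xr.DataArray(
    np.zeros((8, 8, 8)), dims=tuple("zyx"),
    coords={"z": np.arange(8), "x": np.arange(8) * 0.125},
)
```

With these coords the sketch reproduces the `(1, None, 0.125)` tuple asserted above.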


def test_to_xarray():
arr = xr.DataArray(np.zeros((8, 8, 8)), dims=tuple("zyx"), coords={
"z": np.arange(8),
"x": np.arange(8) * 0.125
})
with pytest.raises(ValueError):
to_xarray(xr.Dataset({"image": arr, "0": arr}))
assert isinstance(to_xarray(xr.Dataset({"image": arr})), xr.DataArray)
assert to_xarray(xr.Dataset({"image": arr})).shape == arr.shape
assert get_spacing(to_xarray(xr.Dataset({"image": arr}))) == get_spacing(arr)
arr = to_xarray(np.zeros((8, 2, 16, 32), np.uint8), (0.1, 0.2, 0.3), axes="tcyx", coords={"c": (1, 2)})
assert get_spacing(arr) == (0.1, 0.2, 0.3)
assert arr.coords["t"][1] == 0.1
assert arr.coords["c"][1] == 2
assert arr.coords["y"][1] == 0.2
assert arr.coords["x"][1] == 0.3
arr = to_xarray(np.zeros((8, 2, 16, 32), np.uint8), (0.2, 0.3), axes="tcyx", coords={"c": (1, 2)})
assert get_spacing(arr) == (None, 0.2, 0.3)
assert "t" not in arr.coords
assert arr.coords["c"][1] == 2
assert arr.coords["y"][1] == 0.2
assert arr.coords["x"][1] == 0.3
arr = to_xarray(np.zeros((8, 2, 16, 32), np.uint8), (0.2, 0.3), axes="tcyx")
assert get_spacing(arr) == (None, 0.2, 0.3)
assert "t" not in arr.coords
assert "c" not in arr.coords
assert arr.coords["y"][1] == 0.2
assert arr.coords["x"][1] == 0.3
arr = to_xarray(np.zeros((2, 8, 2, 8, 16, 32), np.uint8), (0.1, 0.2, 0.3), axes="itczyx", coords={"c": (1, 2)})
assert get_spacing(arr) == (None, 0.1, 0.2, 0.3)
for i in "it":
assert i not in arr.coords
assert arr.coords["c"][1] == 2
assert arr.coords["z"][1] == 0.1
assert arr.coords["y"][1] == 0.2
assert arr.coords["x"][1] == 0.3
arr = to_xarray(np.zeros((2, 8, 2, 8, 16, 32), np.uint8), (0.1, 0.4, 0.2, 0.3), axes="itczyx", coords={"c": (1, 2)})
assert get_spacing(arr) == (0.1, 0.4, 0.2, 0.3)
assert "i" not in arr.coords
assert arr.coords["t"][1] == 0.1
assert arr.coords["c"][1] == 2
assert arr.coords["z"][1] == 0.4
assert arr.coords["y"][1] == 0.2
assert arr.coords["x"][1] == 0.3
# sympy/integrals/rubi/rubi_tests/tests/test_1_3.py (iamabhishek0/sympy, BSD-3-Clause license)
import sys
from sympy.external import import_module
matchpy = import_module("matchpy")
if not matchpy:
    # bin/test will not execute any tests now
    disabled = True
if sys.version_info[:2] < (3, 6):
disabled = True
from sympy.integrals.rubi.rubi import rubi_integrate
from sympy.functions import log, sqrt, exp, cos, sin, tan, sec, csc, cot
from sympy.functions.elementary.hyperbolic import atanh as arctanh
from sympy.functions.elementary.hyperbolic import asinh as arcsinh
from sympy.functions.elementary.hyperbolic import acosh as arccosh
from sympy.functions.elementary.trigonometric import atan as arctan
from sympy.functions.elementary.trigonometric import asin as arcsin
from sympy.functions.elementary.trigonometric import acos as arccos
from sympy.integrals.rubi.utility_function import EllipticE, EllipticF, hypergeom, rubi_test
from sympy import pi as Pi
from sympy import S, hyper, I, simplify, exp_polar, symbols
from sympy.utilities.pytest import slow, skip, ON_TRAVIS
A, B, C, D, a, b, c, d, e, f, m, n, p, x, u = symbols('A B C D a b c d e f m n p x u', real=True, imaginary=False)


@slow
def test_1():
if ON_TRAVIS:
skip('Too slow for travis.')
test = [
[x**S(2)*(a + b*x)*(a*c - b*c*x)**S(3), x, S(2), S(1)/S(3)*a**S(4)*c**S(3)*x**S(3) - S(1)/S(2)*a**S(3)*b*c**S(3)*x**S(4) + S(1)/S(3)*a*b**S(3)*c**S(3)*x**S(6) - S(1)/S(7)*b**S(4)*c**S(3)*x**S(7)],
[x*(a + b*x)*(a*c - b*c*x)**S(3), x, S(2), S(1)/S(2)*a**S(4)*c**S(3)*x**S(2) - S(2)/S(3)*a**S(3)*b*c**S(3)*x**S(3) + S(2)/S(5)*a*b**S(3)*c**S(3)*x**S(5) - S(1)/S(6)*b**S(4)*c**S(3)*x**S(6)],
[x**S(3)*(a + b*x)*(A + B*x), x, S(2), S(1)/S(4)*a*A*x**S(4) + S(1)/S(5)*(A*b + a*B)*x**S(5) + S(1)/S(6)*b*B*x**S(6)],
[x**S(4)*(A + B*x)/(a + b*x), x, S(2), - a**S(3)*(A*b - a*B)*x/b**S(5) + S(1)/S(2)*a**S(2)*(A*b - a*B)*x**S(2)/b**S(4) - S(1)/S(3)*a*(A*b - a*B)*x**S(3)/b**S(3) + S(1)/S(4)*(A*b - a*B)*x**S(4)/b**S(2) + S(1)/S(5)*B*x**S(5)/b + a**S(4)*(A*b - a*B)*log(a + b*x)/b**S(6)],
[x**S(2)*(c + d*x)/(a + b*x), x, S(2), - a*(b*c - a*d)*x/b**S(3) + S(1)/S(2)*(b*c - a*d)*x**S(2)/b**S(2) + S(1)/S(3)*d*x**S(3)/b + a**S(2)*(b*c - a*d)*log(a + b*x)/b**S(4)],
[x**S(3)*(c + d*x)**S(2)/(a + b*x)**S(2), x, S(2), - S(2)*a*(b*c - S(2)*a*d)*(b*c - a*d)*x/b**S(5) + S(1)/S(2)*(b*c - S(3)*a*d)*(b*c - a*d)*x**S(2)/b**S(4) + S(2)/S(3)*d*(b*c - a*d)*x**S(3)/b**S(3) + S(1)/S(4)*d**S(2)*x**S(4)/b**S(2) + a**S(3)*(b*c - a*d)**S(2)/(b**S(6)*(a + b*x)) + a**S(2)*(S(3)*b*c - S(5)*a*d)*(b*c - a*d)*log(a + b*x)/b**S(6)],
[x**S(2)*(c + d*x)**S(3)/(a + b*x)**S(3), x, S(2), S(3)*d*(b*c - S(2)*a*d)*(b*c - a*d)*x/b**S(5) + S(3)/S(2)*d**S(2)*(b*c - a*d)*x**S(2)/b**S(4) + S(1)/S(3)*d**S(3)*x**S(3)/b**S(3) - S(1)/S(2)*a**S(2)*(b*c - a*d)**S(3)/(b**S(6)*(a + b*x)**S(2)) + a*(S(2)*b*c - S(5)*a*d)*(b*c - a*d)**S(2)/(b**S(6)*(a + b*x)) + (b*c - a*d)*(b**S(2)*c**S(2) - S(8)*a*b*c*d + S(10)*a**S(2)*d**S(2))*log(a + b*x)/b**S(6)],
[x**(S(5)/S(2))*(A + B*x)/(a + b*x), x, S(6), - S(2)/S(3)*a*(A*b - a*B)*x**(S(3)/S(2))/b**S(3) + S(2)/S(5)*(A*b - a*B)*x**(S(5)/S(2))/b**S(2) + S(2)/S(7)*B*x**(S(7)/S(2))/b - S(2)*a**(S(5)/S(2))*(A*b - a*B)*arctan(sqrt(b)*sqrt(x)/sqrt(a))/b**(S(9)/S(2)) + S(2)*a**S(2)*(A*b - a*B)*sqrt(x)/b**S(4)],
[x**m*(a + b*x)**S(3)*(A + B*x), x, S(2), a**S(3)*A*x**(S(1) + m)/(S(1) + m) + a**S(2)*(S(3)*A*b + a*B)*x**(S(2) + m)/(S(2) + m) + S(3)*a*b*(A*b + a*B)*x**(S(3) + m)/(S(3) + m) + b**S(2)*(A*b + S(3)*a*B)*x**(S(4) + m)/(S(4) + m) + b**S(3)*B*x**(S(5) + m)/(S(5) + m)],
[x**m*(c + d*x)**S(3)/(a + b*x), x, S(7), d*(S(3)*b**S(2)*c**S(2) - S(3)*a*b*c*d + a**S(2)*d**S(2))*x**(S(1) + m)/(b**S(3)*(S(1) + m)) + d**S(2)*(S(3)*b*c - a*d)*x**(S(2) + m)/(b**S(2)*(S(2) + m)) + d**S(3)*x**(S(3) + m)/(b*(S(3) + m)) + (b*c - a*d)**S(3)*x**(S(1) + m)*hypergeom([S(1), S(1)], [S(1) - m], a/(a + b*x))/(b**S(3)*m*(a + b*x)), c**S(2)*d*x**(S(1) + m)/(b*(S(1) + m)) + c*d*(b*c - a*d)*x**(S(1) + m)/(b**S(2)*(S(1) + m)) + d*(b*c - a*d)**S(2)*x**(S(1) + m)/(b**S(3)*(S(1) + m)) + S(2)*c*d**S(2)*x**(S(2) + m)/(b*(S(2) + m)) + d**S(2)*(b*c - a*d)*x**(S(2) + m)/(b**S(2)*(S(2) + m)) + d**S(3)*x**(S(3) + m)/(b*(S(3) + m)) + (b*c - a*d)**S(3)*x**(S(1) + m)*hypergeom([S(1), S(1) + m], [S(2) + m], - b*x/a)/(a*b**S(3)*(S(1) + m))],
[x**m*(c + d*x)**S(2)/(a + b*x), x, S(5), c*d*x**(S(1) + m)/(b*(S(1) + m)) + d*(b*c - a*d)*x**(S(1) + m)/(b**S(2)*(S(1) + m)) + d**S(2)*x**(S(2) + m)/(b*(S(2) + m)) + (b*c - a*d)**S(2)*x**(S(1) + m)*hypergeom([S(1), S(1) + m], [S(2) + m], - b*x/a)/(a*b**S(2)*(S(1) + m))],
[b**S(2)*x**m/(b + a*x**S(2))**S(2), x, S(2), x**(S(1) + m)*hypergeom([S(2), S(1)/S(2)*(S(1) + m)], [S(1)/S(2)*(S(3) + m)], - a*x**S(2)/b)/(S(1) + m)],
[x**m/((S(1) - x*sqrt(a)/sqrt( - b))**S(2)*(S(1) + x*sqrt(a)/sqrt( - b))**S(2)), x, S(2), x**(S(1) + m)*hypergeom([S(2), S(1)/S(2)*(S(1) + m)], [S(1)/S(2)*(S(3) + m)], - a*x**S(2)/b)/(S(1) + m)],
[x**S(3)*(A + B*x)*sqrt(a + b*x), x, S(2), - S(2)/S(3)*a**S(3)*(A*b - a*B)*(a + b*x)**(S(3)/S(2))/b**S(5) + S(2)/S(5)*a**S(2)*(S(3)*A*b - S(4)*a*B)*(a + b*x)**(S(5)/S(2))/b**S(5) - S(6)/S(7)*a*(A*b - S(2)*a*B)*(a + b*x)**(S(7)/S(2))/b**S(5) + S(2)/S(9)*(A*b - S(4)*a*B)*(a + b*x)**(S(9)/S(2))/b**S(5) + S(2)/S(11)*B*(a + b*x)**(S(11)/S(2))/b**S(5)],
[x**S(3)*(A + B*x)/sqrt(a + b*x), x, S(2), S(2)/S(3)*a**S(2)*(S(3)*A*b - S(4)*a*B)*(a + b*x)**(S(3)/S(2))/b**S(5) - S(6)/S(5)*a*(A*b - S(2)*a*B)*(a + b*x)**(S(5)/S(2))/b**S(5) + S(2)/S(7)*(A*b - S(4)*a*B)*(a + b*x)**(S(7)/S(2))/b**S(5) + S(2)/S(9)*B*(a + b*x)**(S(9)/S(2))/b**S(5) - S(2)*a**S(3)*(A*b - a*B)*sqrt(a + b*x)/b**S(5)],
[x**(S(5)/S(2))*(A + B*x)*sqrt(a + b*x), x, S(7), S(1)/S(5)*B*x**(S(7)/S(2))*(a + b*x)**(S(3)/S(2))/b - S(1)/S(128)*a**S(4)*(S(10)*A*b - S(7)*a*B)*arctanh(sqrt(b)*sqrt(x)/sqrt(a + b*x))/b**(S(9)/S(2)) - S(1)/S(192)*a**S(2)*(S(10)*A*b - S(7)*a*B)*x**(S(3)/S(2))*sqrt(a + b*x)/b**S(3) + S(1)/S(240)*a*(S(10)*A*b - S(7)*a*B)*x**(S(5)/S(2))*sqrt(a + b*x)/b**S(2) + S(1)/S(40)*(S(10)*A*b - S(7)*a*B)*x**(S(7)/S(2))*sqrt(a + b*x)/b + S(1)/S(128)*a**S(3)*(S(10)*A*b - S(7)*a*B)*sqrt(x)*sqrt(a + b*x)/b**S(4)],
[x**(S(3)/S(2))*(A + B*x)*sqrt(a + b*x), x, S(6), S(1)/S(4)*B*x**(S(5)/S(2))*(a + b*x)**(S(3)/S(2))/b + S(1)/S(64)*a**S(3)*(S(8)*A*b - S(5)*a*B)*arctanh(sqrt(b)*sqrt(x)/sqrt(a + b*x))/b**(S(7)/S(2)) + S(1)/S(96)*a*(S(8)*A*b - S(5)*a*B)*x**(S(3)/S(2))*sqrt(a + b*x)/b**S(2) + S(1)/S(24)*(S(8)*A*b - S(5)*a*B)*x**(S(5)/S(2))*sqrt(a + b*x)/b - S(1)/S(64)*a**S(2)*(S(8)*A*b - S(5)*a*B)*sqrt(x)*sqrt(a + b*x)/b**S(3)],
[x**(S(7)/S(2))*(A + B*x)/sqrt(a + b*x), x, S(7), S(7)/S(128)*a**S(4)*(S(10)*A*b - S(9)*a*B)*arctanh(sqrt(b)*sqrt(x)/sqrt(a + b*x))/b**(S(11)/S(2)) + S(7)/S(192)*a**S(2)*(S(10)*A*b - S(9)*a*B)*x**(S(3)/S(2))*sqrt(a + b*x)/b**S(4) - S(7)/S(240)*a*(S(10)*A*b - S(9)*a*B)*x**(S(5)/S(2))*sqrt(a + b*x)/b**S(3) + S(1)/S(40)*(S(10)*A*b - S(9)*a*B)*x**(S(7)/S(2))*sqrt(a + b*x)/b**S(2) + S(1)/S(5)*B*x**(S(9)/S(2))*sqrt(a + b*x)/b - S(7)/S(128)*a**S(3)*(S(10)*A*b - S(9)*a*B)*sqrt(x)*sqrt(a + b*x)/b**S(5)],
[x**(S(5)/S(2))*(A + B*x)/sqrt(a + b*x), x, S(6), - S(5)/S(64)*a**S(3)*(S(8)*A*b - S(7)*a*B)*arctanh(sqrt(b)*sqrt(x)/sqrt(a + b*x))/b**(S(9)/S(2)) - S(5)/S(96)*a*(S(8)*A*b - S(7)*a*B)*x**(S(3)/S(2))*sqrt(a + b*x)/b**S(3) + S(1)/S(24)*(S(8)*A*b - S(7)*a*B)*x**(S(5)/S(2))*sqrt(a + b*x)/b**S(2) + S(1)/S(4)*B*x**(S(7)/S(2))*sqrt(a + b*x)/b + S(5)/S(64)*a**S(2)*(S(8)*A*b - S(7)*a*B)*sqrt(x)*sqrt(a + b*x)/b**S(4)],
[x**S(3)*sqrt(a + b*x)*sqrt(c + d*x), x, S(6), S(1)/S(5)*x**S(2)*(a + b*x)**(S(3)/S(2))*(c + d*x)**(S(3)/S(2))/(b*d) + S(1)/S(240)*(a + b*x)**(S(3)/S(2))*(c + d*x)**(S(3)/S(2))*(S(35)*b**S(2)*c**S(2) + S(38)*a*b*c*d + S(35)*a**S(2)*d**S(2) - S(42)*b*d*(b*c + a*d)*x)/(b**S(3)*d**S(3)) + S(1)/S(128)*(b*c - a*d)**S(2)*(b*c + a*d)*(S(7)*b**S(2)*c**S(2) + S(2)*a*b*c*d + S(7)*a**S(2)*d**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(9)/S(2))*d**(S(9)/S(2))) - S(1)/S(64)*(b*c + a*d)*(S(7)*b**S(2)*c**S(2) + S(2)*a*b*c*d + S(7)*a**S(2)*d**S(2))*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b**S(4)*d**S(3)) - S(1)/S(128)*(S(7)*b**S(4)*c**S(4) + S(2)*a*b**S(3)*c**S(3)*d - S(2)*a**S(3)*b*c*d**S(3) - S(7)*a**S(4)*d**S(4))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(4)*d**S(4))],
[x**S(2)*sqrt(a + b*x)*sqrt(c + d*x), x, S(6), - S(5)/S(24)*(b*c + a*d)*(a + b*x)**(S(3)/S(2))*(c + d*x)**(S(3)/S(2))/(b**S(2)*d**S(2)) + S(1)/S(4)*x*(a + b*x)**(S(3)/S(2))*(c + d*x)**(S(3)/S(2))/(b*d) + S(1)/S(64)*(b*c - a*d)**S(2)*(S(4)*a*b*c*d - S(5)*(b*c + a*d)**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(7)/S(2))*d**(S(7)/S(2))) - S(1)/S(32)*(S(4)*a*b*c*d - S(5)*(b*c + a*d)**S(2))*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b**S(3)*d**S(2)) - S(1)/S(64)*(b*c - a*d)*(S(4)*a*b*c*d - S(5)*(b*c + a*d)**S(2))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(3)*d**S(3))],
[x**S(3)*sqrt(a + b*x)/sqrt(c + d*x), x, S(5), S(1)/S(64)*(b*c - a*d)*(S(35)*b**S(3)*c**S(3) + S(15)*a*b**S(2)*c**S(2)*d + S(9)*a**S(2)*b*c*d**S(2) + S(5)*a**S(3)*d**S(3))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(7)/S(2))*d**(S(9)/S(2))) + S(1)/S(4)*x**S(2)*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b*d) + S(1)/S(96)*(a + b*x)**(S(3)/S(2))*(S(35)*b**S(2)*c**S(2) + S(22)*a*b*c*d + S(15)*a**S(2)*d**S(2) - S(4)*b*d*(S(7)*b*c + S(5)*a*d)*x)*sqrt(c + d*x)/(b**S(3)*d**S(3)) - S(1)/S(64)*(S(35)*b**S(3)*c**S(3) + S(15)*a*b**S(2)*c**S(2)*d + S(9)*a**S(2)*b*c*d**S(2) + S(5)*a**S(3)*d**S(3))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(3)*d**S(4))],
[x**S(2)*sqrt(a + b*x)/sqrt(c + d*x), x, S(5), - S(1)/S(8)*(b*c - a*d)*(S(5)*b**S(2)*c**S(2) + S(2)*a*b*c*d + a**S(2)*d**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(5)/S(2))*d**(S(7)/S(2))) - S(1)/S(12)*(S(5)*b*c + S(3)*a*d)*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b**S(2)*d**S(2)) + S(1)/S(3)*x*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b*d) + S(1)/S(8)*(S(5)*b**S(2)*c**S(2) + S(2)*a*b*c*d + a**S(2)*d**S(2))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(2)*d**S(3))],
[x**S(2)*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x), x, S(7), - S(1)/S(40)*(S(7)*b*c + S(5)*a*d)*(a + b*x)**(S(5)/S(2))*(c + d*x)**(S(3)/S(2))/(b**S(2)*d**S(2)) + S(1)/S(5)*x*(a + b*x)**(S(5)/S(2))*(c + d*x)**(S(3)/S(2))/(b*d) + S(1)/S(128)*(b*c - a*d)**S(3)*(S(7)*b**S(2)*c**S(2) + S(6)*a*b*c*d + S(3)*a**S(2)*d**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(7)/S(2))*d**(S(9)/S(2))) + S(1)/S(192)*(b*c - a*d)*(S(7)*b**S(2)*c**S(2) + S(6)*a*b*c*d + S(3)*a**S(2)*d**S(2))*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b**S(3)*d**S(3)) + S(1)/S(48)*(S(7)*b**S(2)*c**S(2) + S(6)*a*b*c*d + S(3)*a**S(2)*d**S(2))*(a + b*x)**(S(5)/S(2))*sqrt(c + d*x)/(b**S(3)*d**S(2)) - S(1)/S(128)*(b*c - a*d)**S(2)*(S(7)*b**S(2)*c**S(2) + S(6)*a*b*c*d + S(3)*a**S(2)*d**S(2))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(3)*d**S(4))],
[x*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x), x, S(6), S(1)/S(4)*(a + b*x)**(S(5)/S(2))*(c + d*x)**(S(3)/S(2))/(b*d) - S(1)/S(64)*(b*c - a*d)**S(3)*(S(5)*b*c + S(3)*a*d)*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(5)/S(2))*d**(S(7)/S(2))) - S(1)/S(96)*(b*c - a*d)*(S(5)*b*c + S(3)*a*d)*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b**S(2)*d**S(2)) - S(1)/S(24)*(S(5)*b*c + S(3)*a*d)*(a + b*x)**(S(5)/S(2))*sqrt(c + d*x)/(b**S(2)*d) + S(1)/S(64)*(b*c - a*d)**S(2)*(S(5)*b*c + S(3)*a*d)*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(2)*d**S(3))],
[x**S(2)*(a + b*x)**(S(3)/S(2))/sqrt(c + d*x), x, S(6), S(1)/S(64)*(b*c - a*d)**S(2)*(S(35)*b**S(2)*c**S(2) + S(10)*a*b*c*d + S(3)*a**S(2)*d**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(5)/S(2))*d**(S(9)/S(2))) + S(1)/S(96)*(S(35)*b**S(2)*c**S(2) + S(10)*a*b*c*d + S(3)*a**S(2)*d**S(2))*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b**S(2)*d**S(3)) - S(1)/S(24)*(S(7)*b*c + S(3)*a*d)*(a + b*x)**(S(5)/S(2))*sqrt(c + d*x)/(b**S(2)*d**S(2)) + S(1)/S(4)*x*(a + b*x)**(S(5)/S(2))*sqrt(c + d*x)/(b*d) - S(1)/S(64)*(b*c - a*d)*(S(35)*b**S(2)*c**S(2) + S(10)*a*b*c*d + S(3)*a**S(2)*d**S(2))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(2)*d**S(4))],
[x*(a + b*x)**(S(3)/S(2))/sqrt(c + d*x), x, S(5), - S(1)/S(8)*(b*c - a*d)**S(2)*(S(5)*b*c + a*d)*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(3)/S(2))*d**(S(7)/S(2))) - S(1)/S(12)*(S(5)*b*c + a*d)*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b*d**S(2)) + S(1)/S(3)*(a + b*x)**(S(5)/S(2))*sqrt(c + d*x)/(b*d) + S(1)/S(8)*(b*c - a*d)*(S(5)*b*c + a*d)*sqrt(a + b*x)*sqrt(c + d*x)/(b*d**S(3))],
[x**S(2)*(a + b*x)**(S(5)/S(2))*sqrt(c + d*x), x, S(8), - S(1)/S(60)*(S(9)*b*c + S(5)*a*d)*(a + b*x)**(S(7)/S(2))*(c + d*x)**(S(3)/S(2))/(b**S(2)*d**S(2)) + S(1)/S(6)*x*(a + b*x)**(S(7)/S(2))*(c + d*x)**(S(3)/S(2))/(b*d) - S(1)/S(512)*(b*c - a*d)**S(4)*(S(21)*b**S(2)*c**S(2) + S(14)*a*b*c*d + S(5)*a**S(2)*d**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(7)/S(2))*d**(S(11)/S(2))) - S(1)/S(768)*(b*c - a*d)**S(2)*(S(21)*b**S(2)*c**S(2) + S(14)*a*b*c*d + S(5)*a**S(2)*d**S(2))*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b**S(3)*d**S(4)) + S(1)/S(960)*(b*c - a*d)*(S(21)*b**S(2)*c**S(2) + S(14)*a*b*c*d + S(5)*a**S(2)*d**S(2))*(a + b*x)**(S(5)/S(2))*sqrt(c + d*x)/(b**S(3)*d**S(3)) + S(1)/S(160)*(S(21)*b**S(2)*c**S(2) + S(14)*a*b*c*d + S(5)*a**S(2)*d**S(2))*(a + b*x)**(S(7)/S(2))*sqrt(c + d*x)/(b**S(3)*d**S(2)) + S(1)/S(512)*(b*c - a*d)**S(3)*(S(21)*b**S(2)*c**S(2) + S(14)*a*b*c*d + S(5)*a**S(2)*d**S(2))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(3)*d**S(5))],
[x*(a + b*x)**(S(5)/S(2))*sqrt(c + d*x), x, S(7), S(1)/S(5)*(a + b*x)**(S(7)/S(2))*(c + d*x)**(S(3)/S(2))/(b*d) + S(1)/S(128)*(b*c - a*d)**S(4)*(S(7)*b*c + S(3)*a*d)*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(5)/S(2))*d**(S(9)/S(2))) + S(1)/S(192)*(b*c - a*d)**S(2)*(S(7)*b*c + S(3)*a*d)*(a + b*x)**(S(3)/S(2))*sqrt(c + d*x)/(b**S(2)*d**S(3)) - S(1)/S(240)*(b*c - a*d)*(S(7)*b*c + S(3)*a*d)*(a + b*x)**(S(5)/S(2))*sqrt(c + d*x)/(b**S(2)*d**S(2)) - S(1)/S(40)*(S(7)*b*c + S(3)*a*d)*(a + b*x)**(S(7)/S(2))*sqrt(c + d*x)/(b**S(2)*d) - S(1)/S(128)*(b*c - a*d)**S(3)*(S(7)*b*c + S(3)*a*d)*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(2)*d**S(4))],
[x**S(2)*sqrt(c + d*x)/sqrt(a + b*x), x, S(5), S(1)/S(8)*(b*c - a*d)*(b**S(2)*c**S(2) + S(2)*a*b*c*d + S(5)*a**S(2)*d**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(7)/S(2))*d**(S(5)/S(2))) - S(1)/S(12)*(S(3)*b*c + S(5)*a*d)*(c + d*x)**(S(3)/S(2))*sqrt(a + b*x)/(b**S(2)*d**S(2)) + S(1)/S(3)*x*(c + d*x)**(S(3)/S(2))*sqrt(a + b*x)/(b*d) + S(1)/S(8)*(b**S(2)*c**S(2) + S(2)*a*b*c*d + S(5)*a**S(2)*d**S(2))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(3)*d**S(2))],
[x*sqrt(c + d*x)/sqrt(a + b*x), x, S(4), - S(1)/S(4)*(b*c - a*d)*(b*c + S(3)*a*d)*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(5)/S(2))*d**(S(3)/S(2))) + S(1)/S(2)*(c + d*x)**(S(3)/S(2))*sqrt(a + b*x)/(b*d) - S(1)/S(4)*(b*c + S(3)*a*d)*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(2)*d)],
[x**S(3)/(sqrt(a + b*x)*sqrt(c + d*x)), x, S(4), - S(1)/S(8)*(b*c + a*d)*(S(5)*b**S(2)*c**S(2) - S(2)*a*b*c*d + S(5)*a**S(2)*d**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(7)/S(2))*d**(S(7)/S(2))) + S(1)/S(3)*x**S(2)*sqrt(a + b*x)*sqrt(c + d*x)/(b*d) + S(1)/S(24)*(S(15)*b**S(2)*c**S(2) + S(14)*a*b*c*d + S(15)*a**S(2)*d**S(2) - S(10)*b*d*(b*c + a*d)*x)*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(3)*d**S(3))],
[x**S(2)/(sqrt(a + b*x)*sqrt(c + d*x)), x, S(4), - S(1)/S(4)*(S(4)*a*b*c*d - S(3)*(b*c + a*d)**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(5)/S(2))*d**(S(5)/S(2))) - S(3)/S(4)*(b*c + a*d)*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(2)*d**S(2)) + S(1)/S(2)*x*sqrt(a + b*x)*sqrt(c + d*x)/(b*d)],
[x**S(4)/((a + b*x)**(S(3)/S(2))*(c + d*x)**(S(3)/S(2))), x, S(5), S(3)/S(4)*(S(5)*b**S(2)*c**S(2) + S(6)*a*b*c*d + S(5)*a**S(2)*d**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(7)/S(2))*d**(S(7)/S(2))) + S(2)*a*x**S(3)/(b*(b*c - a*d)*sqrt(a + b*x)*sqrt(c + d*x)) - S(2)*c*(b*c + a*d)*x**S(2)*sqrt(a + b*x)/(b*d*(b*c - a*d)**S(2)*sqrt(c + d*x)) - S(1)/S(4)*((b*c + a*d)*(S(15)*b**S(2)*c**S(2) - S(22)*a*b*c*d + S(15)*a**S(2)*d**S(2)) - S(2)*b*d*(S(5)*b**S(2)*c**S(2) - S(2)*a*b*c*d + S(5)*a**S(2)*d**S(2))*x)*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(3)*d**S(3)*(b*c - a*d)**S(2))],
[x**S(3)/((a + b*x)**(S(3)/S(2))*(c + d*x)**(S(3)/S(2))), x, S(4), - S(3)*(b*c + a*d)*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(5)/S(2))*d**(S(5)/S(2))) + S(2)*a*x**S(2)/(b*(b*c - a*d)*sqrt(a + b*x)*sqrt(c + d*x)) + (c*(S(3)*b**S(2)*c**S(2) - S(2)*a*b*c*d + S(3)*a**S(2)*d**S(2)) + d*(b*c - S(3)*a*d)*(b*c - a*d)*x)*sqrt(a + b*x)/(b**S(2)*d**S(2)*(b*c - a*d)**S(2)*sqrt(c + d*x))],
[x**S(3)*(a + b*x)**(S(1)/S(4))/(c + d*x)**(S(1)/S(4)), x, S(7), - S(1)/S(512)*(S(195)*b**S(3)*c**S(3) + S(135)*a*b**S(2)*c**S(2)*d + S(105)*a**S(2)*b*c*d**S(2) + S(77)*a**S(3)*d**S(3))*(a + b*x)**(S(1)/S(4))*(c + d*x)**(S(3)/S(4))/(b**S(3)*d**S(4)) + S(1)/S(4)*x**S(2)*(a + b*x)**(S(5)/S(4))*(c + d*x)**(S(3)/S(4))/(b*d) + S(1)/S(384)*(a + b*x)**(S(5)/S(4))*(c + d*x)**(S(3)/S(4))*(S(117)*b**S(2)*c**S(2) + S(94)*a*b*c*d + S(77)*a**S(2)*d**S(2) - S(8)*b*d*(S(13)*b*c + S(11)*a*d)*x)/(b**S(3)*d**S(3)) + S(1)/S(1024)*(b*c - a*d)*(S(195)*b**S(3)*c**S(3) + S(135)*a*b**S(2)*c**S(2)*d + S(105)*a**S(2)*b*c*d**S(2) + S(77)*a**S(3)*d**S(3))*arctan(d**(S(1)/S(4))*(a + b*x)**(S(1)/S(4))/(b**(S(1)/S(4))*(c + d*x)**(S(1)/S(4))))/(b**(S(15)/S(4))*d**(S(17)/S(4))) + S(1)/S(1024)*(b*c - a*d)*(S(195)*b**S(3)*c**S(3) + S(135)*a*b**S(2)*c**S(2)*d + S(105)*a**S(2)*b*c*d**S(2) + S(77)*a**S(3)*d**S(3))*arctanh(d**(S(1)/S(4))*(a + b*x)**(S(1)/S(4))/(b**(S(1)/S(4))*(c + d*x)**(S(1)/S(4))))/(b**(S(15)/S(4))*d**(S(17)/S(4)))],
[x**S(2)*(a + b*x)**(S(1)/S(4))/(c + d*x)**(S(1)/S(4)), x, S(7), S(1)/S(32)*(S(15)*b**S(2)*c**S(2) + S(10)*a*b*c*d + S(7)*a**S(2)*d**S(2))*(a + b*x)**(S(1)/S(4))*(c + d*x)**(S(3)/S(4))/(b**S(2)*d**S(3)) - S(1)/S(24)*(S(9)*b*c + S(7)*a*d)*(a + b*x)**(S(5)/S(4))*(c + d*x)**(S(3)/S(4))/(b**S(2)*d**S(2)) + S(1)/S(3)*x*(a + b*x)**(S(5)/S(4))*(c + d*x)**(S(3)/S(4))/(b*d) - S(1)/S(64)*(b*c - a*d)*(S(15)*b**S(2)*c**S(2) + S(10)*a*b*c*d + S(7)*a**S(2)*d**S(2))*arctan(d**(S(1)/S(4))*(a + b*x)**(S(1)/S(4))/(b**(S(1)/S(4))*(c + d*x)**(S(1)/S(4))))/(b**(S(11)/S(4))*d**(S(13)/S(4))) - S(1)/S(64)*(b*c - a*d)*(S(15)*b**S(2)*c**S(2) + S(10)*a*b*c*d + S(7)*a**S(2)*d**S(2))*arctanh(d**(S(1)/S(4))*(a + b*x)**(S(1)/S(4))/(b**(S(1)/S(4))*(c + d*x)**(S(1)/S(4))))/(b**(S(11)/S(4))*d**(S(13)/S(4)))],
[x*(a + b*x)**n*(c + d*x), x, S(2), - a*(b*c - a*d)*(a + b*x)**(S(1) + n)/(b**S(3)*(S(1) + n)) + (b*c - S(2)*a*d)*(a + b*x)**(S(2) + n)/(b**S(3)*(S(2) + n)) + d*(a + b*x)**(S(3) + n)/(b**S(3)*(S(3) + n))],
[x**S(2)*(a + b*x)**n/(c + d*x), x, S(3), - (b*c + a*d)*(a + b*x)**(S(1) + n)/(b**S(2)*d**S(2)*(S(1) + n)) + (a + b*x)**(S(2) + n)/(b**S(2)*d*(S(2) + n)) + c**S(2)*(a + b*x)**(S(1) + n)*hypergeom([S(1), S(1) + n], [S(2) + n], - d*(a + b*x)/(b*c - a*d))/(d**S(2)*(b*c - a*d)*(S(1) + n))],
[x*(a + b*x)**n/(c + d*x), x, S(2), (a + b*x)**(S(1) + n)/(b*d*(S(1) + n)) - c*(a + b*x)**(S(1) + n)*hypergeom([S(1), S(1) + n], [S(2) + n], - d*(a + b*x)/(b*c - a*d))/(d*(b*c - a*d)*(S(1) + n))],
[x**m*(S(3) - S(2)*a*x)**(S(2) + n)*(S(6) + S(4)*a*x)**n, x, S(8), S(2)**n*S(9)**(S(1) + n)*x**(S(1) + m)*hypergeom([S(1)/S(2)*(S(1) + m), - n], [S(1)/S(2)*(S(3) + m)], S(4)/S(9)*a**S(2)*x**S(2))/(S(1) + m) - S(2)**(S(2) + n)*S(3)**(S(1) + S(2)*n)*a*x**(S(2) + m)*hypergeom([S(1)/S(2)*(S(2) + m), - n], [S(1)/S(2)*(S(4) + m)], S(4)/S(9)*a**S(2)*x**S(2))/(S(2) + m) + S(2)**(S(2) + n)*S(9)**n*a**S(2)*x**(S(3) + m)*hypergeom([S(1)/S(2)*(S(3) + m), - n], [S(1)/S(2)*(S(5) + m)], S(4)/S(9)*a**S(2)*x**S(2))/(S(3) + m)],
[x**m*(S(3) - S(2)*a*x)**(S(1) + n)*(S(6) + S(4)*a*x)**n, x, S(5), S(2)**n*S(3)**(S(1) + S(2)*n)*x**(S(1) + m)*hypergeom([S(1)/S(2)*(S(1) + m), - n], [S(1)/S(2)*(S(3) + m)], S(4)/S(9)*a**S(2)*x**S(2))/(S(1) + m) - S(2)**(S(1) + n)*S(9)**n*a*x**(S(2) + m)*hypergeom([S(1)/S(2)*(S(2) + m), - n], [S(1)/S(2)*(S(4) + m)], S(4)/S(9)*a**S(2)*x**S(2))/(S(2) + m)],
[(a + b*x)*(A + B*x)*(d + e*x)**m, x, S(2), (b*d - a*e)*(B*d - A*e)*(d + e*x)**(S(1) + m)/(e**S(3)*(S(1) + m)) - (S(2)*b*B*d - A*b*e - a*B*e)*(d + e*x)**(S(2) + m)/(e**S(3)*(S(2) + m)) + b*B*(d + e*x)**(S(3) + m)/(e**S(3)*(S(3) + m))],
[(A + B*x)*(d + e*x)**S(5)/(a + b*x), x, S(2), (A*b - a*B)*e*(b*d - a*e)**S(4)*x/b**S(6) + S(1)/S(2)*(A*b - a*B)*(b*d - a*e)**S(3)*(d + e*x)**S(2)/b**S(5) + S(1)/S(3)*(A*b - a*B)*(b*d - a*e)**S(2)*(d + e*x)**S(3)/b**S(4) + S(1)/S(4)*(A*b - a*B)*(b*d - a*e)*(d + e*x)**S(4)/b**S(3) + S(1)/S(5)*(A*b - a*B)*(d + e*x)**S(5)/b**S(2) + S(1)/S(6)*B*(d + e*x)**S(6)/(b*e) + (A*b - a*B)*(b*d - a*e)**S(5)*log(a + b*x)/b**S(7)],
[(S(1) - S(2)*x)*(S(2) + S(3)*x)**m*(S(3) + S(5)*x), x, S(2), - S(7)/S(27)*(S(2) + S(3)*x)**(S(1) + m)/(S(1) + m) + S(37)/S(27)*(S(2) + S(3)*x)**(S(2) + m)/(S(2) + m) - S(10)/S(27)*(S(2) + S(3)*x)**(S(3) + m)/(S(3) + m)],
[(S(1) - S(2)*x)*(S(2) + S(3)*x)**S(8)*(S(3) + S(5)*x), x, S(2), - S(7)/S(243)*(S(2) + S(3)*x)**S(9) + S(37)/S(270)*(S(2) + S(3)*x)**S(10) - S(10)/S(297)*(S(2) + S(3)*x)**S(11)],
[(S(1) - S(2)*x)*(S(2) + S(3)*x)**m/(S(3) + S(5)*x), x, S(2), - S(2)/S(15)*(S(2) + S(3)*x)**(S(1) + m)/(S(1) + m) - S(11)/S(5)*(S(2) + S(3)*x)**(S(1) + m)*hypergeom([S(1), S(1) + m], [S(2) + m], S(5)*(S(2) + S(3)*x))/(S(1) + m)],
[(S(1) - S(2)*x)*(S(2) + S(3)*x)**S(6)/(S(3) + S(5)*x), x, S(2), S(1666663)/S(78125)*x + S(1777779)/S(31250)*x**S(2) + S(152469)/S(3125)*x**S(3) - S(152469)/S(2500)*x**S(4) - S(106677)/S(625)*x**S(5) - S(7047)/S(50)*x**S(6) - S(1458)/S(35)*x**S(7) + S(11)/S(390625)*log(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**S(2)*(S(2) + S(3)*x)**S(8)*(S(3) + S(5)*x), x, S(2), - S(49)/S(729)*(S(2) + S(3)*x)**S(9) + S(91)/S(270)*(S(2) + S(3)*x)**S(10) - S(16)/S(99)*(S(2) + S(3)*x)**S(11) + S(5)/S(243)*(S(2) + S(3)*x)**S(12)],
[(S(1) - S(2)*x)**S(2)*(S(2) + S(3)*x)**S(7)*(S(3) + S(5)*x), x, S(2), - S(49)/S(648)*(S(2) + S(3)*x)**S(8) + S(91)/S(243)*(S(2) + S(3)*x)**S(9) - S(8)/S(45)*(S(2) + S(3)*x)**S(10) + S(20)/S(891)*(S(2) + S(3)*x)**S(11)],
[(S(1) - S(2)*x)**S(2)*(S(2) + S(3)*x)**S(7)/(S(3) + S(5)*x), x, S(2), S(83333293)/S(1953125)*x + S(80555569)/S(781250)*x**S(2) + S(1327159)/S(78125)*x**S(3) - S(20577159)/S(62500)*x**S(4) - S(7315947)/S(15625)*x**S(5) + S(130383)/S(1250)*x**S(6) + S(672867)/S(875)*x**S(7) + S(16767)/S(25)*x**S(8) + S(972)/S(5)*x**S(9) + S(121)/S(9765625)*log(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**S(2)*(S(2) + S(3)*x)**S(6)/(S(3) + S(5)*x), x, S(2), S(8333293)/S(390625)*x + S(5555569)/S(156250)*x**S(2) - S(422841)/S(15625)*x**S(3) - S(1677159)/S(12500)*x**S(4) - S(228447)/S(3125)*x**S(5) + S(35883)/S(250)*x**S(6) + S(34992)/S(175)*x**S(7) + S(729)/S(10)*x**S(8) + S(121)/S(1953125)*log(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**S(3)*(S(2) + S(3)*x)**S(8)*(S(3) + S(5)*x), x, S(2), - S(343)/S(2187)*(S(2) + S(3)*x)**S(9) + S(2009)/S(2430)*(S(2) + S(3)*x)**S(10) - S(518)/S(891)*(S(2) + S(3)*x)**S(11) + S(107)/S(729)*(S(2) + S(3)*x)**S(12) - S(40)/S(3159)*(S(2) + S(3)*x)**S(13)],
[(S(1) - S(2)*x)**S(3)*(S(2) + S(3)*x)**S(7)*(S(3) + S(5)*x), x, S(2), S(384)*x + S(1184)*x**S(2) + S(480)*x**S(3) - S(5148)*x**S(4) - S(48968)/S(5)*x**S(5) + S(3514)*x**S(6) + S(29106)*x**S(7) + S(208035)/S(8)*x**S(8) - S(15507)*x**S(9) - S(217971)/S(5)*x**S(10) - S(329508)/S(11)*x**S(11) - S(7290)*x**S(12)],
[(S(1) - S(2)*x)**S(3)*(S(2) + S(3)*x)**S(6)/(S(3) + S(5)*x), x, S(2), S(41666223)/S(1953125)*x + S(11111259)/S(781250)*x**S(2) - S(17453753)/S(234375)*x**S(3) - S(5848749)/S(62500)*x**S(4) + S(2212083)/S(15625)*x**S(5) + S(331713)/S(1250)*x**S(6) - S(40338)/S(875)*x**S(7) - S(13851)/S(50)*x**S(8) - S(648)/S(5)*x**S(9) + S(1331)/S(9765625)*log(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**S(3)*(S(2) + S(3)*x)**S(5)/(S(3) + S(5)*x), x, S(2), S(4166223)/S(390625)*x - S(138741)/S(156250)*x**S(2) - S(1703753)/S(46875)*x**S(3) - S(73749)/S(12500)*x**S(4) + S(243333)/S(3125)*x**S(5) + S(4419)/S(125)*x**S(6) - S(11988)/S(175)*x**S(7) - S(243)/S(5)*x**S(8) + S(1331)/S(1953125)*log(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**m*(S(3) + S(5)*x)/(S(1) - S(2)*x), x, S(2), - S(5)/S(6)*(S(2) + S(3)*x)**(S(1) + m)/(S(1) + m) + S(11)/S(14)*(S(2) + S(3)*x)**(S(1) + m)*hypergeom([S(1), S(1) + m], [S(2) + m], S(2)/S(7)*(S(2) + S(3)*x))/(S(1) + m)],
[(S(2) + S(3)*x)**S(8)*(S(3) + S(5)*x)/(S(1) - S(2)*x), x, S(2), - S(63019595)/S(512)*x - S(60332619)/S(512)*x**S(2) - S(17391129)/S(128)*x**S(3) - S(37722699)/S(256)*x**S(4) - S(21272139)/S(160)*x**S(5) - S(2929689)/S(32)*x**S(6) - S(353565)/S(8)*x**S(7) - S(422091)/S(32)*x**S(8) - S(3645)/S(2)*x**S(9) - S(63412811)/S(1024)*log(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**m/((S(1) - S(2)*x)*(S(3) + S(5)*x)), x, S(3), S(2)/S(77)*(S(2) + S(3)*x)**(S(1) + m)*hypergeom([S(1), S(1) + m], [S(2) + m], S(2)/S(7)*(S(2) + S(3)*x))/(S(1) + m) - S(5)/S(11)*(S(2) + S(3)*x)**(S(1) + m)*hypergeom([S(1), S(1) + m], [S(2) + m], S(5)*(S(2) + S(3)*x))/(S(1) + m)],
[(S(2) + S(3)*x)**S(8)*(S(3) + S(5)*x)/(S(1) - S(2)*x)**S(2), x, S(2), S(63412811)/S(1024)/(S(1) - S(2)*x) + S(91609881)/S(256)*x + S(122887143)/S(512)*x**S(2) + S(5892813)/S(32)*x**S(3) + S(32991057)/S(256)*x**S(4) + S(5859459)/S(80)*x**S(5) + S(976617)/S(32)*x**S(6) + S(56862)/S(7)*x**S(7) + S(32805)/S(32)*x**S(8) + S(246239357)/S(1024)*log(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**S(7)*(S(3) + S(5)*x)/(S(1) - S(2)*x)**S(2), x, S(2), S(9058973)/S(512)/(S(1) - S(2)*x) + S(22333965)/S(256)*x + S(873207)/S(16)*x**S(2) + S(2399985)/S(64)*x**S(3) + S(1423899)/S(64)*x**S(4) + S(793881)/S(80)*x**S(5) + S(11421)/S(4)*x**S(6) + S(10935)/S(28)*x**S(7) + S(15647317)/S(256)*log(S(1) - S(2)*x)],
[(a + b*x)**m/(e + f*x)**S(2), x, S(1), b*(a + b*x)**(S(1) + m)*hypergeom([S(2), S(1) + m], [S(2) + m], - f*(a + b*x)/(b*e - a*f))/((b*e - a*f)**S(2)*(S(1) + m))],
[(a + b*x)**m/((c + d*x)*(e + f*x)**S(2)), x, S(4), - f*(a + b*x)**(S(1) + m)/((b*e - a*f)*(d*e - c*f)*(e + f*x)) + d**S(2)*(a + b*x)**(S(1) + m)*hypergeom([S(1), S(1) + m], [S(2) + m], - d*(a + b*x)/(b*c - a*d))/((b*c - a*d)*(d*e - c*f)**S(2)*(S(1) + m)) + f*(a*d*f - b*(d*e*(S(1) - m) + c*f*m))*(a + b*x)**(S(1) + m)*hypergeom([S(1), S(1) + m], [S(2) + m], - f*(a + b*x)/(b*e - a*f))/((b*e - a*f)**S(2)*(d*e - c*f)**S(2)*(S(1) + m))],
[(S(2) + S(3)*x)**S(7)*(S(3) + S(5)*x)/(S(1) - S(2)*x)**S(3), x, S(2), S(9058973)/S(1024)/(S(1) - S(2)*x)**S(2) + ( - S(15647317)/S(256))/(S(1) - S(2)*x) - S(24960933)/S(256)*x - S(10989621)/S(256)*x**S(2) - S(631611)/S(32)*x**S(3) - S(235467)/S(32)*x**S(4) - S(147987)/S(80)*x**S(5) - S(3645)/S(16)*x**S(6) - S(23647449)/S(256)*log(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**S(8)/((S(1) - S(2)*x)**S(3)*(S(3) + S(5)*x)), x, S(2), S(5764801)/S(5632)/(S(1) - S(2)*x)**S(2) + ( - S(188591347)/S(30976))/(S(1) - S(2)*x) - S(2941619571)/S(400000)*x - S(110180817)/S(40000)*x**S(2) - S(124416)/S(125)*x**S(3) - S(408969)/S(1600)*x**S(4) - S(6561)/S(200)*x**S(5) - S(2644396573)/S(340736)*log(S(1) - S(2)*x) + S(1)/S(20796875)*log(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(7)/((S(1) - S(2)*x)**S(3)*(S(3) + S(5)*x)), x, S(2), S(823543)/S(2816)/(S(1) - S(2)*x)**S(2) + ( - S(5764801)/S(3872))/(S(1) - S(2)*x) - S(26161299)/S(20000)*x - S(792423)/S(2000)*x**S(2) - S(40581)/S(400)*x**S(3) - S(2187)/S(160)*x**S(4) - S(269063263)/S(170368)*log(S(1) - S(2)*x) + S(1)/S(4159375)*log(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(6)*(S(3) + S(5)*x)*sqrt(S(1) - S(2)*x), x, S(2), - S(1294139)/S(384)*(S(1) - S(2)*x)**(S(3)/S(2)) + S(3916031)/S(640)*(S(1) - S(2)*x)**(S(5)/S(2)) - S(725445)/S(128)*(S(1) - S(2)*x)**(S(7)/S(2)) + S(406455)/S(128)*(S(1) - S(2)*x)**(S(9)/S(2)) - S(1580985)/S(1408)*(S(1) - S(2)*x)**(S(11)/S(2)) + S(409941)/S(1664)*(S(1) - S(2)*x)**(S(13)/S(2)) - S(19683)/S(640)*(S(1) - S(2)*x)**(S(15)/S(2)) + S(3645)/S(2176)*(S(1) - S(2)*x)**(S(17)/S(2))],
[(S(2) + S(3)*x)**S(5)*(S(3) + S(5)*x)*sqrt(S(1) - S(2)*x), x, S(2), - S(184877)/S(192)*(S(1) - S(2)*x)**(S(3)/S(2)) + S(12005)/S(8)*(S(1) - S(2)*x)**(S(5)/S(2)) - S(74235)/S(64)*(S(1) - S(2)*x)**(S(7)/S(2)) + S(4165)/S(8)*(S(1) - S(2)*x)**(S(9)/S(2)) - S(97335)/S(704)*(S(1) - S(2)*x)**(S(11)/S(2)) + S(81)/S(4)*(S(1) - S(2)*x)**(S(13)/S(2)) - S(81)/S(64)*(S(1) - S(2)*x)**(S(15)/S(2))],
[(S(2) + S(3)*x)**S(4)*sqrt(S(1) - S(2)*x)/(S(3) + S(5)*x), x, S(5), - S(45473)/S(5000)*(S(1) - S(2)*x)**(S(3)/S(2)) + S(34371)/S(5000)*(S(1) - S(2)*x)**(S(5)/S(2)) - S(2889)/S(1400)*(S(1) - S(2)*x)**(S(7)/S(2)) + S(9)/S(40)*(S(1) - S(2)*x)**(S(9)/S(2)) - S(2)/S(3125)*arctanh(sqrt(S(5)/S(11))*sqrt(S(1) - S(2)*x))*sqrt(S(11)/S(5)) + S(2)/S(3125)*sqrt(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**S(3)*sqrt(S(1) - S(2)*x)/(S(3) + S(5)*x), x, S(5), - S(1299)/S(500)*(S(1) - S(2)*x)**(S(3)/S(2)) + S(162)/S(125)*(S(1) - S(2)*x)**(S(5)/S(2)) - S(27)/S(140)*(S(1) - S(2)*x)**(S(7)/S(2)) - S(2)/S(625)*arctanh(sqrt(S(5)/S(11))*sqrt(S(1) - S(2)*x))*sqrt(S(11)/S(5)) + S(2)/S(625)*sqrt(S(1) - S(2)*x)],
[(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(6)*(S(3) + S(5)*x), x, S(2), - S(1294139)/S(640)*(S(1) - S(2)*x)**(S(5)/S(2)) + S(559433)/S(128)*(S(1) - S(2)*x)**(S(7)/S(2)) - S(564235)/S(128)*(S(1) - S(2)*x)**(S(9)/S(2)) + S(3658095)/S(1408)*(S(1) - S(2)*x)**(S(11)/S(2)) - S(1580985)/S(1664)*(S(1) - S(2)*x)**(S(13)/S(2)) + S(136647)/S(640)*(S(1) - S(2)*x)**(S(15)/S(2)) - S(59049)/S(2176)*(S(1) - S(2)*x)**(S(17)/S(2)) + S(3645)/S(2432)*(S(1) - S(2)*x)**(S(19)/S(2))],
[(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(5)*(S(3) + S(5)*x), x, S(2), - S(184877)/S(320)*(S(1) - S(2)*x)**(S(5)/S(2)) + S(8575)/S(8)*(S(1) - S(2)*x)**(S(7)/S(2)) - S(173215)/S(192)*(S(1) - S(2)*x)**(S(9)/S(2)) + S(37485)/S(88)*(S(1) - S(2)*x)**(S(11)/S(2)) - S(97335)/S(832)*(S(1) - S(2)*x)**(S(13)/S(2)) + S(351)/S(20)*(S(1) - S(2)*x)**(S(15)/S(2)) - S(1215)/S(1088)*(S(1) - S(2)*x)**(S(17)/S(2))],
[(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(6)/(S(3) + S(5)*x), x, S(6), S(2)/S(234375)*(S(1) - S(2)*x)**(S(3)/S(2)) - S(167115051)/S(2500000)*(S(1) - S(2)*x)**(S(5)/S(2)) + S(70752609)/S(700000)*(S(1) - S(2)*x)**(S(7)/S(2)) - S(665817)/S(10000)*(S(1) - S(2)*x)**(S(9)/S(2)) + S(507627)/S(22000)*(S(1) - S(2)*x)**(S(11)/S(2)) - S(43011)/S(10400)*(S(1) - S(2)*x)**(S(13)/S(2)) + S(243)/S(800)*(S(1) - S(2)*x)**(S(15)/S(2)) - S(22)/S(390625)*arctanh(sqrt(S(5)/S(11))*sqrt(S(1) - S(2)*x))*sqrt(S(11)/S(5)) + S(22)/S(390625)*sqrt(S(1) - S(2)*x)],
[(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(5)/(S(3) + S(5)*x), x, S(6), S(2)/S(46875)*(S(1) - S(2)*x)**(S(3)/S(2)) - S(4774713)/S(250000)*(S(1) - S(2)*x)**(S(5)/S(2)) + S(806121)/S(35000)*(S(1) - S(2)*x)**(S(7)/S(2)) - S(5673)/S(500)*(S(1) - S(2)*x)**(S(9)/S(2)) + S(5751)/S(2200)*(S(1) - S(2)*x)**(S(11)/S(2)) - S(243)/S(1040)*(S(1) - S(2)*x)**(S(13)/S(2)) - S(22)/S(78125)*arctanh(sqrt(S(5)/S(11))*sqrt(S(1) - S(2)*x))*sqrt(S(11)/S(5)) + S(22)/S(78125)*sqrt(S(1) - S(2)*x)],
[(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)**S(6)*(S(3) + S(5)*x), x, S(2), - S(184877)/S(128)*(S(1) - S(2)*x)**(S(7)/S(2)) + S(3916031)/S(1152)*(S(1) - S(2)*x)**(S(9)/S(2)) - S(5078115)/S(1408)*(S(1) - S(2)*x)**(S(11)/S(2)) + S(3658095)/S(1664)*(S(1) - S(2)*x)**(S(13)/S(2)) - S(105399)/S(128)*(S(1) - S(2)*x)**(S(15)/S(2)) + S(409941)/S(2176)*(S(1) - S(2)*x)**(S(17)/S(2)) - S(59049)/S(2432)*(S(1) - S(2)*x)**(S(19)/S(2)) + S(1215)/S(896)*(S(1) - S(2)*x)**(S(21)/S(2))],
[(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)**S(5)*(S(3) + S(5)*x), x, S(2), - S(26411)/S(64)*(S(1) - S(2)*x)**(S(7)/S(2)) + S(60025)/S(72)*(S(1) - S(2)*x)**(S(9)/S(2)) - S(519645)/S(704)*(S(1) - S(2)*x)**(S(11)/S(2)) + S(37485)/S(104)*(S(1) - S(2)*x)**(S(13)/S(2)) - S(6489)/S(64)*(S(1) - S(2)*x)**(S(15)/S(2)) + S(1053)/S(68)*(S(1) - S(2)*x)**(S(17)/S(2)) - S(1215)/S(1216)*(S(1) - S(2)*x)**(S(19)/S(2))],
[(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)**S(4)/(S(3) + S(5)*x), x, S(7), S(22)/S(46875)*(S(1) - S(2)*x)**(S(3)/S(2)) + S(2)/S(15625)*(S(1) - S(2)*x)**(S(5)/S(2)) - S(136419)/S(35000)*(S(1) - S(2)*x)**(S(7)/S(2)) + S(3819)/S(1000)*(S(1) - S(2)*x)**(S(9)/S(2)) - S(2889)/S(2200)*(S(1) - S(2)*x)**(S(11)/S(2)) + S(81)/S(520)*(S(1) - S(2)*x)**(S(13)/S(2)) - S(242)/S(78125)*arctanh(sqrt(S(5)/S(11))*sqrt(S(1) - S(2)*x))*sqrt(S(11)/S(5)) + S(242)/S(78125)*sqrt(S(1) - S(2)*x)],
[(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)**S(3)/(S(3) + S(5)*x), x, S(7), S(22)/S(9375)*(S(1) - S(2)*x)**(S(3)/S(2)) + S(2)/S(3125)*(S(1) - S(2)*x)**(S(5)/S(2)) - S(3897)/S(3500)*(S(1) - S(2)*x)**(S(7)/S(2)) + S(18)/S(25)*(S(1) - S(2)*x)**(S(9)/S(2)) - S(27)/S(220)*(S(1) - S(2)*x)**(S(11)/S(2)) - S(242)/S(15625)*arctanh(sqrt(S(5)/S(11))*sqrt(S(1) - S(2)*x))*sqrt(S(11)/S(5)) + S(242)/S(15625)*sqrt(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**S(5)*(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x), x, S(2), S(60025)/S(24)*(S(1) - S(2)*x)**(S(3)/S(2)) - S(103929)/S(64)*(S(1) - S(2)*x)**(S(5)/S(2)) + S(5355)/S(8)*(S(1) - S(2)*x)**(S(7)/S(2)) - S(10815)/S(64)*(S(1) - S(2)*x)**(S(9)/S(2)) + S(1053)/S(44)*(S(1) - S(2)*x)**(S(11)/S(2)) - S(1215)/S(832)*(S(1) - S(2)*x)**(S(13)/S(2)) - S(184877)/S(64)*sqrt(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**S(4)*(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x), x, S(2), S(57281)/S(96)*(S(1) - S(2)*x)**(S(3)/S(2)) - S(24843)/S(80)*(S(1) - S(2)*x)**(S(5)/S(2)) + S(1539)/S(16)*(S(1) - S(2)*x)**(S(7)/S(2)) - S(519)/S(32)*(S(1) - S(2)*x)**(S(9)/S(2)) + S(405)/S(352)*(S(1) - S(2)*x)**(S(11)/S(2)) - S(26411)/S(32)*sqrt(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**S(5)/((S(3) + S(5)*x)*sqrt(S(1) - S(2)*x)), x, S(4), S(268707)/S(5000)*(S(1) - S(2)*x)**(S(3)/S(2)) - S(51057)/S(2500)*(S(1) - S(2)*x)**(S(5)/S(2)) + S(5751)/S(1400)*(S(1) - S(2)*x)**(S(7)/S(2)) - S(27)/S(80)*(S(1) - S(2)*x)**(S(9)/S(2)) - S(2)/S(3125)*arctanh(sqrt(S(5)/S(11))*sqrt(S(1) - S(2)*x))/sqrt(S(55)) - S(4774713)/S(50000)*sqrt(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**S(7)*(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(3)/S(2)), x, S(2), - S(7882483)/S(128)*(S(1) - S(2)*x)**(S(3)/S(2)) + S(4084101)/S(128)*(S(1) - S(2)*x)**(S(5)/S(2)) - S(787185)/S(64)*(S(1) - S(2)*x)**(S(7)/S(2)) + S(422919)/S(128)*(S(1) - S(2)*x)**(S(9)/S(2)) - S(821583)/S(1408)*(S(1) - S(2)*x)**(S(11)/S(2)) + S(101331)/S(1664)*(S(1) - S(2)*x)**(S(13)/S(2)) - S(729)/S(256)*(S(1) - S(2)*x)**(S(15)/S(2)) + S(9058973)/S(256)/sqrt(S(1) - S(2)*x) + S(15647317)/S(128)*sqrt(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**S(6)*(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(3)/S(2)), x, S(2), - S(1692705)/S(128)*(S(1) - S(2)*x)**(S(3)/S(2)) + S(731619)/S(128)*(S(1) - S(2)*x)**(S(5)/S(2)) - S(225855)/S(128)*(S(1) - S(2)*x)**(S(7)/S(2)) + S(45549)/S(128)*(S(1) - S(2)*x)**(S(9)/S(2)) - S(59049)/S(1408)*(S(1) - S(2)*x)**(S(11)/S(2)) + S(3645)/S(1664)*(S(1) - S(2)*x)**(S(13)/S(2)) + S(1294139)/S(128)/sqrt(S(1) - S(2)*x) + S(3916031)/S(128)*sqrt(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**S(5)*(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(5)/S(2)), x, S(2), S(184877)/S(192)/(S(1) - S(2)*x)**(S(3)/S(2)) + S(12495)/S(8)*(S(1) - S(2)*x)**(S(3)/S(2)) - S(19467)/S(64)*(S(1) - S(2)*x)**(S(5)/S(2)) + S(1053)/S(28)*(S(1) - S(2)*x)**(S(7)/S(2)) - S(135)/S(64)*(S(1) - S(2)*x)**(S(9)/S(2)) + ( - S(60025)/S(8))/sqrt(S(1) - S(2)*x) - S(519645)/S(64)*sqrt(S(1) - S(2)*x)],
[(S(2) + S(3)*x)**S(4)*(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(5)/S(2)), x, S(2), S(26411)/S(96)/(S(1) - S(2)*x)**(S(3)/S(2)) + S(3591)/S(16)*(S(1) - S(2)*x)**(S(3)/S(2)) - S(4671)/S(160)*(S(1) - S(2)*x)**(S(5)/S(2)) + S(405)/S(224)*(S(1) - S(2)*x)**(S(7)/S(2)) + ( - S(57281)/S(32))/sqrt(S(1) - S(2)*x) - S(24843)/S(16)*sqrt(S(1) - S(2)*x)],
[(A + B*x)*(d + e*x)**(S(5)/S(2))*sqrt(a + b*x), x, S(7), - S(1)/S(48)*(b*d - a*e)*(S(3)*b*B*d - S(10)*A*b*e + S(7)*a*B*e)*(a + b*x)**(S(3)/S(2))*(d + e*x)**(S(3)/S(2))/(b**S(3)*e) - S(1)/S(40)*(S(3)*b*B*d - S(10)*A*b*e + S(7)*a*B*e)*(a + b*x)**(S(3)/S(2))*(d + e*x)**(S(5)/S(2))/(b**S(2)*e) + S(1)/S(5)*B*(a + b*x)**(S(3)/S(2))*(d + e*x)**(S(7)/S(2))/(b*e) + S(1)/S(128)*(b*d - a*e)**S(4)*(S(3)*b*B*d - S(10)*A*b*e + S(7)*a*B*e)*arctanh(sqrt(e)*sqrt(a + b*x)/(sqrt(b)*sqrt(d + e*x)))/(b**(S(9)/S(2))*e**(S(5)/S(2))) - S(1)/S(64)*(b*d - a*e)**S(2)*(S(3)*b*B*d - S(10)*A*b*e + S(7)*a*B*e)*(a + b*x)**(S(3)/S(2))*sqrt(d + e*x)/(b**S(4)*e) - S(1)/S(128)*(b*d - a*e)**S(3)*(S(3)*b*B*d - S(10)*A*b*e + S(7)*a*B*e)*sqrt(a + b*x)*sqrt(d + e*x)/(b**S(4)*e**S(2))],
[(A + B*x)*(d + e*x)**(S(3)/S(2))*sqrt(a + b*x), x, S(6), - S(1)/S(24)*(S(3)*b*B*d - S(8)*A*b*e + S(5)*a*B*e)*(a + b*x)**(S(3)/S(2))*(d + e*x)**(S(3)/S(2))/(b**S(2)*e) + S(1)/S(4)*B*(a + b*x)**(S(3)/S(2))*(d + e*x)**(S(5)/S(2))/(b*e) + S(1)/S(64)*(b*d - a*e)**S(3)*(S(3)*b*B*d - S(8)*A*b*e + S(5)*a*B*e)*arctanh(sqrt(e)*sqrt(a + b*x)/(sqrt(b)*sqrt(d + e*x)))/(b**(S(7)/S(2))*e**(S(5)/S(2))) - S(1)/S(32)*(b*d - a*e)*(S(3)*b*B*d - S(8)*A*b*e + S(5)*a*B*e)*(a + b*x)**(S(3)/S(2))*sqrt(d + e*x)/(b**S(3)*e) - S(1)/S(64)*(b*d - a*e)**S(2)*(S(3)*b*B*d - S(8)*A*b*e + S(5)*a*B*e)*sqrt(a + b*x)*sqrt(d + e*x)/(b**S(3)*e**S(2))],
[(A + B*x)*(d + e*x)**(S(5)/S(2))/sqrt(a + b*x), x, S(6), - S(5)/S(64)*(b*d - a*e)**S(3)*(b*B*d - S(8)*A*b*e + S(7)*a*B*e)*arctanh(sqrt(e)*sqrt(a + b*x)/(sqrt(b)*sqrt(d + e*x)))/(b**(S(9)/S(2))*e**(S(3)/S(2))) - S(5)/S(96)*(b*d - a*e)*(b*B*d - S(8)*A*b*e + S(7)*a*B*e)*(d + e*x)**(S(3)/S(2))*sqrt(a + b*x)/(b**S(3)*e) - S(1)/S(24)*(b*B*d - S(8)*A*b*e + S(7)*a*B*e)*(d + e*x)**(S(5)/S(2))*sqrt(a + b*x)/(b**S(2)*e) + S(1)/S(4)*B*(d + e*x)**(S(7)/S(2))*sqrt(a + b*x)/(b*e) - S(5)/S(64)*(b*d - a*e)**S(2)*(b*B*d - S(8)*A*b*e + S(7)*a*B*e)*sqrt(a + b*x)*sqrt(d + e*x)/(b**S(4)*e)],
[(A + B*x)*(d + e*x)**(S(3)/S(2))/sqrt(a + b*x), x, S(5), - S(1)/S(8)*(b*d - a*e)**S(2)*(b*B*d - S(6)*A*b*e + S(5)*a*B*e)*arctanh(sqrt(e)*sqrt(a + b*x)/(sqrt(b)*sqrt(d + e*x)))/(b**(S(7)/S(2))*e**(S(3)/S(2))) - S(1)/S(12)*(b*B*d - S(6)*A*b*e + S(5)*a*B*e)*(d + e*x)**(S(3)/S(2))*sqrt(a + b*x)/(b**S(2)*e) + S(1)/S(3)*B*(d + e*x)**(S(5)/S(2))*sqrt(a + b*x)/(b*e) - S(1)/S(8)*(b*d - a*e)*(b*B*d - S(6)*A*b*e + S(5)*a*B*e)*sqrt(a + b*x)*sqrt(d + e*x)/(b**S(3)*e)],
[(S(2) + S(3)*x)**S(4)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x), x, S(7), - S(333)/S(2000)*(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(2)*(S(3) + S(5)*x)**(S(3)/S(2)) - S(1)/S(20)*(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(3)*(S(3) + S(5)*x)**(S(3)/S(2)) - S(7)/S(640000)*(S(1) - S(2)*x)**(S(3)/S(2))*(S(3) + S(5)*x)**(S(3)/S(2))*(S(231223) + S(140652)*x) + S(4122385421)/S(51200000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) - S(34069301)/S(5120000)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) + S(374762311)/S(51200000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(3)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x), x, S(6), - S(3)/S(50)*(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(2)*(S(3) + S(5)*x)**(S(3)/S(2)) - S(21)/S(16000)*(S(1) - S(2)*x)**(S(3)/S(2))*(S(3) + S(5)*x)**(S(3)/S(2))*(S(731) + S(444)*x) + S(39142411)/S(1280000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) - S(323491)/S(128000)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) + S(3558401)/S(1280000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(3)*sqrt(S(1) - S(2)*x)/sqrt(S(3) + S(5)*x), x, S(5), S(525371)/S(64000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) - S(3)/S(40)*(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(2)*sqrt(S(3) + S(5)*x) - S(21)/S(6400)*(S(1) - S(2)*x)**(S(3)/S(2))*(S(335) + S(216)*x)*sqrt(S(3) + S(5)*x) + S(47761)/S(64000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(2)*sqrt(S(1) - S(2)*x)/sqrt(S(3) + S(5)*x), x, S(5), S(3047)/S(800)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) - S(23)/S(80)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) - S(1)/S(10)*(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)*sqrt(S(3) + S(5)*x) + S(277)/S(800)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(3)*sqrt(S(3) + S(5)*x), x, S(7), - S(1)/S(20)*(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)**S(2)*(S(3) + S(5)*x)**(S(3)/S(2)) - S(1)/S(160000)*(S(1) - S(2)*x)**(S(5)/S(2))*(S(3) + S(5)*x)**(S(3)/S(2))*(S(88987) + S(63120)*x) + S(452517373)/S(25600000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(3739813)/S(7680000)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) - S(339983)/S(384000)*(S(1) - S(2)*x)**(S(5)/S(2))*sqrt(S(3) + S(5)*x) + S(41137943)/S(25600000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(2)*sqrt(S(3) + S(5)*x), x, S(7), - S(567)/S(4000)*(S(1) - S(2)*x)**(S(5)/S(2))*(S(3) + S(5)*x)**(S(3)/S(2)) - S(3)/S(50)*(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)*(S(3) + S(5)*x)**(S(3)/S(2)) + S(5487713)/S(640000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(45353)/S(192000)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) - S(4123)/S(9600)*(S(1) - S(2)*x)**(S(5)/S(2))*sqrt(S(3) + S(5)*x) + S(498883)/S(640000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(3)/sqrt(S(3) + S(5)*x), x, S(6), S(18648399)/S(3200000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(51373)/S(320000)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) - S(3)/S(50)*(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)**S(2)*sqrt(S(3) + S(5)*x) - S(3)/S(80000)*(S(1) - S(2)*x)**(S(5)/S(2))*(S(14629) + S(11580)*x)*sqrt(S(3) + S(5)*x) + S(1695309)/S(3200000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**(S(3)/S(2))*(S(2) + S(3)*x)**S(2)/sqrt(S(3) + S(5)*x), x, S(6), S(109263)/S(32000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(301)/S(3200)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) - S(119)/S(800)*(S(1) - S(2)*x)**(S(5)/S(2))*sqrt(S(3) + S(5)*x) - S(3)/S(40)*(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)*sqrt(S(3) + S(5)*x) + S(9933)/S(32000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)**S(3)*sqrt(S(3) + S(5)*x), x, S(8), - S(3)/S(70)*(S(1) - S(2)*x)**(S(7)/S(2))*(S(2) + S(3)*x)**S(2)*(S(3) + S(5)*x)**(S(3)/S(2)) - S(3)/S(280000)*(S(1) - S(2)*x)**(S(7)/S(2))*(S(3) + S(5)*x)**(S(3)/S(2))*(S(33857) + S(26700)*x) + S(3735929329)/S(256000000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(30875449)/S(76800000)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) + S(2806859)/S(19200000)*(S(1) - S(2)*x)**(S(5)/S(2))*sqrt(S(3) + S(5)*x) - S(255169)/S(640000)*(S(1) - S(2)*x)**(S(7)/S(2))*sqrt(S(3) + S(5)*x) + S(339629939)/S(256000000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)**S(2)*sqrt(S(3) + S(5)*x), x, S(8), - S(193)/S(2000)*(S(1) - S(2)*x)**(S(7)/S(2))*(S(3) + S(5)*x)**(S(3)/S(2)) - S(1)/S(20)*(S(1) - S(2)*x)**(S(7)/S(2))*(S(2) + S(3)*x)*(S(3) + S(5)*x)**(S(3)/S(2)) + S(105254149)/S(12800000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(869869)/S(3840000)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) + S(79079)/S(960000)*(S(1) - S(2)*x)**(S(5)/S(2))*sqrt(S(3) + S(5)*x) - S(7189)/S(32000)*(S(1) - S(2)*x)**(S(7)/S(2))*sqrt(S(3) + S(5)*x) + S(9568559)/S(12800000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)**S(4)/sqrt(S(3) + S(5)*x), x, S(8), S(12679836719)/S(1280000000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(104792039)/S(384000000)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) + S(9526549)/S(96000000)*(S(1) - S(2)*x)**(S(5)/S(2))*sqrt(S(3) + S(5)*x) - S(271)/S(2800)*(S(1) - S(2)*x)**(S(7)/S(2))*(S(2) + S(3)*x)**S(2)*sqrt(S(3) + S(5)*x) - S(3)/S(70)*(S(1) - S(2)*x)**(S(7)/S(2))*(S(2) + S(3)*x)**S(3)*sqrt(S(3) + S(5)*x) - S(1)/S(22400000)*(S(1) - S(2)*x)**(S(7)/S(2))*(S(12923401) + S(11603280)*x)*sqrt(S(3) + S(5)*x) + S(1152712429)/S(1280000000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(1) - S(2)*x)**(S(5)/S(2))*(S(2) + S(3)*x)**S(3)/sqrt(S(3) + S(5)*x), x, S(7), S(368012183)/S(64000000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(3041423)/S(19200000)*(S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x) + S(276493)/S(4800000)*(S(1) - S(2)*x)**(S(5)/S(2))*sqrt(S(3) + S(5)*x) - S(1)/S(20)*(S(1) - S(2)*x)**(S(7)/S(2))*(S(2) + S(3)*x)**S(2)*sqrt(S(3) + S(5)*x) - S(1)/S(160000)*(S(1) - S(2)*x)**(S(7)/S(2))*(S(52951) + S(47280)*x)*sqrt(S(3) + S(5)*x) + S(33455653)/S(64000000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(4)*sqrt(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x), x, S(6), S(1067352517)/S(2560000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) - S(987)/S(4000)*(S(2) + S(3)*x)**S(2)*(S(3) + S(5)*x)**(S(3)/S(2))*sqrt(S(1) - S(2)*x) - S(3)/S(50)*(S(2) + S(3)*x)**S(3)*(S(3) + S(5)*x)**(S(3)/S(2))*sqrt(S(1) - S(2)*x) - S(21)/S(640000)*(S(3) + S(5)*x)**(S(3)/S(2))*(S(194923) + S(92040)*x)*sqrt(S(1) - S(2)*x) - S(97032047)/S(2560000)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(3)*sqrt(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x), x, S(5), S(677017)/S(5120)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) - S(3)/S(40)*(S(2) + S(3)*x)**S(2)*(S(3) + S(5)*x)**(S(3)/S(2))*sqrt(S(1) - S(2)*x) - S(3)/S(1280)*(S(3) + S(5)*x)**(S(3)/S(2))*(S(865) + S(408)*x)*sqrt(S(1) - S(2)*x) - S(61547)/S(5120)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(4)/(sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)), x, S(5), S(10866247)/S(128000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) - S(259)/S(800)*(S(2) + S(3)*x)**S(2)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) - S(3)/S(40)*(S(2) + S(3)*x)**S(3)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) - S(7)/S(128000)*(S(187559) + S(77820)*x)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(3)/(sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)), x, S(4), S(44437)/S(1600)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) - S(1)/S(10)*(S(2) + S(3)*x)**S(2)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) - S(1)/S(1600)*(S(5363) + S(2220)*x)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(5)*sqrt(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(3)/S(2)), x, S(7), - S(35439958001)/S(5120000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + (S(2) + S(3)*x)**S(5)*sqrt(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x) + S(847637)/S(32000)*(S(2) + S(3)*x)**S(2)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) + S(10389)/S(1600)*(S(2) + S(3)*x)**S(3)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) + S(33)/S(20)*(S(2) + S(3)*x)**S(4)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) + S(49)/S(5120000)*(S(87394471) + S(36265980)*x)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(4)*sqrt(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(3)/S(2)), x, S(6), - S(92108287)/S(51200)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + (S(2) + S(3)*x)**S(4)*sqrt(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x) + S(2203)/S(320)*(S(2) + S(3)*x)**S(2)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) + S(27)/S(16)*(S(2) + S(3)*x)**S(3)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) + S(1)/S(51200)*(S(11129753) + S(4618500)*x)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(5)/((S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x)), x, S(6), - S(291096141)/S(256000)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(7)/S(11)*(S(2) + S(3)*x)**S(4)*sqrt(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x) + S(76587)/S(17600)*(S(2) + S(3)*x)**S(2)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) + S(939)/S(880)*(S(2) + S(3)*x)**S(3)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) + S(21)/S(2816000)*(S(18424549) + S(7645620)*x)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(4)/((S(1) - S(2)*x)**(S(3)/S(2))*sqrt(S(3) + S(5)*x)), x, S(5), - S(184641)/S(640)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(7)/S(11)*(S(2) + S(3)*x)**S(3)*sqrt(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x) + S(243)/S(220)*(S(2) + S(3)*x)**S(2)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) + S(9)/S(7040)*(S(27269) + S(11316)*x)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(4)*sqrt(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(5)/S(2)), x, S(6), S(13246251)/S(6400)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(1)/S(3)*(S(2) + S(3)*x)**S(4)*sqrt(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(3)/S(2)) - S(299)/S(66)*(S(2) + S(3)*x)**S(3)*sqrt(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x) - S(697)/S(88)*(S(2) + S(3)*x)**S(2)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) - S(1)/S(70400)*(S(17606479) + S(7306140)*x)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(3)*sqrt(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(5)/S(2)), x, S(5), S(126513)/S(320)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(1)/S(3)*(S(2) + S(3)*x)**S(3)*sqrt(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(3)/S(2)) - S(233)/S(66)*(S(2) + S(3)*x)**S(2)*sqrt(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x) - S(1)/S(3520)*(S(168157) + S(69780)*x)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(5)/((S(1) - S(2)*x)**(S(5)/S(2))*sqrt(S(3) + S(5)*x)), x, S(6), S(8261577)/S(6400)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(7)/S(33)*(S(2) + S(3)*x)**S(4)*sqrt(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(3)/S(2)) - S(2051)/S(726)*(S(2) + S(3)*x)**S(3)*sqrt(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x) - S(23909)/S(4840)*(S(2) + S(3)*x)**S(2)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x) - S(1)/S(774400)*(S(120791143) + S(50124540)*x)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(S(2) + S(3)*x)**S(4)/((S(1) - S(2)*x)**(S(5)/S(2))*sqrt(S(3) + S(5)*x)), x, S(5), S(392283)/S(1600)*arcsin(sqrt(S(2)/S(11))*sqrt(S(3) + S(5)*x))/sqrt(S(10)) + S(7)/S(33)*(S(2) + S(3)*x)**S(3)*sqrt(S(3) + S(5)*x)/(S(1) - S(2)*x)**(S(3)/S(2)) - S(1589)/S(726)*(S(2) + S(3)*x)**S(2)*sqrt(S(3) + S(5)*x)/sqrt(S(1) - S(2)*x) - S(1)/S(193600)*(S(5735477) + S(2380020)*x)*sqrt(S(1) - S(2)*x)*sqrt(S(3) + S(5)*x)],
[(c + d*x)**(S(1)/S(2))/(x**S(2)*(a + b*x)**S(2)), x, S(7), (S(4)*b*c - a*d)*arctanh(sqrt(c + d*x)/sqrt(c))/(a**S(3)*sqrt(c)) - (S(4)*b*c - S(3)*a*d)*arctanh(sqrt(b)*sqrt(c + d*x)/sqrt(b*c - a*d))*sqrt(b)/(a**S(3)*sqrt(b*c - a*d)) - S(2)*b*sqrt(c + d*x)/(a**S(2)*(a + b*x)) - sqrt(c + d*x)/(a*x*(a + b*x))],
[S(1)/(x**S(2)*(a + b*x)**S(2)*(c + d*x)**(S(1)/S(2))), x, S(7), (S(4)*b*c + a*d)*arctanh(sqrt(c + d*x)/sqrt(c))/(a**S(3)*c**(S(3)/S(2))) - b**(S(3)/S(2))*(S(4)*b*c - S(5)*a*d)*arctanh(sqrt(b)*sqrt(c + d*x)/sqrt(b*c - a*d))/(a**S(3)*(b*c - a*d)**(S(3)/S(2))) - b*(S(2)*b*c - a*d)*sqrt(c + d*x)/(a**S(2)*c*(b*c - a*d)*(a + b*x)) - sqrt(c + d*x)/(a*c*x*(a + b*x))],
[S(1)/(x**S(2)*(a + b*x)**S(2)*(c + d*x)**(S(3)/S(2))), x, S(8), (S(4)*b*c + S(3)*a*d)*arctanh(sqrt(c + d*x)/sqrt(c))/(a**S(3)*c**(S(5)/S(2))) - b**(S(5)/S(2))*(S(4)*b*c - S(7)*a*d)*arctanh(sqrt(b)*sqrt(c + d*x)/sqrt(b*c - a*d))/(a**S(3)*(b*c - a*d)**(S(5)/S(2))) - d*(S(2)*b**S(2)*c**S(2) - S(2)*a*b*c*d + S(3)*a**S(2)*d**S(2))/(a**S(2)*c**S(2)*(b*c - a*d)**S(2)*sqrt(c + d*x)) - b*(S(2)*b*c - a*d)/(a**S(2)*c*(b*c - a*d)*(a + b*x)*sqrt(c + d*x)) + ( - S(1))/(a*c*x*(a + b*x)*sqrt(c + d*x))],
[x**S(3)*(c + d*x)**(S(3)/S(2))/(a + b*x)**(S(3)/S(2)), x, S(6), S(3)/S(64)*(b*c - a*d)*(b**S(3)*c**S(3) + S(5)*a*b**S(2)*c**S(2)*d + S(35)*a**S(2)*b*c*d**S(2) - S(105)*a**S(3)*d**S(3))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(11)/S(2))*d**(S(5)/S(2))) - S(2)*x**S(3)*(c + d*x)**(S(3)/S(2))/(b*sqrt(a + b*x)) + S(9)/S(4)*x**S(2)*(c + d*x)**(S(3)/S(2))*sqrt(a + b*x)/b**S(2) - S(1)/S(32)*(c + d*x)**(S(3)/S(2))*(S(3)*b**S(2)*c**S(2) + S(14)*a*b*c*d - S(105)*a**S(2)*d**S(2) - S(4)*b*d*(b*c - S(21)*a*d)*x)*sqrt(a + b*x)/(b**S(4)*d**S(2)) + S(3)/S(64)*(b**S(3)*c**S(3) + S(5)*a*b**S(2)*c**S(2)*d + S(35)*a**S(2)*b*c*d**S(2) - S(105)*a**S(3)*d**S(3))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(5)*d**S(2))],
[x**S(2)*(c + d*x)**(S(3)/S(2))/(a + b*x)**(S(3)/S(2)), x, S(6), - S(1)/S(8)*(b*c - a*d)*(b**S(2)*c**S(2) + S(10)*a*b*c*d - S(35)*a**S(2)*d**S(2))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(9)/S(2))*d**(S(3)/S(2))) - S(2)*a**S(2)*(c + d*x)**(S(5)/S(2))/(b**S(2)*(b*c - a*d)*sqrt(a + b*x)) - S(1)/S(12)*(S(10)*a*c + b*c**S(2)/d - S(35)*a**S(2)*d/b)*(c + d*x)**(S(3)/S(2))*sqrt(a + b*x)/(b**S(2)*(b*c - a*d)) + S(1)/S(3)*(c + d*x)**(S(5)/S(2))*sqrt(a + b*x)/(b**S(2)*d) - S(1)/S(8)*(b**S(2)*c**S(2) + S(10)*a*b*c*d - S(35)*a**S(2)*d**S(2))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(4)*d)],
[x**S(3)*(c + d*x)**(S(5)/S(2))/(a + b*x)**(S(5)/S(2)), x, S(7), - S(2)/S(3)*x**S(3)*(c + d*x)**(S(5)/S(2))/(b*(a + b*x)**(S(3)/S(2))) - S(5)/S(64)*(b*c - a*d)*(b**S(3)*c**S(3) + S(21)*a*b**S(2)*c**S(2)*d - S(189)*a**S(2)*b*c*d**S(2) + S(231)*a**S(3)*d**S(3))*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(13)/S(2))*d**(S(3)/S(2))) - S(2)/S(3)*(S(6)*b*c - S(11)*a*d)*x**S(2)*(c + d*x)**(S(5)/S(2))/(b**S(2)*(b*c - a*d)*sqrt(a + b*x)) - S(5)/S(96)*(b**S(3)*c**S(3) + S(21)*a*b**S(2)*c**S(2)*d - S(189)*a**S(2)*b*c*d**S(2) + S(231)*a**S(3)*d**S(3))*(c + d*x)**(S(3)/S(2))*sqrt(a + b*x)/(b**S(5)*d*(b*c - a*d)) + S(1)/S(24)*(c + d*x)**(S(5)/S(2))*(S(5)*b**S(2)*c**S(2) - S(156)*a*b*c*d + S(231)*a**S(2)*d**S(2) + S(2)*b*d*(S(59)*b*c - S(99)*a*d)*x)*sqrt(a + b*x)/(b**S(4)*d*(b*c - a*d)) - S(5)/S(64)*(b**S(3)*c**S(3) + S(21)*a*b**S(2)*c**S(2)*d - S(189)*a**S(2)*b*c*d**S(2) + S(231)*a**S(3)*d**S(3))*sqrt(a + b*x)*sqrt(c + d*x)/(b**S(6)*d)],
[x**S(2)/((a + b*x)**(S(5)/S(2))*(c + d*x)**(S(1)/S(2))), x, S(4), S(2)*arctanh(sqrt(d)*sqrt(a + b*x)/(sqrt(b)*sqrt(c + d*x)))/(b**(S(5)/S(2))*sqrt(d)) - S(2)/S(3)*a**S(2)*sqrt(c + d*x)/(b**S(2)*(b*c - a*d)*(a + b*x)**(S(3)/S(2))) + S(4)/S(3)*a*(S(3)*b*c - S(2)*a*d)*sqrt(c + d*x)/(b**S(2)*(b*c - a*d)**S(2)*sqrt(a + b*x))],
[x*sqrt(a + b*x)/sqrt( - a - b*x), x, S(2), S(1)/S(2)*x**S(2)*sqrt(a + b*x)/sqrt( - a - b*x)],
[(c + d*x)**(S(3)/S(2))/(x*(a + b*x)**S(2)), x, S(6), - S(2)*c**(S(3)/S(2))*arctanh(sqrt(c + d*x)/sqrt(c))/a**S(2) + (S(2)*b*c + a*d)*arctanh(sqrt(b)*sqrt(c + d*x)/sqrt(b*c - a*d))*sqrt(b*c - a*d)/(a**S(2)*b**(S(3)/S(2))) + (b*c - a*d)*sqrt(c + d*x)/(a*b*(a + b*x))],
]
for i in test:
    r = rubi_integrate(i[0], i[1])
    if len(i) == 5:
        assert rubi_test(r, i[1], i[3], expand=True, _diff=True, _numerical=True) or rubi_test(r, i[1], i[4], expand=True, _diff=True, _numerical=True)
    else:
        assert rubi_test(r, i[1], i[3], expand=True, _diff=True, _numerical=True)
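# Each row in the tables above follows the Rubi-suite layout
# [integrand, variable, expected rule-step count, antiderivative], with an
# optional fifth element giving an alternative antiderivative form. A minimal
# sketch (plain SymPy, independent of `rubi_integrate`/`rubi_test`; the names
# `entry` and `antideriv` are illustrative, not part of the suite) of how such
# a row can be cross-checked by differentiation:

```python
# Validate one table-style row: differentiating the stored antiderivative
# must recover the integrand (here a simple known pair, not one of the
# large rows above).
from sympy import S, symbols, log, diff, simplify

x = symbols('x')
entry = [S(1)/(S(1) - S(2)*x), x, S(1), -log(S(1) - S(2)*x)/S(2)]
integrand, var, _steps, antideriv = entry
assert simplify(diff(antideriv, var) - integrand) == 0
```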
def test_simplify():
    test = [
[x**S(3)*(a + b*x)**S(2)*(c + d*x)**S(16), x, S(2), - S(1)/S(17)*c**S(3)*(b*c - a*d)**S(2)*(c + d*x)**S(17)/d**S(6) + S(1)/S(18)*c**S(2)*(S(5)*b*c - S(3)*a*d)*(b*c - a*d)*(c + d*x)**S(18)/d**S(6) - S(1)/S(19)*c*(S(10)*b**S(2)*c**S(2) - S(12)*a*b*c*d + S(3)*a**S(2)*d**S(2))*(c + d*x)**S(19)/d**S(6) + S(1)/S(20)*(S(10)*b**S(2)*c**S(2) - S(8)*a*b*c*d + a**S(2)*d**S(2))*(c + d*x)**S(20)/d**S(6) - S(1)/S(21)*b*(S(5)*b*c - S(2)*a*d)*(c + d*x)**S(21)/d**S(6) + S(1)/S(22)*b**S(2)*(c + d*x)**S(22)/d**S(6)],
[x**S(5)/((a + b*x)**S(2)*(c + d*x)**S(2)), x, S(2), - S(2)*(b*c + a*d)*x/(b**S(3)*d**S(3)) + S(1)/S(2)*x**S(2)/(b**S(2)*d**S(2)) + a**S(5)/(b**S(4)*(b*c - a*d)**S(2)*(a + b*x)) + c**S(5)/(d**S(4)*(b*c - a*d)**S(2)*(c + d*x)) + a**S(4)*(S(5)*b*c - S(3)*a*d)*log(a + b*x)/(b**S(4)*(b*c - a*d)**S(3)) + c**S(4)*(S(3)*b*c - S(5)*a*d)*log(c + d*x)/(d**S(4)*(b*c - a*d)**S(3))],
[x**S(4)/((a + b*x)*(c + d*x)), x, S(2), (b**S(2)*c**S(2) + a*b*c*d + a**S(2)*d**S(2))*x/(b**S(3)*d**S(3)) - S(1)/S(2)*(b*c + a*d)*x**S(2)/(b**S(2)*d**S(2)) + S(1)/S(3)*x**S(3)/(b*d) + a**S(4)*log(a + b*x)/(b**S(4)*(b*c - a*d)) - c**S(4)*log(c + d*x)/(d**S(4)*(b*c - a*d))],
[(a + b*x)*(A + B*x)*(d + e*x)**S(4), x, S(2), S(1)/S(5)*(b*d - a*e)*(B*d - A*e)*(d + e*x)**S(5)/e**S(3) - S(1)/S(6)*(S(2)*b*B*d - A*b*e - a*B*e)*(d + e*x)**S(6)/e**S(3) + S(1)/S(7)*b*B*(d + e*x)**S(7)/e**S(3)],
[(a + b*x)**S(3)*(c + d*x)**S(3)*(e + f*x)**S(3), x, S(2), S(1)/S(4)*(b*c - a*d)**S(3)*(b*e - a*f)**S(3)*(a + b*x)**S(4)/b**S(7) + S(3)/S(5)*(b*c - a*d)**S(2)*(b*e - a*f)**S(2)*(b*d*e + b*c*f - S(2)*a*d*f)*(a + b*x)**S(5)/b**S(7) + S(1)/S(2)*(b*c - a*d)*(b*e - a*f)*(S(5)*a**S(2)*d**S(2)*f**S(2) - S(5)*a*b*d*f*(d*e + c*f) + b**S(2)*(d**S(2)*e**S(2) + S(3)*c*d*e*f + c**S(2)*f**S(2)))*(a + b*x)**S(6)/b**S(7) + S(1)/S(7)*(b*d*e + b*c*f - S(2)*a*d*f)*(S(10)*a**S(2)*d**S(2)*f**S(2) - S(10)*a*b*d*f*(d*e + c*f) + b**S(2)*(d**S(2)*e**S(2) + S(8)*c*d*e*f + c**S(2)*f**S(2)))*(a + b*x)**S(7)/b**S(7) + S(3)/S(8)*d*f*(S(5)*a**S(2)*d**S(2)*f**S(2) - S(5)*a*b*d*f*(d*e + c*f) + b**S(2)*(d**S(2)*e**S(2) + S(3)*c*d*e*f + c**S(2)*f**S(2)))*(a + b*x)**S(8)/b**S(7) + S(1)/S(3)*d**S(2)*f**S(2)*(b*d*e + b*c*f - S(2)*a*d*f)*(a + b*x)**S(9)/b**S(7) + S(1)/S(10)*d**S(3)*f**S(3)*(a + b*x)**S(10)/b**S(7)],
[(a + b*x)*(A + B*x)*(d + e*x)**(S(5)/S(2)), x, S(2), S(2)/S(7)*(b*d - a*e)*(B*d - A*e)*(d + e*x)**(S(7)/S(2))/e**S(3) - S(2)/S(9)*(S(2)*b*B*d - A*b*e - a*B*e)*(d + e*x)**(S(9)/S(2))/e**S(3) + S(2)/S(11)*b*B*(d + e*x)**(S(11)/S(2))/e**S(3)],
[(S(5) - S(4)*x)**S(4)*(S(2) + S(3)*x)**m/(S(1) + S(2)*x)**m, x, S(4), - S(1)/S(45)*(S(88) - m)*(S(5) - S(4)*x)**S(2)*(S(1) + S(2)*x)**(S(1) - m)*(S(2) + S(3)*x)**(S(1) + m) - S(2)/S(15)*(S(5) - S(4)*x)**S(3)*(S(1) + S(2)*x)**(S(1) - m)*(S(2) + S(3)*x)**(S(1) + m) - S(1)/S(1215)*(S(1) + S(2)*x)**(S(1) - m)*(S(2) + S(3)*x)**(S(1) + m)*(S(386850) - S(25441)*m + S(426)*m**S(2) - S(2)*m**S(3) - S(24)*(S(4359) - S(154)*m + m**S(2))*x) + S(1)/S(1215)*S(2)**( - S(1) - m)*(S(3528363) - S(639760)*m + S(29050)*m**S(2) - S(440)*m**S(3) + S(2)*m**S(4))*(S(1) + S(2)*x)**(S(1) - m)*hypergeom([S(1) - m, - m], [S(2) - m], - S(3)*(S(1) + S(2)*x))/(S(1) - m)],
[(S(5) - S(4)*x)**S(3)*(S(1) + S(2)*x)**( - S(1) - m)*(S(2) + S(3)*x)**m, x, S(3), - S(2)/S(9)*(S(5) - S(4)*x)**S(2)*(S(2) + S(3)*x)**(S(1) + m)/(S(1) + S(2)*x)**m - S(1)/S(27)*(S(2) + S(3)*x)**(S(1) + m)*(S(9261) - S(512)*m + S(4)*m**S(2) - S(4)*(S(109) - S(2)*m)*m*x)/(m*(S(1) + S(2)*x)**m) + S(1)/S(27)*S(2)**( - S(1) - m)*(S(27783) - S(8324)*m + S(390)*m**S(2) - S(4)*m**S(3))*(S(1) + S(2)*x)**(S(1) - m)*hypergeom([S(1) - m, - m], [S(2) - m], - S(3)*(S(1) + S(2)*x))/((S(1) - m)*m)],
[(a + b*x)**m*(c + d*x)**n*((b*c*f + a*d*f + a*d*f*m + b*c*f*n)/(b*d*(S(2) + m + n)) + f*x)**( - S(3) - m - n), x, S(1), b*d*(S(2) + m + n)*(a + b*x)**(S(1) + m)*(c + d*x)**(S(1) + n)*(f*(a*d*(S(1) + m) + b*c*(S(1) + n))/(b*d*(S(2) + m + n)) + f*x)**( - S(2) - m - n)/((b*c - a*d)**S(2)*f*(S(1) + m)*(S(1) + n))],
[x**S(3)*(c + d*x)**S(3)/(a + b*x)**S(3), x, S(2), (b*c - a*d)*(b**S(2)*c**S(2) - S(8)*a*b*c*d + S(10)*a**S(2)*d**S(2))*x/b**S(6) + S(3)/S(2)*d*(b*c - S(2)*a*d)*(b*c - a*d)*x**S(2)/b**S(5) + d**S(2)*(b*c - a*d)*x**S(3)/b**S(4) + S(1)/S(4)*d**S(3)*x**S(4)/b**S(3) + S(1)/S(2)*a**S(3)*(b*c - a*d)**S(3)/(b**S(7)*(a + b*x)**S(2)) - S(3)*a**S(2)*(b*c - S(2)*a*d)*(b*c - a*d)**S(2)/(b**S(7)*(a + b*x)) - S(3)*a*(b*c - a*d)*(b**S(2)*c**S(2) - S(5)*a*b*c*d + S(5)*a**S(2)*d**S(2))*log(a + b*x)/b**S(7)],
[(S(2) + S(3)*x)**S(8)*(S(3) + S(5)*x)/(S(1) - S(2)*x)**S(3), x, S(2), S(63412811)/S(2048)/(S(1) - S(2)*x)**S(2) + ( - S(246239357)/S(1024))/(S(1) - S(2)*x) - S(120864213)/S(256)*x - S(118841283)/S(512)*x**S(2) - S(16042509)/S(128)*x**S(3) - S(7568235)/S(128)*x**S(4) - S(213597)/S(10)*x**S(5) - S(162567)/S(32)*x**S(6) - S(32805)/S(56)*x**S(7) - S(106237047)/S(256)*log(S(1) - S(2)*x)],
    ]
    for i in test:
        r = rubi_integrate(i[0], i[1])
        if len(i) == 5:
            assert rubi_test(r, i[1], i[3], expand=True) or rubi_test(r, i[1], i[4], expand=True)
        else:
            assert rubi_test(r, i[1], i[3], expand=True)
def test_diff():
    test = [
[(a + b*x)*(e + f*x)**(S(3)/S(2))/(c + d*x), x, S(5), - S(2)/S(3)*(b*c - a*d)*(e + f*x)**(S(3)/S(2))/d**S(2) + S(2)/S(5)*b*(e + f*x)**(S(5)/S(2))/(d*f) + S(2)*(b*c - a*d)*(d*e - c*f)**(S(3)/S(2))*arctanh(sqrt(d)*sqrt(e + f*x)/sqrt(d*e - c*f))/d**(S(7)/S(2)) - S(2)*(b*c - a*d)*(d*e - c*f)*sqrt(e + f*x)/d**S(3)],
[x**(S(5)/S(2))*(A + B*x)/(a + b*x), x, S(6), - S(2)/S(3)*a*(A*b - a*B)*x**(S(3)/S(2))/b**S(3) + S(2)/S(5)*(A*b - a*B)*x**(S(5)/S(2))/b**S(2) + S(2)/S(7)*B*x**(S(7)/S(2))/b - S(2)*a**(S(5)/S(2))*(A*b - a*B)*arctan(sqrt(b)*sqrt(x)/sqrt(a))/b**(S(9)/S(2)) + S(2)*a**S(2)*(A*b - a*B)*sqrt(x)/b**S(4)],
[(a + b*x)**S(2)/((c + d*x)**S(2)*sqrt(e + f*x)), x, S(4), (b*c - a*d)*(S(4)*b*d*e - S(3)*b*c*f - a*d*f)*arctanh(sqrt(d)*sqrt(e + f*x)/sqrt(d*e - c*f))/(d**(S(5)/S(2))*(d*e - c*f)**(S(3)/S(2))) + S(2)*b**S(2)*sqrt(e + f*x)/(d**S(2)*f) - (b*c - a*d)**S(2)*sqrt(e + f*x)/(d**S(2)*(d*e - c*f)*(c + d*x))],
    ]
    for i in test:
        r = rubi_integrate(i[0], i[1])
        if len(i) == 5:
            assert rubi_test(r, i[1], i[3], expand=True, _diff=True) or rubi_test(r, i[1], i[4], expand=True, _diff=True)
        else:
            assert rubi_test(r, i[1], i[3], expand=True, _diff=True)
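The loops above rely on a positional convention for each test entry. A standalone sketch of that convention, with generic callables standing in for `rubi_integrate` and `rubi_test` (which come from SymPy's Rubi module in the real suite): each entry is `[integrand, variable, step_count, antiderivative]`, with an optional fifth element holding an alternate acceptable antiderivative.

```python
# Sketch of the 4-vs-5 element test-entry convention; `integrate` and
# `verify` are stand-ins for rubi_integrate/rubi_test.
def check_entry(entry, integrate, verify):
    result = integrate(entry[0], entry[1])
    if len(entry) == 5:
        # Either reference antiderivative is accepted.
        return verify(result, entry[1], entry[3]) or verify(result, entry[1], entry[4])
    return verify(result, entry[1], entry[3])

fake_integrate = lambda expr, var: expr + "_int"
fake_verify = lambda res, var, ref: res == ref
assert check_entry(["x", "x", 1, "x_int"], fake_integrate, fake_verify)
assert check_entry(["x", "x", 1, "wrong", "x_int"], fake_integrate, fake_verify)
```

The second assertion exercises the 5-element path, where only the alternate antiderivative matches.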
| 305.935 | 1,020 | 0.397437 | 17,523 | 61,187 | 1.385893 | 0.034412 | 0.137698 | 0.079802 | 0.059461 | 0.83566 | 0.81293 | 0.76397 | 0.745151 | 0.706485 | 0.679514 | 0 | 0.16304 | 0.118277 | 61,187 | 199 | 1,021 | 307.472362 | 0.287099 | 0.000637 | 0 | 0.112299 | 0 | 0 | 0.000916 | 0 | 0 | 0 | 0 | 0 | 0.032086 | 1 | 0.016043 | false | 0 | 0.080214 | 0 | 0.096257 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
1eb8553df610e90774df208ba424cb3714f40a17 | 181 | py | Python | pysswords/db/__init__.py | chtiprog/pysswords | 1afb8de12662094b79669c541aee2726cda0e9c8 | [
"MIT"
] | null | null | null | pysswords/db/__init__.py | chtiprog/pysswords | 1afb8de12662094b79669c541aee2726cda0e9c8 | [
"MIT"
] | null | null | null | pysswords/db/__init__.py | chtiprog/pysswords | 1afb8de12662094b79669c541aee2726cda0e9c8 | [
"MIT"
] | null | null | null | from .credential import Credential
from .credential import CredentialExistsError
from .credential import CredentialNotFoundError
from .database import Database, DatabaseExistsError
| 36.2 | 51 | 0.878453 | 17 | 181 | 9.352941 | 0.411765 | 0.264151 | 0.377358 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093923 | 181 | 4 | 52 | 45.25 | 0.969512 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
94fe8eace0452112f8dc8654b09013128140a58b | 8,526 | py | Python | route/dreRoute.py | filipefcl/fs-webservice-core | fdedeb79049d6af21cf2ffada0d870ed6aa99fe7 | [
"MIT"
] | null | null | null | route/dreRoute.py | filipefcl/fs-webservice-core | fdedeb79049d6af21cf2ffada0d870ed6aa99fe7 | [
"MIT"
] | null | null | null | route/dreRoute.py | filipefcl/fs-webservice-core | fdedeb79049d6af21cf2ffada0d870ed6aa99fe7 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import logging
import os
import json
import inspect
from db import DreDAO
from flask import Flask, request
from flask import send_file, send_from_directory
from werkzeug.utils import secure_filename
from util import Util, Constants, Log, CodeReturn
from controller import Controller
log = Log('DRERoute')
util = Util()
constants = Constants()
dreDAO = DreDAO()
controller = Controller()
codeReturn = CodeReturn()
class DRERoute:
    def get_dre_comparative(self, request):
        try:
            header = request.headers
            # Get token from header
            token = str(header['token'])
            # Get data from params
            data = json.loads(str(request.args.get('data')).replace("'", '"'))
            companies_token = data['companies_token']
            date_start = data['date_start']
            date_end = data['date_end']
        except Exception:
            return util.make_json(codeReturn.BAD_REQUEST_CODE, codeReturn.BAD_REQUEST_MSG, [])

        # Authentication
        decode_auth_token = controller.decode_auth_token(token)
        if decode_auth_token == codeReturn.EXPIRED_TOKEN_CODE:
            log.warning(inspect.getframeinfo(inspect.currentframe()).function,
                        str(codeReturn.EXPIRED_TOKEN_MSG),
                        0)
            return util.make_json(codeReturn.EXPIRED_TOKEN_CODE, codeReturn.EXPIRED_TOKEN_MSG, [])
        elif decode_auth_token == codeReturn.INVALID_TOKEN_CODE:
            log.error(inspect.getframeinfo(inspect.currentframe()).function,
                      str(codeReturn.INVALID_TOKEN_MSG),
                      0)
            return util.make_json(codeReturn.INVALID_TOKEN_CODE, codeReturn.INVALID_TOKEN_MSG, [])
        else:
            companies_id = util.companies_token_to_id(companies_token)
            code, msg, data = dreDAO.get_dre_comparative(date_start, date_end, companies_id, decode_auth_token)
            return util.make_json(code, msg, data)
    def get_dre_month(self, request):
        try:
            header = request.headers
            # Get token from header
            token = str(header['token'])
            # Get data from params
            data = json.loads(str(request.args.get('data')).replace("'", '"'))
            companies_token = data['companies_token']
            date_start = data['date_start']
            date_end = data['date_end']
        except Exception:
            return util.make_json(codeReturn.BAD_REQUEST_CODE, codeReturn.BAD_REQUEST_MSG, [])

        # Authentication
        decode_auth_token = controller.decode_auth_token(token)
        if decode_auth_token == codeReturn.EXPIRED_TOKEN_CODE:
            log.warning(inspect.getframeinfo(inspect.currentframe()).function,
                        str(codeReturn.EXPIRED_TOKEN_MSG),
                        0)
            return util.make_json(codeReturn.EXPIRED_TOKEN_CODE, codeReturn.EXPIRED_TOKEN_MSG, [])
        elif decode_auth_token == codeReturn.INVALID_TOKEN_CODE:
            log.error(inspect.getframeinfo(inspect.currentframe()).function,
                      str(codeReturn.INVALID_TOKEN_MSG),
                      0)
            return util.make_json(codeReturn.INVALID_TOKEN_CODE, codeReturn.INVALID_TOKEN_MSG, [])
        else:
            companies_id = util.companies_token_to_id(companies_token)
            code, msg, data = dreDAO.get_dre_month(date_start, date_end, companies_id, decode_auth_token)
            return util.make_json(code, msg, data)
    def get_dre_period(self, request):
        try:
            header = request.headers
            # Get token from header
            token = str(header['token'])
            # Get data from params
            data = json.loads(str(request.args.get('data')).replace("'", '"'))
            companies_token = data['companies_token']
            date_start1 = data['date_start1']
            date_end1 = data['date_end1']
            date_start2 = data['date_start2']
            date_end2 = data['date_end2']
        except Exception:
            return util.make_json(codeReturn.BAD_REQUEST_CODE, codeReturn.BAD_REQUEST_MSG, [])

        # Authentication
        decode_auth_token = controller.decode_auth_token(token)
        if decode_auth_token == codeReturn.EXPIRED_TOKEN_CODE:
            log.warning(inspect.getframeinfo(inspect.currentframe()).function,
                        str(codeReturn.EXPIRED_TOKEN_MSG),
                        0)
            return util.make_json(codeReturn.EXPIRED_TOKEN_CODE, codeReturn.EXPIRED_TOKEN_MSG, [])
        elif decode_auth_token == codeReturn.INVALID_TOKEN_CODE:
            log.error(inspect.getframeinfo(inspect.currentframe()).function,
                      str(codeReturn.INVALID_TOKEN_MSG),
                      0)
            return util.make_json(codeReturn.INVALID_TOKEN_CODE, codeReturn.INVALID_TOKEN_MSG, [])
        else:
            companies_id = util.companies_token_to_id(companies_token)
            code, msg, data = dreDAO.get_dre_period(date_start1, date_end1, date_start2, date_end2, companies_id, decode_auth_token)
            return util.make_json(code, msg, data)
    def list_acc_mov_from_acc_ref(self, request):
        try:
            header = request.headers
            # Get token from header
            token = str(header['token'])
            # Get data from JSON
            data = json.loads(str(request.args.get('data')).replace("'", '"'))
            companies_token = data['companies_token']
            date_start = data['date_start']
            date_end = data['date_end']
            cod_account_ref = data['cod_account_ref']
        except Exception:
            return util.make_json(codeReturn.BAD_REQUEST_CODE, codeReturn.BAD_REQUEST_MSG, [])

        # Authentication
        decode_auth_token = controller.decode_auth_token(token)
        if decode_auth_token == codeReturn.EXPIRED_TOKEN_CODE:
            log.warning(inspect.getframeinfo(inspect.currentframe()).function,
                        str(codeReturn.EXPIRED_TOKEN_MSG),
                        0)
            return util.make_json(codeReturn.EXPIRED_TOKEN_CODE, codeReturn.EXPIRED_TOKEN_MSG, [])
        elif decode_auth_token == codeReturn.INVALID_TOKEN_CODE:
            log.error(inspect.getframeinfo(inspect.currentframe()).function,
                      str(codeReturn.INVALID_TOKEN_MSG),
                      0)
            return util.make_json(codeReturn.INVALID_TOKEN_CODE, codeReturn.INVALID_TOKEN_MSG, [])
        else:
            companies_id = util.companies_token_to_id(companies_token)
            code, msg, data = dreDAO.list_acc_mov_from_acc_ref(date_start, date_end, cod_account_ref, companies_id, decode_auth_token)
            return util.make_json(code, msg, data)
    def list_acc_mov_from_acc_ref_period(self, request):
        try:
            header = request.headers
            # Get token from header
            token = str(header['token'])
            # Get data from JSON
            data = json.loads(str(request.args.get('data')).replace("'", '"'))
            companies_token = data['companies_token']
            date_start1 = data['date_start1']
            date_end1 = data['date_end1']
            date_start2 = data['date_start2']
            date_end2 = data['date_end2']
            cod_account_ref = data['cod_account_ref']
        except Exception:
            return util.make_json(codeReturn.BAD_REQUEST_CODE, codeReturn.BAD_REQUEST_MSG, [])

        # Authentication
        decode_auth_token = controller.decode_auth_token(token)
        if decode_auth_token == codeReturn.EXPIRED_TOKEN_CODE:
            log.warning(inspect.getframeinfo(inspect.currentframe()).function,
                        str(codeReturn.EXPIRED_TOKEN_MSG),
                        0)
            return util.make_json(codeReturn.EXPIRED_TOKEN_CODE, codeReturn.EXPIRED_TOKEN_MSG, [])
        elif decode_auth_token == codeReturn.INVALID_TOKEN_CODE:
            log.error(inspect.getframeinfo(inspect.currentframe()).function,
                      str(codeReturn.INVALID_TOKEN_MSG),
                      0)
            return util.make_json(codeReturn.INVALID_TOKEN_CODE, codeReturn.INVALID_TOKEN_MSG, [])
        else:
            companies_id = util.companies_token_to_id(companies_token)
            code, msg, data = dreDAO.list_acc_mov_from_acc_ref_period(date_start1, date_end1, date_start2, date_end2, cod_account_ref, companies_id, decode_auth_token)
            return util.make_json(code, msg, data)
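Every `DRERoute` method repeats the same token-decoding branch before dispatching to the DAO. As a hypothetical refactor (not part of the original file), that branch could be factored into a decorator; the code-return constants and JSON helper below are simplified stand-ins for the project's `CodeReturn` and `Util.make_json`.

```python
# Stand-ins for the project's CodeReturn constants and Util.make_json.
EXPIRED_TOKEN_CODE, INVALID_TOKEN_CODE = -2, -1

def make_json(code, msg, data):
    return {"code": code, "message": msg, "data": data}

def authenticated(decode):
    """Wrap a handler so it only runs when decode(token) yields a user id."""
    def wrapper(handler):
        def inner(token, *args, **kwargs):
            user_id = decode(token)
            if user_id == EXPIRED_TOKEN_CODE:
                return make_json(EXPIRED_TOKEN_CODE, "expired token", [])
            if user_id == INVALID_TOKEN_CODE:
                return make_json(INVALID_TOKEN_CODE, "invalid token", [])
            return handler(user_id, *args, **kwargs)
        return inner
    return wrapper

# Toy decoder: "good" maps to user 42, anything else is invalid.
@authenticated(lambda token: 42 if token == "good" else INVALID_TOKEN_CODE)
def get_dre(user_id, date_start, date_end):
    return make_json(0, "ok", {"user": user_id, "range": (date_start, date_end)})

assert get_dre("good", "2020-01", "2020-02")["code"] == 0
assert get_dre("bad", "2020-01", "2020-02")["code"] == INVALID_TOKEN_CODE
```

With this shape each route body would shrink to the `try`/`except` parameter parsing plus the single DAO call.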
| 40.028169 | 167 | 0.631363 | 957 | 8,526 | 5.3093 | 0.08464 | 0.049203 | 0.073804 | 0.070852 | 0.921079 | 0.919701 | 0.919701 | 0.919701 | 0.919701 | 0.903169 | 0 | 0.005654 | 0.273985 | 8,526 | 212 | 168 | 40.216981 | 0.815186 | 0.034835 | 0 | 0.817568 | 0 | 0 | 0.036775 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033784 | false | 0 | 0.067568 | 0 | 0.243243 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bf5d3b653e9a1fe79d50174970eb01605509d486 | 5,588 | py | Python | LED/OLED_I2C_ASC/Font_8x16.py | garymeg/mpy-lib | 0fb6e4529afe098a13a1cffa85fb03778ffb13e3 | [
"MIT"
] | 116 | 2018-07-16T14:48:44.000Z | 2022-03-16T15:24:54.000Z | LED/OLED_I2C_ASC/Font_8x16.py | garymeg/mpy-lib | 0fb6e4529afe098a13a1cffa85fb03778ffb13e3 | [
"MIT"
] | 8 | 2018-07-11T14:00:30.000Z | 2022-01-20T01:30:09.000Z | LED/OLED_I2C_ASC/Font_8x16.py | garymeg/mpy-lib | 0fb6e4529afe098a13a1cffa85fb03778ffb13e3 | [
"MIT"
] | 66 | 2018-07-11T08:50:00.000Z | 2022-03-28T15:36:00.000Z | '''
FONT 8x16 for OLED
'''
# ' ' - '~' 0x20 - 0x7E
Font_8x16 = bytes(b'\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x38\xFC\xFC\x38\x00\x00\x00\x00\x0D\x0D\x00\x00\
\x00\x0E\x1E\x00\x00\x1E\x0E\x00\x00\x00\x00\x00\x00\x00\
\x20\xF8\xF8\x20\xF8\xF8\x20\x02\x0F\x0F\x02\x0F\x0F\x02\
\x38\x7C\x44\x47\x47\xCC\x98\x06\x0C\x08\x38\x38\x0F\x07\
\x30\x30\x00\x80\xC0\x60\x30\x0C\x06\x03\x01\x00\x0C\x0C\
\x80\xD8\x7C\xE4\xBC\xD8\x40\x07\x0F\x08\x08\x07\x0F\x08\
\x00\x10\x1E\x0E\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\xF0\xF8\x0C\x04\x00\x00\x00\x03\x07\x0C\x08\x00\
\x00\x00\x04\x0C\xF8\xF0\x00\x00\x00\x08\x0C\x07\x03\x00\
\x80\xA0\xE0\xC0\xC0\xE0\xA0\x00\x02\x03\x01\x01\x03\x02\
\x00\x80\x80\xE0\xE0\x80\x80\x00\x00\x00\x03\x03\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x10\x1E\x0E\x00\x00\
\x80\x80\x80\x80\x80\x80\x80\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0C\x0C\x00\x00\
\x00\x00\x00\x80\xC0\x60\x30\x0C\x06\x03\x01\x00\x00\x00\
\xF0\xF8\x0C\xC4\x0C\xF8\xF0\x03\x07\x0C\x08\x0C\x07\x03\
\x00\x10\x18\xFC\xFC\x00\x00\x00\x08\x08\x0F\x0F\x08\x08\
\x08\x0C\x84\xC4\x64\x3C\x18\x0E\x0F\x09\x08\x08\x0C\x0C\
\x08\x0C\x44\x44\x44\xFC\xB8\x04\x0C\x08\x08\x08\x0F\x07\
\xC0\xE0\xB0\x98\xFC\xFC\x80\x00\x00\x00\x08\x0F\x0F\x08\
\x7C\x7C\x44\x44\x44\xC4\x84\x04\x0C\x08\x08\x08\x0F\x07\
\xF0\xF8\x4C\x44\x44\xC0\x80\x07\x0F\x08\x08\x08\x0F\x07\
\x0C\x0C\x04\x84\xC4\x7C\x3C\x00\x00\x0F\x0F\x00\x00\x00\
\xB8\xFC\x44\x44\x44\xFC\xB8\x07\x0F\x08\x08\x08\x0F\x07\
\x38\x7C\x44\x44\x44\xFC\xF8\x00\x08\x08\x08\x0C\x07\x03\
\x00\x00\x00\x30\x30\x00\x00\x00\x00\x00\x06\x06\x00\x00\
\x00\x00\x00\x30\x30\x00\x00\x00\x00\x08\x0E\x06\x00\x00\
\x00\x80\xC0\x60\x30\x18\x08\x00\x00\x01\x03\x06\x0C\x08\
\x00\x20\x20\x20\x20\x20\x20\x00\x01\x01\x01\x01\x01\x01\
\x00\x08\x18\x30\x60\xC0\x80\x00\x08\x0C\x06\x03\x01\x00\
\x18\x1C\x04\xC4\xE4\x3C\x18\x00\x00\x00\x0D\x0D\x00\x00\
\xF0\xF8\x08\xC8\xC8\xF8\xF0\x07\x0F\x08\x0B\x0B\x0B\x01\
\xE0\xF0\x98\x8C\x98\xF0\xE0\x0F\x0F\x00\x00\x00\x0F\x0F\
\x04\xFC\xFC\x44\x44\xFC\xB8\x08\x0F\x0F\x08\x08\x0F\x07\
\xF0\xF8\x0C\x04\x04\x0C\x18\x03\x07\x0C\x08\x08\x0C\x06\
\x04\xFC\xFC\x04\x0C\xF8\xF0\x08\x0F\x0F\x08\x0C\x07\x03\
\x04\xFC\xFC\x44\xE4\x0C\x1C\x08\x0F\x0F\x08\x08\x0C\x0E\
\x04\xFC\xFC\x44\xE4\x0C\x1C\x08\x0F\x0F\x08\x00\x00\x00\
\xF0\xF8\x0C\x84\x84\x8C\x98\x03\x07\x0C\x08\x08\x07\x0F\
\xFC\xFC\x40\x40\x40\xFC\xFC\x0F\x0F\x00\x00\x00\x0F\x0F\
\x00\x00\x04\xFC\xFC\x04\x00\x00\x00\x08\x0F\x0F\x08\x00\
\x00\x00\x00\x04\xFC\xFC\x04\x07\x0F\x08\x08\x0F\x07\x00\
\x04\xFC\xFC\xC0\xE0\x3C\x1C\x08\x0F\x0F\x00\x01\x0F\x0E\
\x04\xFC\xFC\x04\x00\x00\x00\x08\x0F\x0F\x08\x08\x0C\x0E\
\xFC\xFC\x38\x70\x38\xFC\xFC\x0F\x0F\x00\x00\x00\x0F\x0F\
\xFC\xFC\x38\x70\xE0\xFC\xFC\x0F\x0F\x00\x00\x00\x0F\x0F\
\xF8\xFC\x04\x04\x04\xFC\xF8\x07\x0F\x08\x08\x08\x0F\x07\
\x04\xFC\xFC\x44\x44\x7C\x38\x08\x0F\x0F\x08\x00\x00\x00\
\xF8\xFC\x04\x04\x04\xFC\xF8\x07\x0F\x08\x0E\x3C\x3F\x27\
\x04\xFC\xFC\x44\xC4\xFC\x38\x08\x0F\x0F\x00\x00\x0F\x0F\
\x18\x3C\x64\x44\xC4\x9C\x18\x06\x0E\x08\x08\x08\x0F\x07\
\x00\x1C\x0C\xFC\xFC\x0C\x1C\x00\x00\x08\x0F\x0F\x08\x00\
\xFC\xFC\x00\x00\x00\xFC\xFC\x07\x0F\x08\x08\x08\x0F\x07\
\xFC\xFC\x00\x00\x00\xFC\xFC\x01\x03\x06\x0C\x06\x03\x01\
\xFC\xFC\x00\xC0\x00\xFC\xFC\x07\x0F\x0E\x03\x0E\x0F\x07\
\x0C\x3C\xF0\xE0\xF0\x3C\x0C\x0C\x0F\x03\x01\x03\x0F\x0C\
\x00\x3C\x7C\xC0\xC0\x7C\x3C\x00\x00\x08\x0F\x0F\x08\x00\
\x1C\x0C\x84\xC4\x64\x3C\x1C\x0E\x0F\x09\x08\x08\x0C\x0E\
\x00\x00\xFC\xFC\x04\x04\x00\x00\x00\x0F\x0F\x08\x08\x00\
\x38\x70\xE0\xC0\x80\x00\x00\x00\x00\x00\x01\x03\x07\x0E\
\x00\x00\x04\x04\xFC\xFC\x00\x00\x00\x08\x08\x0F\x0F\x00\
\x08\x0C\x06\x03\x06\x0C\x08\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x20\x20\x20\x20\x20\x20\x20\
\x00\x00\x03\x07\x04\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\xA0\xA0\xA0\xE0\xC0\x00\x07\x0F\x08\x08\x07\x0F\x08\
\x04\xFC\xFC\x20\x60\xC0\x80\x00\x0F\x0F\x08\x08\x0F\x07\
\xC0\xE0\x20\x20\x20\x60\x40\x07\x0F\x08\x08\x08\x0C\x04\
\x80\xC0\x60\x24\xFC\xFC\x00\x07\x0F\x08\x08\x07\x0F\x08\
\xC0\xE0\xA0\xA0\xA0\xE0\xC0\x07\x0F\x08\x08\x08\x0C\x04\
\x40\xF8\xFC\x44\x0C\x18\x00\x08\x0F\x0F\x08\x00\x00\x00\
\xC0\xE0\x20\x20\xC0\xE0\x20\x27\x6F\x48\x48\x7F\x3F\x00\
\x04\xFC\xFC\x40\x20\xE0\xC0\x08\x0F\x0F\x00\x00\x0F\x0F\
\x00\x00\x20\xEC\xEC\x00\x00\x00\x00\x08\x0F\x0F\x08\x00\
\x00\x00\x00\x00\x20\xEC\xEC\x00\x30\x70\x40\x40\x7F\x3F\
\x04\xFC\xFC\x80\xC0\x60\x20\x08\x0F\x0F\x01\x03\x0E\x0C\
\x00\x00\x04\xFC\xFC\x00\x00\x00\x00\x08\x0F\x0F\x08\x00\
\xE0\xE0\x60\xC0\x60\xE0\xC0\x0F\x0F\x00\x07\x00\x0F\x0F\
\x20\xE0\xC0\x20\x20\xE0\xC0\x00\x0F\x0F\x00\x00\x0F\x0F\
\xC0\xE0\x20\x20\x20\xE0\xC0\x07\x0F\x08\x08\x08\x0F\x07\
\x20\xE0\xC0\x20\x20\xE0\xC0\x40\x7F\x7F\x48\x08\x0F\x07\
\xC0\xE0\x20\x20\xC0\xE0\x20\x07\x0F\x08\x48\x7F\x7F\x40\
\x20\xE0\xC0\x60\x20\xE0\xC0\x08\x0F\x0F\x08\x00\x00\x00\
\x40\xE0\xA0\x20\x20\x60\x40\x04\x0C\x09\x09\x0B\x0E\x04\
\x20\x20\xF8\xFC\x20\x20\x00\x00\x00\x07\x0F\x08\x0C\x04\
\xE0\xE0\x00\x00\xE0\xE0\x00\x07\x0F\x08\x08\x07\x0F\x08\
\x00\xE0\xE0\x00\x00\xE0\xE0\x00\x03\x07\x0C\x0C\x07\x03\
\xE0\xE0\x00\x80\x00\xE0\xE0\x07\x0F\x0C\x07\x0C\x0F\x07\
\x20\x60\xC0\x80\xC0\x60\x20\x08\x0C\x07\x03\x07\x0C\x08\
\xE0\xE0\x00\x00\x00\xE0\xE0\x47\x4F\x48\x48\x68\x3F\x1F\
\x60\x60\x20\xA0\xE0\x60\x20\x0C\x0E\x0B\x09\x08\x0C\x0C\
\x00\x40\x40\xF8\xBC\x04\x04\x00\x00\x00\x07\x0F\x08\x08\
\x00\x00\x00\xBC\xBC\x00\x00\x00\x00\x00\x0F\x0F\x00\x00\
\x00\x04\x04\xBC\xF8\x40\x40\x00\x08\x08\x0F\x07\x00\x00\
\x08\x0C\x04\x0C\x08\x0C\x04\x00\x00\x00\x00\x00\x00\x00\
')
| 54.784314 | 57 | 0.72083 | 1,340 | 5,588 | 3.005224 | 0.05 | 0.306928 | 0.306183 | 0.25925 | 0.622051 | 0.494661 | 0.420164 | 0.234666 | 0.195927 | 0.13335 | 0 | 0.378586 | 0.02058 | 5,588 | 101 | 58 | 55.326733 | 0.357208 | 0.007337 | 0 | 0 | 0 | 0.979381 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bf6e4b2337f68221a31f21683ccd413117884ca8 | 3,156 | py | Python | pyhcl/tester/wir.py | raybdzhou/PyChip-py-hcl | 08edc6ad4d2978eb417482f6f92678f8f9a1e3c7 | [
"MIT"
] | null | null | null | pyhcl/tester/wir.py | raybdzhou/PyChip-py-hcl | 08edc6ad4d2978eb417482f6f92678f8f9a1e3c7 | [
"MIT"
] | null | null | null | pyhcl/tester/wir.py | raybdzhou/PyChip-py-hcl | 08edc6ad4d2978eb417482f6f92678f8f9a1e3c7 | [
"MIT"
] | null | null | null | from pyclbr import Function
from pyhcl.ir.low_ir import *
@dataclass(frozen=True)
class WUIntLiteral(Expression):
    expr: Expression

    def get_value(self, *args) -> int:
        return self.expr.value

    def serialize(self) -> str:
        ...

    def verilog_serialize(self) -> str:
        ...


@dataclass(frozen=True)
class WSIntLiteral(Expression):
    expr: Expression

    def get_value(self, *args) -> int:
        return self.expr.value

    def serialize(self) -> str:
        ...

    def verilog_serialize(self) -> str:
        ...


@dataclass(frozen=True)
class WReference(Expression):
    expr: Expression
    get_func: Function
    set_func: Function

    def get_value(self, *args) -> int:
        return self.get_func(*args)

    def set_value(self, *args) -> int:
        return self.set_func(*args)

    def serialize(self) -> str:
        ...

    def verilog_serialize(self) -> str:
        ...


@dataclass(frozen=True)
class WSubField(Expression):
    expr: Expression
    get_func: Function
    set_func: Function

    def get_value(self, *args) -> int:
        return self.get_func(*args)

    def set_value(self, *args) -> int:
        return self.set_func(*args)

    def serialize(self) -> str:
        ...

    def verilog_serialize(self) -> str:
        ...


@dataclass(frozen=True)
class WSubIndex(Expression):
    expr: Expression
    get_func: Function
    set_func: Function

    def get_value(self, *args) -> int:
        return self.get_func(*args)

    def set_value(self, *args) -> int:
        return self.set_func(*args)

    def serialize(self) -> str:
        ...

    def verilog_serialize(self) -> str:
        ...


@dataclass(frozen=True)
class WSubAccess(Expression):
    expr: Expression
    get_func: Function
    set_func: Function

    def get_value(self, *args) -> int:
        return self.get_func(*args)

    def set_value(self, *args) -> int:
        return self.set_func(*args)

    def serialize(self) -> str:
        ...

    def verilog_serialize(self) -> str:
        ...


@dataclass(frozen=True)
class WMux(Expression):
    expr: Expression
    get_func: Function

    def get_value(self, *args) -> int:
        return self.get_func(*args)

    def serialize(self) -> str:
        ...

    def verilog_serialize(self) -> str:
        ...


@dataclass(frozen=True)
class WValidIf(Expression):
    expr: Expression
    get_func: Function

    def get_value(self, *args) -> int:
        return self.get_func(*args)

    def serialize(self) -> str:
        ...

    def verilog_serialize(self) -> str:
        ...


@dataclass(frozen=True)
class WDoPrim(Expression):
    expr: Expression
    get_func: Function

    def get_value(self, *args) -> int:
        return self.get_func(*args)

    def serialize(self) -> str:
        ...

    def verilog_serialize(self) -> str:
        ...


@dataclass(frozen=True)
class WInt(Expression):
    value: int

    def get_value(self, *args) -> int:
        return self.value

    def serialize(self) -> str:
        ...

    def verilog_serialize(self) -> str:
        ...
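The wrapper classes above pair an IR expression with getter/setter closures supplied by the interpreter. A minimal standalone sketch of that pattern (a stub `Expression` stands in for `pyhcl.ir.low_ir.Expression`, and the backing register is just a dict):

```python
from dataclasses import dataclass
from typing import Callable

class Expression:
    """Stub standing in for pyhcl.ir.low_ir.Expression."""
    pass

@dataclass(frozen=True)
class Ref(Expression):
    get_func: Callable
    set_func: Callable

    def get_value(self, *args):
        return self.get_func(*args)

    def set_value(self, *args):
        return self.set_func(*args)

# Back the reference with a one-slot "register".
reg = {"value": 0}
r = Ref(get_func=lambda: reg["value"],
        set_func=lambda v: reg.__setitem__("value", v))
r.set_value(7)
assert r.get_value() == 7
```

Freezing the dataclass keeps the wrapper hashable while the closures still mutate the simulated state it points at.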
bfa0ff233032ec390d11384f40ec4dd99dbcad27 | 45 | py | Python | test/test_import.py | szahlner/shadowhand-gym | a7fbbe8ddcc2ecbead9349b0f377a3066ca94233 | [
"MIT"
] | 11 | 2021-08-30T12:09:16.000Z | 2021-12-13T15:10:27.000Z | test/test_import.py | szahlner/shadowhand-gym | a7fbbe8ddcc2ecbead9349b0f377a3066ca94233 | [
"MIT"
] | null | null | null | test/test_import.py | szahlner/shadowhand-gym | a7fbbe8ddcc2ecbead9349b0f377a3066ca94233 | [
"MIT"
] | null | null | null | def test_import():
import shadowhand_gym
| 15 | 25 | 0.755556 | 6 | 45 | 5.333333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.177778 | 45 | 2 | 26 | 22.5 | 0.864865 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 1 | 0 | 1.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
bfb4edb2c0fdbbee115175598acdd0540ff9e485 | 82 | py | Python | applitools/geometry.py | applitools/eyes.selenium.python | 3a09a3372a3a8915b3c97ee54fc223580c45c0a3 | [
"Apache-2.0"
] | 11 | 2016-04-20T21:21:37.000Z | 2020-04-27T19:46:56.000Z | applitools/geometry.py | applitools/eyes.selenium.python | 3a09a3372a3a8915b3c97ee54fc223580c45c0a3 | [
"Apache-2.0"
] | 15 | 2017-01-11T04:58:31.000Z | 2019-09-13T18:00:35.000Z | applitools/geometry.py | applitools/eyes.selenium.python | 3a09a3372a3a8915b3c97ee54fc223580c45c0a3 | [
"Apache-2.0"
] | 15 | 2016-03-23T22:06:39.000Z | 2020-06-14T09:11:58.000Z | from applitools.core.geometry import * # noqa
from applitools.core import logger
| 27.333333 | 46 | 0.804878 | 11 | 82 | 6 | 0.636364 | 0.424242 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134146 | 82 | 2 | 47 | 41 | 0.929577 | 0.04878 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
bfd268ec016a2529ec322f3a226adac80b3c70bc | 20,945 | py | Python | patch_manager_sdk/api/patch/patch_client.py | easyopsapis/easyops-api-python | adf6e3bad33fa6266b5fa0a449dd4ac42f8447d0 | [
"Apache-2.0"
] | 5 | 2019-07-31T04:11:05.000Z | 2021-01-07T03:23:20.000Z | patch_manager_sdk/api/patch/patch_client.py | easyopsapis/easyops-api-python | adf6e3bad33fa6266b5fa0a449dd4ac42f8447d0 | [
"Apache-2.0"
] | null | null | null | patch_manager_sdk/api/patch/patch_client.py | easyopsapis/easyops-api-python | adf6e3bad33fa6266b5fa0a449dd4ac42f8447d0 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
import os
import sys
import patch_manager_sdk.api.patch.create_win_patch_pb2
import patch_manager_sdk.api.patch.delete_win_patch_pb2
import google.protobuf.empty_pb2
import patch_manager_sdk.api.patch.get_host_pb2
import patch_manager_sdk.api.patch.get_os_versions_pb2
import patch_manager_sdk.api.patch.get_win_patch_pb2
import patch_manager_sdk.api.patch.list_host_pb2
import patch_manager_sdk.api.patch.list_win_patch_pb2
import patch_manager_sdk.api.patch.search_host_pb2
import patch_manager_sdk.api.patch.search_win_patch_pb2
import patch_manager_sdk.model.easy_command.task_detail_pb2
import patch_manager_sdk.api.patch.update_os_version_pb2
import patch_manager_sdk.api.patch.update_win_patch_pb2
import patch_manager_sdk.utils.http_util
import google.protobuf.json_format
class PatchClient(object):
    def __init__(self, server_ip="", server_port=0, service_name="", host=""):
        """
        Initialize the client.
        :param server_ip: server IP for SDK requests; when empty, requests are routed by the naming service
        :param server_port: server port for SDK requests, used together with server_ip; when 0, requests are routed by the naming service
        :param service_name: service name for SDK requests; when empty, requests are routed by the contract name. If both server_ip and service_name are set, server_ip takes precedence
        :param host: host name of the target service, e.g. cmdb.easyops-only.com
        """
        if server_ip == "" and server_port != 0 or server_ip != "" and server_port == 0:
            raise Exception("server_ip and server_port must be specified together")
        self._server_ip = server_ip
        self._server_port = server_port
        self._service_name = service_name
        self._host = host
    def create_win_patch(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.api.patch.create_win_patch_pb2.CreateWinPatchRequest, int, str, int) -> patch_manager_sdk.api.patch.create_win_patch_pb2.CreateWinPatchResponse
        """
        Create a Windows patch
        :param request: create_win_patch request
        :param org: customer org ID (an integer)
        :param user: user name used to call the API
        :param timeout: request timeout in seconds
        :return: patch_manager_sdk.api.patch.create_win_patch_pb2.CreateWinPatchResponse
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.CreateWinPatch"
        uri = "/api/patch_manager/v1/patch"
        requestParam = request

        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="POST",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = patch_manager_sdk.api.patch.create_win_patch_pb2.CreateWinPatchResponse()
        google.protobuf.json_format.ParseDict(rsp_obj["data"], rsp, ignore_unknown_fields=True)
        return rsp
    def delete_win_patch(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.api.patch.delete_win_patch_pb2.DeleteWinPatchRequest, int, str, int) -> google.protobuf.empty_pb2.Empty
        """
        Delete a Windows patch
        :param request: delete_win_patch request
        :param org: customer org ID (an integer)
        :param user: user name used to call the API
        :param timeout: request timeout in seconds
        :return: google.protobuf.empty_pb2.Empty
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.DeleteWinPatch"
        uri = "/api/patch_manager/v1/patch/{patchId}".format(
            patchId=request.patchId,
        )
        requestParam = request

        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="DELETE",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = google.protobuf.empty_pb2.Empty()
        google.protobuf.json_format.ParseDict(rsp_obj, rsp, ignore_unknown_fields=True)
        return rsp
    def get_host(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.api.patch.get_host_pb2.GetHostRequest, int, str, int) -> patch_manager_sdk.api.patch.get_host_pb2.GetHostResponse
        """
        Get host details
        :param request: get_host request
        :param org: customer org ID (an integer)
        :param user: user name used to call the API
        :param timeout: request timeout in seconds
        :return: patch_manager_sdk.api.patch.get_host_pb2.GetHostResponse
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.GetHost"
        uri = "/api/patch_manager/v1/host/{instanceId}".format(
            instanceId=request.instanceId,
        )
        requestParam = request

        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="GET",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = patch_manager_sdk.api.patch.get_host_pb2.GetHostResponse()
        google.protobuf.json_format.ParseDict(rsp_obj["data"], rsp, ignore_unknown_fields=True)
        return rsp
    def get_os_versions(self, request, org, user, timeout=10):
        # type: (google.protobuf.empty_pb2.Empty, int, str, int) -> patch_manager_sdk.api.patch.get_os_versions_pb2.GetOsVersionsResponse
        """
        Get the OS versions a Windows patch applies to
        :param request: get_os_versions request
        :param org: customer org ID (an integer)
        :param user: user name used to call the API
        :param timeout: request timeout in seconds
        :return: patch_manager_sdk.api.patch.get_os_versions_pb2.GetOsVersionsResponse
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.GetOsVersions"
        uri = "/api/patch_manager/v1/win_patch_os_versions"
        requestParam = request

        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="GET",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = patch_manager_sdk.api.patch.get_os_versions_pb2.GetOsVersionsResponse()
        google.protobuf.json_format.ParseDict(rsp_obj["data"], rsp, ignore_unknown_fields=True)
        return rsp
    def get_win_patch(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.api.patch.get_win_patch_pb2.GetWinPatchRequest, int, str, int) -> patch_manager_sdk.api.patch.get_win_patch_pb2.GetWinPatchResponse
        """
        Get Windows patch details
        :param request: get_win_patch request
        :param org: customer org ID (an integer)
        :param user: user name used to call the API
        :param timeout: request timeout in seconds
        :return: patch_manager_sdk.api.patch.get_win_patch_pb2.GetWinPatchResponse
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.GetWinPatch"
        uri = "/api/patch_manager/v1/win_patch/{patchId}".format(
            patchId=request.patchId,
        )
        requestParam = request

        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="GET",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = patch_manager_sdk.api.patch.get_win_patch_pb2.GetWinPatchResponse()
        google.protobuf.json_format.ParseDict(rsp_obj["data"], rsp, ignore_unknown_fields=True)
        return rsp
    def list_host(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.api.patch.list_host_pb2.ListHostRequest, int, str, int) -> patch_manager_sdk.api.patch.list_host_pb2.ListHostResponse
        """
        Get the host list for the given instance IDs
        :param request: list_host request
        :param org: customer org ID (an integer)
        :param user: user name used to call the API
        :param timeout: request timeout in seconds
        :return: patch_manager_sdk.api.patch.list_host_pb2.ListHostResponse
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.ListHost"
        uri = "/api/patch_manager/v1/host_list"
        requestParam = request

        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="POST",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = patch_manager_sdk.api.patch.list_host_pb2.ListHostResponse()
        google.protobuf.json_format.ParseDict(rsp_obj["data"], rsp, ignore_unknown_fields=True)
        return rsp
    def list_win_patch(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.api.patch.list_win_patch_pb2.ListWinPatchRequest, int, str, int) -> patch_manager_sdk.api.patch.list_win_patch_pb2.ListWinPatchResponse
        """
        Get the Windows patch list for the specified instance IDs
        :param request: the list_win_patch request
        :param org: the customer's org ID, a number
        :param user: the username used to call the API
        :param timeout: call timeout in seconds
        :return: patch_manager_sdk.api.patch.list_win_patch_pb2.ListWinPatchResponse
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.ListWinPatch"
        uri = "/api/patch_manager/v1/patch_list"
        requestParam = request
        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="POST",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = patch_manager_sdk.api.patch.list_win_patch_pb2.ListWinPatchResponse()
        google.protobuf.json_format.ParseDict(rsp_obj["data"], rsp, ignore_unknown_fields=True)
        return rsp
    def search_host(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.api.patch.search_host_pb2.SearchHostRequest, int, str, int) -> patch_manager_sdk.api.patch.search_host_pb2.SearchHostResponse
        """
        Get the host list
        :param request: the search_host request
        :param org: the customer's org ID, a number
        :param user: the username used to call the API
        :param timeout: call timeout in seconds
        :return: patch_manager_sdk.api.patch.search_host_pb2.SearchHostResponse
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.SearchHost"
        uri = "/api/patch_manager/v1/host"
        requestParam = request
        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="GET",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = patch_manager_sdk.api.patch.search_host_pb2.SearchHostResponse()
        google.protobuf.json_format.ParseDict(rsp_obj["data"], rsp, ignore_unknown_fields=True)
        return rsp
    def search_win_patch(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.api.patch.search_win_patch_pb2.SearchWinPatchRequest, int, str, int) -> patch_manager_sdk.api.patch.search_win_patch_pb2.SearchWinPatchResponse
        """
        Search the Windows patch list
        :param request: the search_win_patch request
        :param org: the customer's org ID, a number
        :param user: the username used to call the API
        :param timeout: call timeout in seconds
        :return: patch_manager_sdk.api.patch.search_win_patch_pb2.SearchWinPatchResponse
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.SearchWinPatch"
        uri = "/api/patch_manager/v1/patch"
        requestParam = request
        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="GET",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = patch_manager_sdk.api.patch.search_win_patch_pb2.SearchWinPatchResponse()
        google.protobuf.json_format.ParseDict(rsp_obj["data"], rsp, ignore_unknown_fields=True)
        return rsp
    def update_host_patch_callback(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.model.easy_command.task_detail_pb2.TaskDetail, int, str, int) -> google.protobuf.empty_pb2.Empty
        """
        Callback for syncing the patches installed on a host
        :param request: the update_host_patch_callback request
        :param org: the customer's org ID, a number
        :param user: the username used to call the API
        :param timeout: call timeout in seconds
        :return: google.protobuf.empty_pb2.Empty
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.UpdateHostPatchCallback"
        uri = "/api/patch_manager/v1/host_patch_sync_task"
        requestParam = request
        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="POST",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = google.protobuf.empty_pb2.Empty()
        google.protobuf.json_format.ParseDict(rsp_obj, rsp, ignore_unknown_fields=True)
        return rsp
    def update_os_versions(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.api.patch.update_os_version_pb2.UpdateOsVersionsRequest, int, str, int) -> patch_manager_sdk.api.patch.update_os_version_pb2.UpdateOsVersionsResponse
        """
        Update the operating system versions a Windows patch applies to
        :param request: the update_os_versions request
        :param org: the customer's org ID, a number
        :param user: the username used to call the API
        :param timeout: call timeout in seconds
        :return: patch_manager_sdk.api.patch.update_os_version_pb2.UpdateOsVersionsResponse
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.UpdateOsVersions"
        uri = "/api/patch_manager/v1/win_patch_os_versions"
        requestParam = request
        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="PUT",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = patch_manager_sdk.api.patch.update_os_version_pb2.UpdateOsVersionsResponse()
        google.protobuf.json_format.ParseDict(rsp_obj["data"], rsp, ignore_unknown_fields=True)
        return rsp
    def update_win_patch(self, request, org, user, timeout=10):
        # type: (patch_manager_sdk.api.patch.update_win_patch_pb2.UpdateWinPatchRequest, int, str, int) -> patch_manager_sdk.api.patch.update_win_patch_pb2.UpdateWinPatchResponse
        """
        Update a Windows patch
        :param request: the update_win_patch request
        :param org: the customer's org ID, a number
        :param user: the username used to call the API
        :param timeout: call timeout in seconds
        :return: patch_manager_sdk.api.patch.update_win_patch_pb2.UpdateWinPatchResponse
        """
        headers = {"org": org, "user": user}
        route_name = ""
        server_ip = self._server_ip
        if self._service_name != "":
            route_name = self._service_name
        elif self._server_ip != "":
            route_name = "easyops.api.patch_manager.patch.UpdateWinPatch"
        uri = "/api/patch_manager/v1/patch/{patchId}".format(
            patchId=request.patchId,
        )
        requestParam = request
        rsp_obj = patch_manager_sdk.utils.http_util.do_api_request(
            method="PUT",
            src_name="logic.patch_manager_sdk",
            dst_name=route_name,
            server_ip=server_ip,
            server_port=self._server_port,
            host=self._host,
            uri=uri,
            params=google.protobuf.json_format.MessageToDict(
                requestParam, preserving_proto_field_name=True),
            headers=headers,
            timeout=timeout,
        )
        rsp = patch_manager_sdk.api.patch.update_win_patch_pb2.UpdateWinPatchResponse()
        google.protobuf.json_format.ParseDict(rsp_obj["data"], rsp, ignore_unknown_fields=True)
        return rsp
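Every method above repeats the same dispatch pattern: a configured service name takes priority as the route, otherwise a configured server IP selects the fixed `easyops.api.patch_manager.patch.*` route constant, and path parameters are filled into the URI template with `str.format`. A minimal standalone sketch of that pattern follows; `resolve_route` and `build_uri` are hypothetical helpers written for illustration, not part of the generated SDK.

```python
def resolve_route(service_name, server_ip, default_route):
    """Mirror the route_name selection the SDK methods repeat:
    an explicit service name wins; otherwise a configured server IP
    falls back to the fixed easyops route constant."""
    if service_name != "":
        return service_name
    elif server_ip != "":
        return default_route
    return ""


def build_uri(template, **path_params):
    """Mirror the str.format-based URI templating, e.g. {patchId}."""
    return template.format(**path_params)
```

For example, `build_uri("/api/patch_manager/v1/patch/{patchId}", patchId="KB4524244")` yields the concrete request path, matching how `update_win_patch` formats `request.patchId` into its URI.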
# cave/com.raytheon.viz.gfe/python/autotest/SPW_1_TestScript.py
##
# This software was developed and / or modified by Raytheon Company,
# pursuant to Contract DG133W-05-CQ-1067 with the US Government.
#
# U.S. EXPORT CONTROLLED TECHNICAL DATA
# This software product contains export-restricted data whose
# export/transfer/disclosure is restricted by U.S. law. Dissemination
# to non-U.S. persons whether in the United States or abroad requires
# an export license or other authorization.
#
# Contractor Name: Raytheon Company
# Contractor Address: 6825 Pine Street, Suite 340
# Mail Stop B8
# Omaha, NE 68106
# 402.291.0100
#
# See the AWIPS II Master Rights File ("Master Rights File.pdf") for
# further licensing information.
##
periodVer1 = """Definition["Period_1_version"] = 1"""
periodVer2 = """Definition["Period_1_version"] = 2"""
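Each entry in the `scripts` list below carries the same four fields: a `commentary` string describing the grid setup, the `createGrids` tuples that populate the Fcst database, a `name`, and the `checkStrings` the generated product must contain. A small hedged sketch of a sanity check over that structure; `validate_script` and `REQUIRED_KEYS` are hypothetical helpers for illustration, not part of the autotest framework.

```python
# Keys every test entry in `scripts` is expected to define.
REQUIRED_KEYS = {"commentary", "createGrids", "name", "checkStrings"}


def validate_script(entry):
    """Return the set of required keys missing from one test entry."""
    return REQUIRED_KEYS - set(entry)
```

Running `validate_script` over each dict before executing the suite would surface a malformed entry up front instead of mid-run.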
scripts = [
{
'commentary':'"""\nPrecip:LE\nSky:LE1\nNonPrecip:LE2\nPoP:LE\nConsolidation:noLE\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_1756',
'checkStrings':[
"Windward, mostly sunny, Showers likely and slight chance of snow showers in the morning, Patchy fog through the day, Chance of precipitation 40 percent.",
"Leeward, mostly cloudy with slight chance of snow showers in the morning, then showers likely in the afternoon, Patchy fog in the morning, then areas of fog in the afternoon, Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:noLE\nSky:LE1\nNonPrecip:null\nPoP:LE1\nConsolidation:noLE\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_166',
'checkStrings':[
"Windward, mostly sunny, Chance of showers and slight chance of snow showers in the morning, then chance of showers in the afternoon.",
"Leeward, mostly cloudy with chance of showers and slight chance of snow showers in the morning, then mostly sunny with chance of showers in the afternoon.",
"Chance of precipitation 40 percent.",
],
},
{
'commentary':'"""\nPrecip:noLE\nSky:LE1\nNonPrecip:null\nPoP:LE2\nConsolidation:LE\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Lkly:SW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:SW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_175',
'checkStrings':[
"Windward, mostly sunny, Chance of showers and slight chance of snow showers in the morning, then chance of showers in the afternoon.",
"Leeward, mostly cloudy with snow showers likely and chance of showers in the morning, then mostly sunny with chance of showers in the afternoon.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:noLE\nSky:LE1\nNonPrecip:LE\nPoP:LE2\nConsolidation:LE2\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_193',
'checkStrings':[
"Windward, mostly sunny, Chance of showers in the morning, then chance of showers and slight chance of snow showers in the afternoon, Patchy fog through the day.",
"Leeward, mostly cloudy with chance of showers in the morning, then mostly sunny with snow showers likely and chance of showers in the afternoon, Areas of fog through the day.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:noLE\nSky:LE1\nNonPrecip:LE\nPoP:LE\nConsolidation:noLE\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : SChc:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_196',
'checkStrings':[
"Windward, mostly sunny, Chance of showers and slight chance of snow showers in the morning, then chance of showers in the afternoon, Patchy fog through the day, Chance of precipitation 40 percent.",
"Leeward, mostly cloudy with chance of showers and slight chance of snow showers in the morning, then mostly sunny with chance of showers in the afternoon, Areas of fog through the day, Chance of precipitation 50 percent.",
],
},
{
'commentary':'"""\nPrecip:noLE\nSky:LE2\nNonPrecip:null\nPoP:LE1\nConsolidation:null\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 87 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 87, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_269',
'checkStrings':[
"Leeward, mostly sunny in the morning then becoming mostly cloudy, A 40 percent chance of showers.",
"Windward, mostly sunny with a 40 percent chance of showers.",
],
},
{
'commentary':'"""\nPrecip:noLE\nSky:LE2\nNonPrecip:null\nPoP:LE2\nConsolidation:LE1\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 87 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Lkly:SW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 87, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:SW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_272',
'checkStrings':[
"Windward, mostly sunny, Chance of showers and slight chance of snow showers in the morning, then chance of showers in the afternoon.",
"Leeward, mostly sunny with snow showers likely and chance of showers in the morning, then mostly cloudy with chance of showers in the afternoon.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:noLE\nSky:LE2\nNonPrecip:LE\nPoP:LE2\nConsolidation:LE2\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 87 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 87, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_293',
'checkStrings':[
"Windward, mostly sunny, Chance of showers in the morning, then chance of showers and slight chance of snow showers in the afternoon, Patchy fog through the day.",
"Leeward, mostly sunny with chance of showers in the morning, then mostly cloudy with snow showers likely and chance of showers in the afternoon, Areas of fog through the day.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:noLE\nSky:LE\nNonPrecip:null\nPoP:LE1\nConsolidation:LE\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 87 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Lkly:SW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 87, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:SW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_370',
'checkStrings':[
"Windward, mostly sunny, Chance of showers and slight chance of snow showers in the morning.",
"Leeward, mostly cloudy, Snow showers likely and chance of showers in the morning.",
"Chance of showers in the afternoon.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:LE1\nSky:noLE\nNonPrecip:noLE\nPoP:noLE\nConsolidation:null\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\nNOTE: The Pop is missing for Leeward because the data is set up so there\nis a local effect for SPW but not for PoP. This should not be the case\nin practice."""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_404',
'checkStrings':[
"Mostly sunny.",
"Windward, a 40 percent chance of showers.",
"Leeward, showers likely in the morning, then chance of showers in the afternoon.",
"Patchy fog through the day.",
],
},
{
'commentary':'"""\nPrecip:LE1\nSky:noLE\nNonPrecip:LE\nPoP:LE1\nConsolidation:LE2\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_488',
'checkStrings':[
"Windward, chance of showers in the morning, then chance of showers and slight chance of snow showers in the afternoon, Patchy fog through the day.",
"Leeward, showers likely in the morning, then snow showers likely and chance of showers in the afternoon, Areas of fog through the day.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:LE1\nSky:LE1\nNonPrecip:LE1\nPoP:noLE\nConsolidation:LE1\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_522',
'checkStrings':[
"Windward, mostly sunny, Chance of showers and slight chance of snow showers in the morning, then chance of showers in the afternoon, Patchy fog through the day.",
"Leeward, mostly cloudy with showers and snow showers likely in the morning, then mostly sunny with chance of showers in the afternoon, Areas of fog in the morning, then patchy fog in the afternoon.",
"Chance of precipitation 40 percent.",
],
},
{
'commentary':'"""\nPrecip:LE1\nSky:LE1\nNonPrecip:LE1\nPoP:LE2\nConsolidation:LE1\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_532',
'checkStrings':[
"Windward, mostly sunny, Chance of showers and slight chance of snow showers in the morning, then chance of showers in the afternoon, Patchy fog through the day.",
"Leeward, mostly cloudy with showers and snow showers likely in the morning, then mostly sunny with chance of showers in the afternoon, Areas of fog in the morning, then patchy fog in the afternoon.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:LE1\nSky:LE1\nNonPrecip:LE2\nPoP:noLE\nConsolidation:null\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\nNOTE: The Pop is missing for Leeward because the data is set up so there\nis a local effect for SPW but not for PoP. This should not be the case\nin practice."""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_544',
'checkStrings':[
"Windward, mostly sunny with a 40 percent chance of showers, Patchy fog.",
"Leeward, mostly cloudy with showers likely in the morning, then mostly sunny with chance of showers in the afternoon, Patchy fog in the morning, then areas of fog in the afternoon.",
],
},
{
'commentary':'"""\nPrecip:LE1\nSky:LE2\nNonPrecip:LE1\nPoP:LE2\nConsolidation:LE2\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 87 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Lkly:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 87, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Lkly:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_633',
'checkStrings':[
"Windward, mostly sunny, Chance of showers in the morning, then chance of showers and slight chance of snow showers in the afternoon, Patchy fog through the day.",
"Leeward, mostly sunny with showers likely in the morning, then mostly cloudy with snow showers likely and chance of showers in the afternoon, Areas of fog in the morning, then patchy fog in the afternoon.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:LE2\nSky:noLE\nNonPrecip:LE1\nPoP:LE2\nConsolidation:LE2\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Lkly:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Lkly:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_833',
'checkStrings':[
"Chance of showers in the morning.",
"Windward, patchy fog through the day, Chance of showers and slight chance of snow showers in the afternoon.",
"Leeward, areas of fog in the morning, then patchy fog in the afternoon, Showers and snow showers likely in the afternoon.",
"Chance of precipitation 70 percent",
],
},
{
'commentary':'"""\nPrecip:LE2\nSky:noLE\nNonPrecip:null\nPoP:LE\nConsolidation:LE1\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Lkly:SW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:SW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Lkly:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_877',
'checkStrings':[
"Windward, chance of showers and slight chance of snow showers in the morning, then chance of showers in the afternoon, Chance of precipitation 40 percent.",
"Leeward, snow showers likely and chance of showers in the morning, then showers likely in the afternoon, Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:LE2\nSky:noLE\nNonPrecip:LE\nPoP:noLE\nConsolidation:LE\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_885',
'checkStrings':[
"Mostly sunny. Windward, chance of showers and slight chance of snow showers in the morning, then chance of showers in the afternoon, Patchy fog through the day.",
"Leeward, snow showers likely and chance of showers in the morning, then showers likely in the afternoon, Areas of fog through the day.",
"Chance of precipitation 40 percent.",
],
},
{
'commentary':'"""\nPrecip:LE2\nSky:LE2\nNonPrecip:noLE\nPoP:LE2\nConsolidation:LE1\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 87 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Lkly:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 87, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_1012',
'checkStrings':[
"Windward, mostly sunny, Chance of showers and slight chance of snow showers in the morning, then chance of showers in the afternoon.",
"Leeward, mostly sunny with snow showers likely and chance of showers in the morning, then mostly cloudy with showers likely in the afternoon.",
"Patchy fog.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:LE2\nSky:LE2\nNonPrecip:LE1\nPoP:LE1\nConsolidation:noLE\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 87 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : SChc:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 87, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_1026',
'checkStrings':[
"Windward, mostly sunny, Chance of showers and slight chance of snow showers in the morning, then chance of showers in the afternoon, Patchy fog through the day.",
"Leeward, mostly sunny with chance of showers and slight chance of snow showers in the morning, then mostly cloudy with showers likely in the afternoon, Areas of fog in the morning, then patchy fog in the afternoon.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:LE2\nSky:LE\nNonPrecip:LE2\nPoP:LE2\nConsolidation:LE2\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 87 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>: : [\'BelowElev\']\nNOTE: The morning weather for Area 1 appears out of order. This is because\nthere is a local effect for the sky phrase (Windward) which appears before weather phrases\nso orderLocalEffectPhrases groups all the windward phrases first."""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 87, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_1153',
'checkStrings':[
"Windward, mostly sunny, Patchy fog through the day, Chance of showers and slight chance of snow showers in the afternoon.",
"Chance of showers in the morning.",
"Leeward, mostly cloudy, Patchy fog in the morning, then areas of fog in the afternoon, Showers and snow showers likely in the afternoon.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:null\nSky:LE1\nNonPrecip:null\nPoP:LE\nConsolidation:LE1\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Lkly:SW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:SW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
],
'name':'SPW_1_1377',
'checkStrings':[
"Windward, mostly sunny, A 20 percent chance of snow showers in the morning.",
"Leeward, snow showers likely in the morning, then mostly sunny in the afternoon, Chance of snow 70 percent.",
],
},
{
'commentary':'"""\nPrecip:null\nSky:LE1\nNonPrecip:LE\nPoP:LE\nConsolidation:null\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Patchy:F:<NoInten>:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Areas:F:<NoInten>:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Areas:F:<NoInten>:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Patchy:F:<NoInten>:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Areas:F:<NoInten>:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Areas:F:<NoInten>:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_1399',
'checkStrings':[
"Windward, mostly sunny, Patchy fog.",
"Leeward, mostly cloudy in the morning then becoming mostly sunny, Areas of fog.",
],
},
{
'commentary':'"""\nPrecip:LE\nSky:noLE\nNonPrecip:null\nPoP:LE1\nConsolidation:LE2\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 50 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 80 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : SChc:SW:-:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Lkly:SW:-:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 80, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'SChc:SW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Lkly:SW:-:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_1668',
'checkStrings':[
"Windward, showers likely in the morning, then slight chance of snow showers in the afternoon.",
"Leeward, showers and snow showers likely in the afternoon.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:LE\nSky:LE1\nNonPrecip:LE1\nPoP:LE2\nConsolidation:LE\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 50 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 80 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>: : [\'AboveElev\']\n6-12 : Wx : Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 80, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'Lkly:SW:-:<NoVis>:^Areas:F:<NoInten>:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Patchy:F:<NoInten>:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_1735',
'checkStrings':[
"Windward, mostly sunny, Showers likely and slight chance of snow showers in the morning, Patchy fog through the day.",
"Leeward, mostly cloudy with snow showers likely in the morning, then mostly sunny with showers likely in the afternoon, Areas of fog in the morning, then patchy fog in the afternoon.",
"Chance of precipitation 70 percent.",
],
},
{
'commentary':'"""\nPrecip:LE\nSky:LE\nNonPrecip:null\nPoP:noLE\nConsolidation:noLE\n\n0-6 : Sky : 50 : [\'AboveElev\']\n0-6 : Sky : 87 : [\'BelowElev\']\n6-12 : Sky : 50 : [\'AboveElev\']\n6-12 : Sky : 87 : [\'BelowElev\']\n0-6 : PoP : 40 : [\'AboveElev\']\n0-6 : PoP : 40 : [\'BelowElev\']\n6-12 : PoP : 40 : [\'AboveElev\']\n6-12 : PoP : 40 : [\'BelowElev\']\n0-6 : Wx : NoWx : \'all\'\n0-6 : Wx : SChc:SW:-:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'AboveElev\']\n0-6 : Wx : SChc:SW:-:<NoVis>: : [\'BelowElev\']\n6-12 : Wx : NoWx : \'all\'\n6-12 : Wx : Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>: : [\'BelowElev\']\n"""',
'createGrids':[
('Fcst', 'Sky', 'SCALAR', 0, 6, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 0, 6, 87, ['BelowElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 50, ['AboveElev']),
('Fcst', 'Sky', 'SCALAR', 6, 12, 87, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 0, 6, 40, ['BelowElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['AboveElev']),
('Fcst', 'PoP', 'SCALAR', 6, 12, 40, ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:^Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['AboveElev']),
('Fcst', 'Wx', 'WEATHER', 0, 6, 'SChc:SW:-:<NoVis>:', ['BelowElev']),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'NoWx', 'all'),
('Fcst', 'Wx', 'WEATHER', 6, 12, 'Lkly:RW:-:<NoVis>:^Chc:RW:-:<NoVis>:', ['BelowElev']),
],
'name':'SPW_1_1961',
'checkStrings':[
"Windward, mostly sunny, Showers likely and slight chance of snow showers in the morning.",
"Leeward, mostly cloudy, Slight chance of snow showers in the morning, then showers likely in the afternoon.",
"Chance of precipitation 40 percent.",
],
},
]
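# Illustrative sketch only (not used by the test driver below): every
# createGrids entry above is a tuple of
#   (model, weather element, grid type, start hour, end hour, value, edit areas).
# A hypothetical builder for the SCALAR entries makes that shape explicit.

```python
def scalar_grid(element, start_hour, end_hour, value, edit_areas):
    """Build a ('Fcst', element, 'SCALAR', ...) tuple as used in createGrids."""
    return ('Fcst', element, 'SCALAR', start_hour, end_hour, value, edit_areas)

# Matches the hand-written entries, e.g. the first Sky grid in SPW_1_522:
example = scalar_grid('Sky', 0, 6, 50, ['AboveElev'])
```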
import TestScript
def testScript(self, dataMgr):
defaults = {
"cmdLineVars" :"{('Product Issuance', 'productIssuance'): 'Morning', ('Issuance Type', 'issuanceType'): 'ROUTINE', ('Issued By', 'issuedBy'): None}",
"productType": "Phrase_Test_Local",
"fileChanges" : [
("Phrase_Test_Local", "TextUtility", "replace", (periodVer1, periodVer2), "undo")
],
}
return TestScript.generalTestScript(self, dataMgr, scripts, defaults)
# coding: utf-8
"""
PocketSmith
The public PocketSmith API # noqa: E501
The version of the OpenAPI document: 2.0
Contact: api@pocketsmith.com
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from pocketsmith.api_client import ApiClient
from pocketsmith.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class TransactionAccountsApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def get_transaction_account(self, id, **kwargs): # noqa: E501
"""Get transaction account # noqa: E501
Gets a transaction account by its ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_transaction_account(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param int id: The unique identifier of the transaction account. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: TransactionAccount
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_transaction_account_with_http_info(id, **kwargs) # noqa: E501
def get_transaction_account_with_http_info(self, id, **kwargs): # noqa: E501
"""Get transaction account # noqa: E501
Gets a transaction account by its ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_transaction_account_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param int id: The unique identifier of the transaction account. (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(TransactionAccount, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_transaction_account" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `get_transaction_account`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['developerKey'] # noqa: E501
return self.api_client.call_api(
'/transaction_accounts/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TransactionAccount', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def list_transaction_accounts(self, id, **kwargs): # noqa: E501
"""List transaction accounts in user # noqa: E501
List all transaction accounts belonging to a user. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_transaction_accounts(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param int id: The unique identifier of the user. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: list[TransactionAccount]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.list_transaction_accounts_with_http_info(id, **kwargs) # noqa: E501
def list_transaction_accounts_with_http_info(self, id, **kwargs): # noqa: E501
"""List transaction accounts in user # noqa: E501
List all transaction accounts belonging to a user. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_transaction_accounts_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param int id: The unique identifier of the user. (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(list[TransactionAccount], status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method list_transaction_accounts" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `list_transaction_accounts`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['developerKey'] # noqa: E501
return self.api_client.call_api(
'/users/{id}/transaction_accounts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[TransactionAccount]', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def update_transaction_account(self, id, **kwargs): # noqa: E501
"""Update transaction account # noqa: E501
Change which institution the transaction account belongs to. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_transaction_account(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param int id: The unique identifier of the transaction account. (required)
:param InlineObject5 inline_object5:
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: TransactionAccount
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.update_transaction_account_with_http_info(id, **kwargs) # noqa: E501
def update_transaction_account_with_http_info(self, id, **kwargs): # noqa: E501
"""Update transaction account # noqa: E501
Change which institution the transaction account belongs to. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_transaction_account_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param int id: The unique identifier of the transaction account. (required)
:param InlineObject5 inline_object5:
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(TransactionAccount, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'id',
'inline_object5'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method update_transaction_account" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `update_transaction_account`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'inline_object5' in local_var_params:
body_params = local_var_params['inline_object5']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['developerKey'] # noqa: E501
return self.api_client.call_api(
'/transaction_accounts/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TransactionAccount', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
| 42.592308 | 126 | 0.593944 | 1,807 | 16,611 | 5.222468 | 0.10404 | 0.040691 | 0.056374 | 0.028611 | 0.912366 | 0.908763 | 0.905372 | 0.898167 | 0.89361 | 0.89361 | 0 | 0.015198 | 0.334537 | 16,611 | 389 | 127 | 42.701799 | 0.83852 | 0.462525 | 0 | 0.711111 | 1 | 0 | 0.164168 | 0.05845 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038889 | false | 0 | 0.027778 | 0 | 0.105556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
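The generated methods above all share one validation pattern: collect `locals()`, extend a whitelist of accepted parameter names, and reject any unexpected keyword argument before dispatching the request. A minimal, self-contained sketch of that pattern (the helper name `call_with_checked_kwargs` is invented for illustration and is not part of the pocketsmith package):

```python
def call_with_checked_kwargs(required, optional, **kwargs):
    """Reject unexpected keyword arguments, mirroring the whitelist
    check in the generated OpenAPI client methods (simplified)."""
    allowed = set(required) | set(optional)
    for key in kwargs:
        if key not in allowed:
            # The generated client raises ApiTypeError with this message
            raise TypeError("Got an unexpected keyword argument '%s'" % key)
    # A real client would now build path/query/header params and make
    # the HTTP request; here we just return the accepted kwargs.
    return kwargs
```

For example, `call_with_checked_kwargs(["id"], ["async_req"], id=42, async_req=True)` returns the kwargs unchanged, while passing `bogus=1` raises `TypeError`, just as the generated methods raise `ApiTypeError`.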
01d728c81b8794ac4e7181ff32c51f0d15885ad0 | 5,172 | py | Python | test/ll_combinator/test_reserved.py | BloggerBust/bbpyp | 078f940dd38bc3ee7c5adcfb2555c2843a4ca57b | [
"Apache-2.0"
] | null | null | null | test/ll_combinator/test_reserved.py | BloggerBust/bbpyp | 078f940dd38bc3ee7c5adcfb2555c2843a4ca57b | [
"Apache-2.0"
] | null | null | null | test/ll_combinator/test_reserved.py | BloggerBust/bbpyp | 078f940dd38bc3ee7c5adcfb2555c2843a4ca57b | [
"Apache-2.0"
] | null | null | null | import unittest
from mock import patch

from bbpyp.ll_combinator.reserved import Reserved


@patch('test.TestContext', create=True)
class TestReserved(unittest.TestCase):

    def test_reserved_initialized_as_expected(self, test_context):
        expected_value = test_context.value
        expected_tag = test_context.tag

        parser = Reserved(test_context.tag, test_context.value, test_context.concat_factory,
                          test_context.lhs_or_rhs_factory, test_context.expression_factory, test_context.apply_factory, test_context.greedy_factory, test_context.defer_factory, source_format_service=test_context.source_format_service, context_service=test_context.context_service)

        self.assertIs(expected_value, parser.value)
        self.assertIs(expected_tag, parser.tag)

    def test_reserved_string_representation_is_as_expected(self, test_context):
        expected_representation = f"{Reserved.__name__}({test_context.tag}, {test_context.value})"

        parser = Reserved(test_context.tag, test_context.value, test_context.concat_factory,
                          test_context.lhs_or_rhs_factory, test_context.expression_factory, test_context.apply_factory, test_context.greedy_factory, test_context.defer_factory, source_format_service=test_context.source_format_service, context_service=test_context.context_service)

        self.assertEqual(expected_representation, f"{parser}")

    def test_reserved_call_with_none_matching_token_tag_returns_None(self, test_context):
        tokens = [("KEYWORD", "while"), ("SYNTAX", "("), ("SYNTAX", ")")]
        position = 0

        parser = Reserved("SYNTAX", "while", test_context.concat_factory,
                          test_context.lhs_or_rhs_factory, test_context.expression_factory, test_context.apply_factory, test_context.greedy_factory, test_context.defer_factory, source_format_service=test_context.source_format_service, context_service=test_context.context_service)
        result = parser(tokens, position)

        self.assertEqual(position, result.position)
        self.assertIsNone(result.value)

    def test_reserved_call_with_none_matching_token_value_returns_None(self, test_context):
        tokens = [("KEYWORD", "while"), ("SYNTAX", "("), ("SYNTAX", ")")]
        position = 0

        parser = Reserved("KEYWORD", "(", test_context.concat_factory,
                          test_context.lhs_or_rhs_factory, test_context.expression_factory, test_context.apply_factory, test_context.greedy_factory, test_context.defer_factory, source_format_service=test_context.source_format_service, context_service=test_context.context_service)
        result = parser(tokens, position)

        self.assertEqual(position, result.position)
        self.assertIsNone(result.value)

    def test_reserved_call_with_invalid_token_returns_None(self, test_context):
        tokens = [("KEYWORD", "while"), ("SYNTAX", "("), ("SYNTAX", ")")]
        position = 0

        parser = Reserved(test_context.invalid_token_tag, test_context.invalid_token_value,
                          test_context.concat_factory, test_context.lhs_or_rhs_factory, test_context.expression_factory, test_context.apply_factory, test_context.greedy_factory, test_context.defer_factory, source_format_service=test_context.source_format_service, context_service=test_context.context_service)
        result = parser(tokens, position)

        self.assertEqual(position, result.position)
        self.assertIsNone(result.value)

    def test_reserved_call_with_matching_token_returns_result_with_matched_token_value_and_position_incremented_by_1(self, test_context):
        expected_return_value = "("
        expected_return_position = 2
        tokens = [("KEYWORD", "while"), ("SYNTAX", "("), ("SYNTAX", ")")]
        position = 1

        parser = Reserved("SYNTAX", "(", test_context.concat_factory,
                          test_context.lhs_or_rhs_factory, test_context.expression_factory, test_context.apply_factory, test_context.greedy_factory, test_context.defer_factory, source_format_service=test_context.source_format_service, context_service=test_context.context_service)
        result = parser(tokens, position)

        self.assertEqual(expected_return_value, result.value)
        self.assertEqual(expected_return_position, result.position)

    def test_reserved_adding_two_reserved_parsers_should_produce_a_concatonation(self, test_context):
        reserved1 = Reserved("SYNTAX", "+", test_context.concat_factory,
                             test_context.lhs_or_rhs_factory, test_context.expression_factory, test_context.apply_factory, test_context.greedy_factory, test_context.defer_factory, source_format_service=test_context.source_format_service, context_service=test_context.context_service)
        reserved2 = Reserved("SYNTAX", "+", test_context.concat_factory,
                             test_context.lhs_or_rhs_factory, test_context.expression_factory, test_context.apply_factory, test_context.greedy_factory, test_context.defer_factory, source_format_service=test_context.source_format_service, context_service=test_context.context_service)

        parser = reserved1 + reserved2
| 65.468354 | 309 | 0.758121 | 615 | 5,172 | 5.926829 | 0.121951 | 0.244444 | 0.197531 | 0.052675 | 0.778875 | 0.778875 | 0.742112 | 0.742112 | 0.726475 | 0.726475 | 0 | 0.002282 | 0.152746 | 5,172 | 78 | 310 | 66.307692 | 0.82953 | 0 | 0 | 0.45614 | 0 | 0 | 0.04447 | 0.011601 | 0 | 0 | 0 | 0 | 0.192982 | 1 | 0.122807 | false | 0 | 0.052632 | 0 | 0.192982 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
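The behaviour these tests exercise can be distilled into a self-contained toy matcher. `ReservedSketch` below is hypothetical: the real `Reserved` takes several factory dependencies (`concat_factory`, `defer_factory`, and so on) that the sketch omits, keeping only the matching semantics the assertions check:

```python
from collections import namedtuple

# Result mirrors what the tests read back: a matched value (or None)
# and the parse position after the attempt.
Result = namedtuple("Result", ["value", "position"])


class ReservedSketch:
    """Toy stand-in for the Reserved combinator: match exactly one
    (tag, value) token at the current position."""

    def __init__(self, tag, value):
        self.tag = tag
        self.value = value

    def __call__(self, tokens, position):
        # On a match, return the token value with position advanced by 1;
        # otherwise return value None with the position unchanged.
        if position < len(tokens) and tokens[position] == (self.tag, self.value):
            return Result(self.value, position + 1)
        return Result(None, position)
```

With `tokens = [("KEYWORD", "while"), ("SYNTAX", "("), ("SYNTAX", ")")]`, `ReservedSketch("SYNTAX", "(")(tokens, 1)` yields `Result("(", 2)`, while a mismatched tag or value leaves the position untouched and the value `None`, matching the expectations in the tests above.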
bf33fe300b67470467f4f9f3a0bde7a4169a6666 | 136 | py | Python | discord/types/threads.py | kuzaku-developers/disnake | 61cc1ad4c2bafd39726a1447c85f7e469e41af10 | [
"MIT"
] | null | null | null | discord/types/threads.py | kuzaku-developers/disnake | 61cc1ad4c2bafd39726a1447c85f7e469e41af10 | [
"MIT"
] | null | null | null | discord/types/threads.py | kuzaku-developers/disnake | 61cc1ad4c2bafd39726a1447c85f7e469e41af10 | [
"MIT"
] | null | null | null | from disnake.types.threads import *
from disnake.types.threads import __dict__ as __original_dict__
locals().update(__original_dict__)
| 27.2 | 63 | 0.838235 | 18 | 136 | 5.555556 | 0.555556 | 0.22 | 0.32 | 0.46 | 0.58 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088235 | 136 | 4 | 64 | 34 | 0.806452 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
bd894ef562b46245a7918a808836cc573cb48887 | 4,300 | py | Python | backend/account/tests.py | B-sudo/USTC-Software-2019-BE-Test | e6e806ded62ea4a60fc62be31fa3815341a2386f | [
"MIT"
] | null | null | null | backend/account/tests.py | B-sudo/USTC-Software-2019-BE-Test | e6e806ded62ea4a60fc62be31fa3815341a2386f | [
"MIT"
] | null | null | null | backend/account/tests.py | B-sudo/USTC-Software-2019-BE-Test | e6e806ded62ea4a60fc62be31fa3815341a2386f | [
"MIT"
] | null | null | null | from django.test import TestCase
from django.test import Client
from .models import LoginForm, RegisterForm, User
import json


class AccountTests(TestCase):
    def setUp(self):
        self.user = User()
        self.user.name = 'bill'
        self.user.password = '123456'
        self.user.sex = 'male'
        self.user.email = '1111@qq.com'
        self.user.save()

    def test_index(self):
        self.client = Client()
        response = self.client.get('http://127.0.0.1:8000/account/')
        self.assertEqual(response.status_code, 200)
        self.assertEqual(json.loads(response.content)["err_code"], "000")

    def test_login(self):
        response = self.client.get('http://127.0.0.1:8000/account/login')
        self.assertEqual(response.status_code, 200)
        data = {"username": "bill", "password": "123456"}
        response = self.client.post('http://127.0.0.1:8000/account/login', data)
        self.assertEqual(response.status_code, 200)
        data = {"username": "bill", "password": "12345"}
        response = self.client.post('http://127.0.0.1:8000/account/login', data)
        self.assertEqual(json.loads(response.content)["err_code"], "102")
        data = {"username": "david", "password": "123456"}
        response = self.client.post('http://127.0.0.1:8000/account/login', data)
        self.assertEqual(json.loads(response.content)["err_code"], "103")

    def test_register(self):
        response = self.client.get('http://127.0.0.1:8000/account/register')
        self.assertEqual(response.status_code, 200)
        data = {"username": "david",
                "password": "23456",
                "repassword": "23456",
                "sex": "male",
                "email": "12345@qq.com"}
        response = self.client.post('http://127.0.0.1:8000/account/register', data)
        self.assertEqual(json.loads(response.content)["err_code"], "201")
        data = {"username": "david",
                "password": "23456",
                "repassword": "123456",
                "sex": "male",
                "email": "12345@qq.com"}
        response = self.client.post('http://127.0.0.1:8000/account/register', data)
        self.assertEqual(json.loads(response.content)["err_code"], "203")

    def test_logout(self):
        session = self.client.session
        session['username'] = 'bill'
        session.save()
        response = self.client.get('http://127.0.0.1:8000/account/logout')
        self.assertEqual(response.status_code, 200)
        self.assertEqual(json.loads(response.content)["err_code"], "300")  # borrowed this approach from another implementation

    def test_user_index_invalid(self):
        response = self.client.get('http://127.0.0.1:8000/account/user_index')
        self.assertEqual(json.loads(response.content)["err_code"], "401")

    def test_user_index_valid(self):
        session = self.client.session
        session['username'] = 'bill'
        session.save()
        response = self.client.get('http://127.0.0.1:8000/account/user_index')
        self.assertEqual(json.loads(response.content)["err_code"], "400")
        self.test_logout()

    def test_update_user_index_invalid(self):
        response = self.client.get('http://127.0.0.1:8000/account/update_user_index')
        self.assertEqual(response.status_code, 200)
        self.assertEqual(json.loads(response.content)["err_code"], "501")

    def test_update_user_index_valid(self):
        session = self.client.session
        session['username'] = 'bill'
        session.save()
        response = self.client.get('http://127.0.0.1:8000/account/update_user_index')
        self.assertEqual(response.status_code, 200)
        self.assertEqual(json.loads(response.content)["err_code"], "500")
        data = {
            "new_password": "23456",
            "re_new_password": "23456",
            "new_sex": "male",
            "new_email": "12345@qq.com"}
        response = self.client.post('http://127.0.0.1:8000/account/update_user_index', data)
        self.assertEqual(json.loads(response.content)["err_code"], "502")
        data = {
            "new_password": "23456",
            "re_new_password": "123456",
            "new_sex": "male",
            "new_email": "12345@qq.com"}
        response = self.client.post('http://127.0.0.1:8000/account/update_user_index', data)
        self.assertEqual(json.loads(response.content)["err_code"], "503")

# Create your tests here.
| 34.4 | 89 | 0.626279 | 547 | 4,300 | 4.817185 | 0.13894 | 0.072106 | 0.102467 | 0.051233 | 0.827704 | 0.817837 | 0.79203 | 0.766983 | 0.743833 | 0.743833 | 0 | 0.086474 | 0.203953 | 4,300 | 124 | 90 | 34.677419 | 0.683319 | 0.007442 | 0 | 0.488636 | 0 | 0 | 0.264651 | 0 | 0 | 0 | 0 | 0 | 0.215909 | 1 | 0.102273 | false | 0.136364 | 0.045455 | 0 | 0.159091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
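Every assertion in the tests above follows the same convention: decode the response body as JSON and compare the application-level `err_code` string (e.g. "000" for success, "102" for a wrong password). That repeated step can be factored into a small helper; `err_code` below is a hypothetical utility sketched for illustration, not part of the project:

```python
import json


def err_code(response_content):
    """Pull the application-level error code out of a JSON response body,
    the convention the account tests assert on (e.g. "000" for success)."""
    return json.loads(response_content)["err_code"]
```

With it, an assertion like `self.assertEqual(json.loads(response.content)["err_code"], "102")` collapses to `self.assertEqual(err_code(response.content), "102")`.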
bd92f9b11d1dfb37d9c873c6c55626d79defc919 | 28,699 | py | Python | tests/circuit_graph/test_circuit_graph_hash.py | DanielPolatajko/pennylane | d603e810a4d34d727a436d852c540fdc0fe21a85 | [
"Apache-2.0"
] | 1 | 2021-02-18T02:14:27.000Z | 2021-02-18T02:14:27.000Z | tests/circuit_graph/test_circuit_graph_hash.py | markhop20/pennylane | 8792f0f88178f70a04d6f7afbbb9dd90d2e758b3 | [
"Apache-2.0"
] | null | null | null | tests/circuit_graph/test_circuit_graph_hash.py | markhop20/pennylane | 8792f0f88178f70a04d6f7afbbb9dd90d2e758b3 | [
"Apache-2.0"
] | null | null | null | # Copyright 2018-2020 Xanadu Quantum Technologies Inc.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Unit and integration tests for creating the :mod:`pennylane` :attr:`QNode.circuit.hash` attribute.
"""
import pytest
import numpy as np
import pennylane as qml
from pennylane.operation import Tensor
from pennylane.circuit_graph import CircuitGraph
from pennylane.qnodes import BaseQNode
from pennylane.variable import Variable
from pennylane.wires import Wires
pytestmark = pytest.mark.usefixtures("tape_mode")
class TestCircuitGraphHash:
"""Test the creation of a hash on a CircuitGraph"""
numeric_queues = [
([
qml.RX(0.3, wires=[0])
],
[],
'RX!0.3![0]|||'
),
([
qml.RX(0.3, wires=[0]),
qml.RX(0.4, wires=[1]),
qml.RX(0.5, wires=[2]),
],
[],
'RX!0.3![0]RX!0.4![1]RX!0.5![2]|||'
)
]
@pytest.mark.parametrize("queue, observable_queue, expected_string", numeric_queues)
def test_serialize_numeric_arguments(self, queue, observable_queue, expected_string):
"""Tests that the same hash is created for two circuitgraphs that have numeric arguments."""
circuit_graph_1 = CircuitGraph(queue + observable_queue, {}, Wires([0, 1, 2]))
circuit_graph_2 = CircuitGraph(queue + observable_queue, {}, Wires([0, 1, 2]))
assert circuit_graph_1.serialize() == circuit_graph_2.serialize()
assert expected_string == circuit_graph_1.serialize()
variable = Variable(1)
symbolic_queue = [
([qml.RX(variable, wires=[0])],
[],
'RX!V1![0]|||'
),
]
@pytest.mark.parametrize("queue, observable_queue, expected_string", symbolic_queue)
def test_serialize_symbolic_argument(self, queue, observable_queue, expected_string):
"""Tests that the same hash is created for two circuitgraphs that have symbolic arguments."""
circuit_graph_1 = CircuitGraph(queue + observable_queue, {}, Wires([0]))
circuit_graph_2 = CircuitGraph(queue + observable_queue, {}, Wires([0]))
assert circuit_graph_1.serialize() == circuit_graph_2.serialize()
assert expected_string == circuit_graph_1.serialize()
variable = Variable(1)
symbolic_queue = [
([
qml.RX(variable, wires=[0]),
qml.RX(0.3, wires=[1]),
qml.RX(variable, wires=[2])
],
[],
'RX!V1![0]RX!0.3![1]RX!V1![2]|||'
),
]
@pytest.mark.parametrize("queue, observable_queue, expected_string", symbolic_queue)
def test_serialize_numeric_and_symbolic_argument(self, queue, observable_queue, expected_string):
"""Tests that the same hash is created for two circuitgraphs that have both numeric and symbolic arguments."""
circuit_graph_1 = CircuitGraph(queue + observable_queue, {}, Wires([0, 1, 2]))
circuit_graph_2 = CircuitGraph(queue + observable_queue, {}, Wires([0, 1, 2]))
assert circuit_graph_1.serialize() == circuit_graph_2.serialize()
assert expected_string == circuit_graph_1.serialize()
variable = Variable(1)
many_symbolic_queue = [
([
qml.RX(variable, wires=[0]),
qml.RX(variable, wires=[1])
],
[],
'RX!V1![0]' +
'RX!V1![1]' +
'|||'
),
]
@pytest.mark.parametrize("queue, observable_queue, expected_string", many_symbolic_queue)
def test_serialize_symbolic_argument_multiple_times(self, queue, observable_queue, expected_string):
"""Tests that the same hash is created for two circuitgraphs that have the same symbolic argument
used multiple times."""
circuit_graph_1 = CircuitGraph(queue + observable_queue, {}, Wires([0, 1]))
circuit_graph_2 = CircuitGraph(queue + observable_queue, {}, Wires([0, 1]))
assert circuit_graph_1.serialize() == circuit_graph_2.serialize()
assert expected_string == circuit_graph_1.serialize()
variable1 = Variable(1)
variable2 = Variable(2)
multiple_symbolic_queue = [
([
qml.RX(variable1, wires=[0]),
qml.RX(variable2, wires=[1])
],
[],
'RX!V1![0]' +
'RX!V2![1]' +
'|||'
),
]
@pytest.mark.parametrize("queue, observable_queue, expected_string", multiple_symbolic_queue)
def test_serialize_multiple_symbolic_arguments(self, queue, observable_queue, expected_string):
"""Tests that the same hash is created for two circuitgraphs that have multiple symbolic arguments."""
circuit_graph_1 = CircuitGraph(queue + observable_queue, {}, Wires([0, 1]))
circuit_graph_2 = CircuitGraph(queue + observable_queue, {}, Wires([0, 1]))
assert circuit_graph_1.serialize() == circuit_graph_2.serialize()
assert expected_string == circuit_graph_1.serialize()
observable1 = qml.PauliZ(0)
observable1.return_type = not None
observable2 = qml.Hermitian(np.array([[1, 0],[0, -1]]), wires=[0])
observable2.return_type = not None
observable3 = Tensor(qml.PauliZ(0) @ qml.PauliZ(1))
observable3.return_type = not None
numeric_observable_queue = [
([],
[observable1],
'|||PauliZ[0]'
),
(
[],
[observable2],
'|||Hermitian![[ 1 0]\n [ 0 -1]]![0]'
),
(
[],
[observable3],
'|||[\'PauliZ\', \'PauliZ\'][0, 1]'
)
]
@pytest.mark.parametrize("queue, observable_queue, expected_string", numeric_observable_queue)
def test_serialize_numeric_arguments_observables(self, queue, observable_queue, expected_string):
"""Tests that the same hash is created for two circuitgraphs that have identical queues and empty variable_deps."""
circuit_graph_1 = CircuitGraph(queue + observable_queue, {}, Wires([0, 1]))
circuit_graph_2 = CircuitGraph(queue + observable_queue, {}, Wires([0, 1]))
assert circuit_graph_1.serialize() == circuit_graph_2.serialize()
assert expected_string == circuit_graph_1.serialize()
class TestQNodeCircuitHashIntegration:
"""Test for the circuit hash that is being created for a QNode during evaluation (inside of _construct)"""
def test_evaluate_circuit_hash_numeric(self):
"""Tests that the circuit hash of identical circuits containing only numeric parameters are equal"""
dev = qml.device("default.qubit", wires=2)
a = 0.3
b = 0.2
def circuit1():
qml.RX(a, wires=[0])
qml.RY(b, wires=[1])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([], {})
circuit_hash_1 = node1.circuit.hash
def circuit2():
qml.RX(a, wires=[0])
qml.RY(b, wires=[1])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 == circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_symbolic(self, x, y):
"""Tests that the circuit hash of identical circuits containing only symbolic parameters are equal"""
dev = qml.device("default.qubit", wires=2)
def circuit1(x, y):
qml.RX(x, wires=[0])
qml.RY(y, wires=[1])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.RX(x, wires=[0])
qml.RY(y, wires=[1])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 == circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_numeric_and_symbolic(self, x, y):
"""Tests that the circuit hash of identical circuits containing numeric and symbolic parameters are equal"""
dev = qml.device("default.qubit", wires=3)
def circuit1(x, y):
qml.RX(x, wires=[0])
qml.RY(y, wires=[1])
qml.RZ(0.3, wires=[2])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.RX(x, wires=[0])
qml.RY(y, wires=[1])
qml.RZ(0.3, wires=[2])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 == circuit_hash_2
@pytest.mark.parametrize(
"a,b",
zip(np.linspace(0.1, 2 * np.pi, 3), np.linspace(0, 2 * np.pi, 3)),
)
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 0, 3), np.linspace(-2 * np.pi, 0, 3)),
)
def test_evaluate_circuit_hash_symbolic_assigned_arguments_do_not_matter(self, a, b, x, y):
"""Tests that the circuit hashes of identical circuits where different values are assigned to symbolic parameters are equal"""
dev = qml.device("default.qubit", wires=2)
def circuit1(a, b):
qml.RX(a, wires=[0])
qml.RY(b, wires=[1])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([a, b], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.RX(x, wires=[0])
qml.RY(y, wires=[1])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 == circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_numeric_and_symbolic_tensor_return(self, x, y):
"""Tests that the circuit hashes of identical circuits having a tensor product in the return
statement are equal"""
dev = qml.device("default.qubit", wires=3)
        def circuit1(x, y):
            qml.RX(x, wires=[0])
            qml.RY(y, wires=[1])
            qml.RZ(0.3, wires=[2])
            qml.CNOT(wires=[0, 1])
            return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
        node1 = BaseQNode(circuit1, dev)
        node1.evaluate([x, y], {})
        circuit_hash_1 = node1.circuit.hash
        def circuit2(x, y):
            qml.RX(x, wires=[0])
            qml.RY(y, wires=[1])
            qml.RZ(0.3, wires=[2])
            qml.CNOT(wires=[0, 1])
            return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 == circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_same_operation_has_numeric_and_symbolic(self, x, y):
"""Tests that the circuit hashes of identical circuits where one operation has both numeric
and symbolic arguments are equal"""
dev = qml.device("default.qubit", wires=3)
def circuit1(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
        circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 == circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_numeric_and_symbolic_return_type_does_not_matter(self, x, y):
"""Tests that the circuit hashes of identical circuits only differing on their return types are equal"""
dev = qml.device("default.qubit", wires=3)
def circuit1(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.var(qml.PauliZ(0) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
def circuit3(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.sample(qml.PauliZ(0) @ qml.PauliX(1))
        node3 = BaseQNode(circuit3, dev)
node3.evaluate([x, y], {})
circuit_hash_3 = node3.circuit.hash
assert circuit_hash_1 == circuit_hash_2 == circuit_hash_3
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_hermitian(self, x, y):
"""Tests that the circuit hashes of identical circuits containing a Hermitian observable are equal"""
dev = qml.device("default.qubit", wires=3)
matrix = np.array([[1, 0], [0, 1]])
def circuit1(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.Hermitian(matrix, wires=[0]) @ qml.PauliX(1))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.Hermitian(matrix, wires=[0]) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 == circuit_hash_2
class TestQNodeCircuitHashDifferentHashIntegration:
"""Tests for checking that different circuit graph hashes are being created for different circuits in a QNode during evaluation (inside of _construct)"""
def test_evaluate_circuit_hash_numeric_different(self):
"""Tests that the circuit hashes of identical circuits except for one numeric value are different"""
dev = qml.device("default.qubit", wires=2)
a = 0.3
b = 0.2
def circuit1():
qml.RX(a, wires=[0])
qml.RY(b, wires=[1])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([], {})
circuit_hash_1 = node1.circuit.hash
c = 0.6
def circuit2():
qml.RX(c, wires=[0])
qml.RY(b, wires=[1])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 != circuit_hash_2
def test_evaluate_circuit_hash_numeric_different_operation(self):
"""Tests that the circuit hashes of identical circuits except for one of the operations are different"""
dev = qml.device("default.qubit", wires=2)
a = 0.3
def circuit1():
qml.RX(a, wires=[0])
return qml.expval(qml.PauliZ(0))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([], {})
circuit_hash_1 = node1.circuit.hash
def circuit2():
qml.RY(a, wires=[0])
return qml.expval(qml.PauliZ(0))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 != circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_numeric_and_symbolic_operation_differs(self, x, y):
"""Tests that the circuit hashes of identical circuits that have numeric and symbolic arguments
except for one of the operations are different"""
dev = qml.device("default.qubit", wires=3)
def circuit1(x, y):
qml.RX(x, wires=[0])
qml.RZ(y, wires=[1]) # <-------------------------------------- RZ
qml.RZ(0.3, wires=[2])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.RX(x, wires=[0])
qml.RY(y, wires=[1]) # <-------------------------------------- RY
qml.RZ(0.3, wires=[2])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 != circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_different_return_observable_vs_tensor(self, x, y):
"""Tests that the circuit hashes of identical circuits except for the return statement are different"""
dev = qml.device("default.qubit", wires=3)
def circuit1(x, y):
qml.RX(x, wires=[0])
qml.RY(y, wires=[1])
qml.RZ(0.3, wires=[2])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0)) # <------------- qml.PauliZ(0)
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.RX(x, wires=[0])
qml.RY(y, wires=[1])
qml.RZ(0.3, wires=[2])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1)) # <------------- qml.PauliZ(0) @ qml.PauliX(1)
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 != circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_same_operation_has_numeric_and_symbolic_different_order(self, x, y):
"""Tests that the circuit hashes of identical circuits except for the order of numeric and symbolic arguments
in one of the operations are different."""
dev = qml.device("default.qubit", wires=3)
def circuit1(x, y):
qml.Rot(x, 0.3, y, wires=[0]) # <------------- x, 0.3, y
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.Rot(x, y, 0.3, wires=[0]) # <------------- x, y, 0.3
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 != circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_same_operation_has_numeric_and_symbolic_different_argument(self, x, y):
"""Tests that the circuit hashes of identical circuits except for the numeric value
in one of the operations are different."""
dev = qml.device("default.qubit", wires=3)
def circuit1(x, y):
qml.Rot(x, y, 0.3, wires=[0]) # <------------- 0.3
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.Rot(x, y, 0.5, wires=[0]) # <------------- 0.5
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 != circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 2), np.linspace(-2 * np.pi, 2 * np.pi, 2) ** 2 / 11),
)
def test_evaluate_circuit_hash_same_operation_has_numeric_and_symbolic_different_wires(self, x, y):
"""Tests that the circuit hashes of identical circuits except for the wires
in one of the operations are different."""
dev = qml.device("default.qubit", wires=3)
def circuit1(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1]) #<------ wires = [0, 1]
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[1, 0]) #<------ wires = [1, 0]
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 != circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 2), np.linspace(-2 * np.pi, 2 * np.pi, 2) ** 2 / 11),
)
def test_evaluate_circuit_hash_same_operation_has_numeric_and_symbolic_different_wires_in_return(self, x, y):
"""Tests that the circuit hashes of identical circuits except for the wires
in the return statement are different."""
dev = qml.device("default.qubit", wires=3)
def circuit1(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1)) # <----- (0) @ (1)
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(2)) # <----- (0) @ (2)
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 != circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 7), np.linspace(-2 * np.pi, 2 * np.pi, 7) ** 2 / 11),
)
def test_evaluate_circuit_hash_numeric_and_symbolic_different_parameter(self, x, y):
"""Tests that the circuit hashes of identical circuits except for the numeric argument of a signle operation
in the circuits are different"""
dev = qml.device("default.qubit", wires=3)
def circuit1(x, y):
qml.RX(x, wires=[0])
qml.RY(y, wires=[1])
qml.RZ(0.3, wires=[2]) # <------------- 0.3
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.RX(x, wires=[0])
qml.RY(y, wires=[1])
qml.RZ(0.5, wires=[2]) # <------------- 0.5
qml.CNOT(wires=[0, 1])
return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 != circuit_hash_2
@pytest.mark.parametrize(
"x,y",
zip(np.linspace(-2 * np.pi, 2 * np.pi, 2), np.linspace(-2 * np.pi, 2 * np.pi, 2) ** 2 / 11),
)
def test_evaluate_circuit_hash_hermitian_different_matrices(self, x, y):
"""Tests that the circuit hashes of identical circuits except for the matrix argument of the Hermitian observable
in the return statement are different."""
dev = qml.device("default.qubit", wires=3)
matrix_1 = np.array([[1, 0], [0, 1]])
matrix_2 = np.array([[1, 0], [0, -1]])
def circuit1(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.Hermitian(matrix_1, wires=[0]) @ qml.PauliX(1))
node1 = BaseQNode(circuit1, dev)
node1.evaluate([x, y], {})
circuit_hash_1 = node1.circuit.hash
def circuit2(x, y):
qml.Rot(x, y, 0.3, wires=[0])
qml.CNOT(wires=[0, 1])
return qml.expval(qml.Hermitian(matrix_2, wires=[0]) @ qml.PauliX(1))
node2 = BaseQNode(circuit2, dev)
node2.evaluate([x, y], {})
circuit_hash_2 = node2.circuit.hash
assert circuit_hash_1 != circuit_hash_2
def test_compiled_program_was_stored(self):
"""Test that QVM device stores the compiled program correctly"""
dev = qml.device("default.qubit", wires=3)
def circuit(params, wires):
qml.Hadamard(0)
qml.CNOT(wires=[0, 1])
obs = [qml.PauliZ(0) @ qml.PauliZ(1)]
obs_list = obs * 6
qnodes = qml.map(circuit, obs_list, dev)
qnodes([], parallel=True)
hashes = set()
for qnode in qnodes:
hashes.add(qnode.circuit.hash)
assert len(hashes) == 1
| 36.888175 | 157 | 0.554688 | 3,709 | 28,699 | 4.169857 | 0.06282 | 0.096017 | 0.019397 | 0.029419 | 0.833182 | 0.820768 | 0.808418 | 0.800336 | 0.796263 | 0.768783 | 0 | 0.04398 | 0.305167 | 28,699 | 777 | 158 | 36.93565 | 0.731608 | 0.141085 | 0 | 0.745487 | 0 | 0.001805 | 0.030248 | 0.002623 | 0 | 0 | 0 | 0 | 0.055957 | 1 | 0.113718 | false | 0 | 0.01444 | 0 | 0.225632 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bdc6b6fcace00bf94afc6834e36c227e419f787a | 141 | py | Python | {{cookiecutter.repo_name}}/{{cookiecutter.app_name}}/apps/core/__init__.py | pipermerriam/cookiecutter-django | 7197b3903c6c1bb334ed3a73d52ee1073f0bf3bf | [
"MIT"
] | null | null | null | {{cookiecutter.repo_name}}/{{cookiecutter.app_name}}/apps/core/__init__.py | pipermerriam/cookiecutter-django | 7197b3903c6c1bb334ed3a73d52ee1073f0bf3bf | [
"MIT"
] | null | null | null | {{cookiecutter.repo_name}}/{{cookiecutter.app_name}}/apps/core/__init__.py | pipermerriam/cookiecutter-django | 7197b3903c6c1bb334ed3a73d52ee1073f0bf3bf | [
"MIT"
] | null | null | null | default_app_config = '{{cookiecutter.app_name}}.apps.core.config.{{ cookiecutter.app_name|replace("_", " ")|title|replace(" ", "") }}Config'
| 70.5 | 140 | 0.695035 | 16 | 141 | 5.8125 | 0.5625 | 0.387097 | 0.451613 | 0.537634 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.06383 | 141 | 1 | 141 | 141 | 0.704545 | 0 | 0 | 0 | 0 | 1 | 0.829787 | 0.560284 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
bdd3d9953d2f77759c21f6301ba8092d096bf861 | 134 | py | Python | multiseg/__init__.py | mateoneira/MultiplexSegregation | 2e3dd7a928d80ff7777521d63d476bebeb0349c6 | [
"MIT"
] | 1 | 2019-07-19T10:32:45.000Z | 2019-07-19T10:32:45.000Z | multiseg/__init__.py | mateoneira/MultiplexSegregation | 2e3dd7a928d80ff7777521d63d476bebeb0349c6 | [
"MIT"
] | null | null | null | multiseg/__init__.py | mateoneira/MultiplexSegregation | 2e3dd7a928d80ff7777521d63d476bebeb0349c6 | [
"MIT"
] | null | null | null | from multiseg.multiplexSeg import *
from multiseg.processGeom import *
from multiseg.classes import *
from multiseg.generator import * | 33.5 | 35 | 0.828358 | 16 | 134 | 6.9375 | 0.4375 | 0.432432 | 0.486486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11194 | 134 | 4 | 36 | 33.5 | 0.932773 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
bdee4fcb3b4db91ca427b55bd6a27b7517292220 | 1,050 | py | Python | collectors/utils/constants/__init__.py | alvesmatheus/fala-camarada | 47015fe95422d5f71c279e47edacdd31ea3f71b8 | [
"MIT"
] | 7 | 2021-02-11T20:36:16.000Z | 2021-02-12T17:22:05.000Z | collectors/utils/constants/__init__.py | alvesmatheus/fala-camarada | 47015fe95422d5f71c279e47edacdd31ea3f71b8 | [
"MIT"
] | null | null | null | collectors/utils/constants/__init__.py | alvesmatheus/fala-camarada | 47015fe95422d5f71c279e47edacdd31ea3f71b8 | [
"MIT"
] | null | null | null | from collectors.utils.constants.committees import COMMITTEE_CATEGORY_PATTERNS
from collectors.utils.constants.committees import PERMANENT_COMMITTEE_NAMES
from collectors.utils.constants.committees import WRONG_COMMITTEE_NAMES
from collectors.utils.constants.paths import COMMITTEES_SCHEDULE_PATH
from collectors.utils.constants.paths import RAW_DATA_DIR_PATH
from collectors.utils.constants.patterns import DATE_PATTERN
from collectors.utils.constants.patterns import DOUBT_NOTATION_PATTERN
from collectors.utils.constants.patterns import SPEECH_SPEAKER_PATTERN
from collectors.utils.constants.patterns import TRANSCRIPTION_NOTATION_PATTERN
from collectors.utils.constants.selectors import COMMITTEE_EVENT_SELECTORS
from collectors.utils.constants.urls import CHAMBER_OF_DEPUTIES_URL
from collectors.utils.constants.urls import COMMITTEES_SCHEDULE_URL
from collectors.utils.constants.urls import COMMITTEE_SPEECH_URL
from collectors.utils.constants.years import DEFAULT_FINAL_YEAR
from collectors.utils.constants.years import DEFAULT_START_YEAR
| 50 | 78 | 0.895238 | 136 | 1,050 | 6.683824 | 0.272059 | 0.231023 | 0.313531 | 0.462046 | 0.771177 | 0.759076 | 0.353135 | 0 | 0 | 0 | 0 | 0 | 0.061905 | 1,050 | 20 | 79 | 52.5 | 0.922843 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
bdf4440f74e19be9f4bd73f07c11cbf9ec3fa93d | 3,511 | py | Python | test/test_signals.py | jasonpjacobs/systemrdl-compiler | e3fdaf53b6c605a24d6e1149817f3636a85aed09 | [
"MIT",
"BSD-3-Clause"
] | null | null | null | test/test_signals.py | jasonpjacobs/systemrdl-compiler | e3fdaf53b6c605a24d6e1149817f3636a85aed09 | [
"MIT",
"BSD-3-Clause"
] | null | null | null | test/test_signals.py | jasonpjacobs/systemrdl-compiler | e3fdaf53b6c605a24d6e1149817f3636a85aed09 | [
"MIT",
"BSD-3-Clause"
] | null | null | null | from unittest_utils import RDLSourceTestCase
class TestSignals(RDLSourceTestCase):
def test_signalwidth(self):
root = self.compile(
["rdl_src/signals.rdl"],
"top"
)
self.assertEqual(root.find_by_path("top.s1").width, 8)
def test_field_resets(self):
root = self.compile(
["rdl_src/reset_signals.rdl"],
"field_resets"
)
self.assertEqual(
root.find_by_path("field_resets.rf.x.A").get_property('resetsignal'),
root.find_by_path("field_resets.rf.x.reset_z"),
)
self.assertEqual(
root.find_by_path("field_resets.rf.x.B").get_property('resetsignal'),
root.find_by_path("field_resets.reset_x"),
)
self.assertEqual(
root.find_by_path("field_resets.rf.x.C").get_property('resetsignal'),
root.find_by_path("field_resets.rf.reset_y"),
)
self.assertEqual(
root.find_by_path("field_resets.rf.y.A").get_property('resetsignal'),
root.find_by_path("field_resets.rf.reset_y"),
)
self.assertEqual(
root.find_by_path("field_resets.rf.y.B").get_property('resetsignal'),
root.find_by_path("field_resets.reset_x"),
)
self.assertIsNone(
root.find_by_path("field_resets.z.A").get_property('resetsignal')
)
self.assertEqual(
root.find_by_path("field_resets.z.B").get_property('resetsignal'),
root.find_by_path("field_resets.reset_x"),
)
def test_cpuif_resets(self):
root = self.compile(
["rdl_src/reset_signals.rdl"],
"cpuif_resets"
)
reset_x = root.find_by_path("cpuif_resets.rf.reset_x")
reset_y = root.find_by_path("cpuif_resets.rf.x.reset_y")
self.assertIsNone(root.find_by_path("cpuif_resets").cpuif_reset)
self.assertEqual(root.find_by_path("cpuif_resets.rf").cpuif_reset, reset_x)
self.assertEqual(root.find_by_path("cpuif_resets.rf.x").cpuif_reset, reset_y)
self.assertEqual(root.find_by_path("cpuif_resets.rf.x.A").cpuif_reset, reset_y)
self.assertEqual(root.find_by_path("cpuif_resets.rf.y").cpuif_reset, reset_x)
self.assertEqual(root.find_by_path("cpuif_resets.rf.y.A").cpuif_reset, reset_x)
self.assertIsNone(root.find_by_path("cpuif_resets.z").cpuif_reset)
self.assertIsNone(root.find_by_path("cpuif_resets.z.A").cpuif_reset)
def test_field_reset_err(self):
self.assertRDLCompileError(
["rdl_err_src/err_reset_signals.rdl"],
"field_resets",
r"Only one 'field_reset' signal is allowed per hierarchy. Signal 'freset_root2' is redundant."
)
self.assertRDLCompileError(
["rdl_err_src/err_reset_signals.rdl"],
"field_resets",
r"Only one 'field_reset' signal is allowed per hierarchy. Signal 'reset_b' is redundant."
)
def test_cpuif_reset_err(self):
self.assertRDLCompileError(
["rdl_err_src/err_reset_signals.rdl"],
"cpuif_resets",
r"Only one 'cpuif_reset' signal is allowed per hierarchy. Signal 'creset_root2' is redundant."
)
self.assertRDLCompileError(
["rdl_err_src/err_reset_signals.rdl"],
"cpuif_resets",
r"Only one 'cpuif_reset' signal is allowed per hierarchy. Signal 'reset_y' is redundant."
)
| 37.351064 | 106 | 0.633153 | 451 | 3,511 | 4.600887 | 0.117517 | 0.09253 | 0.115663 | 0.161928 | 0.860723 | 0.845301 | 0.819277 | 0.8 | 0.726747 | 0.666024 | 0 | 0.001512 | 0.246653 | 3,511 | 93 | 107 | 37.752688 | 0.782987 | 0 | 0 | 0.363636 | 0 | 0 | 0.326972 | 0.085731 | 0 | 0 | 0 | 0 | 0.25974 | 1 | 0.064935 | false | 0 | 0.012987 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
da0185b8c307e28e552b2483623cadbc34312822 | 159 | py | Python | src/clearskies/secrets/__init__.py | cmancone/clearskies | aaa33fef6d03205faf26f123183a46adc1dbef9c | [
"MIT"
] | 4 | 2021-04-23T18:13:06.000Z | 2022-03-26T01:51:01.000Z | src/clearskies/secrets/__init__.py | cmancone/clearskies | aaa33fef6d03205faf26f123183a46adc1dbef9c | [
"MIT"
] | null | null | null | src/clearskies/secrets/__init__.py | cmancone/clearskies | aaa33fef6d03205faf26f123183a46adc1dbef9c | [
"MIT"
] | null | null | null | from .akeyless import AKeyless
from ..binding_config import BindingConfig
def akeyless(*args, **kwargs):
return BindingConfig(AKeyless, *args, **kwargs)
| 22.714286 | 51 | 0.761006 | 18 | 159 | 6.666667 | 0.555556 | 0.2 | 0.3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132075 | 159 | 6 | 52 | 26.5 | 0.869565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
da426621e87fe7b7e6a85ca1bc134d17584ac06c | 48 | py | Python | bugtests/test154p/testing.py | doom38/jython_v2.2.1 | 0803a0c953c294e6d14f9fc7d08edf6a3e630a15 | [
"CNRI-Jython"
] | null | null | null | bugtests/test154p/testing.py | doom38/jython_v2.2.1 | 0803a0c953c294e6d14f9fc7d08edf6a3e630a15 | [
"CNRI-Jython"
] | null | null | null | bugtests/test154p/testing.py | doom38/jython_v2.2.1 | 0803a0c953c294e6d14f9fc7d08edf6a3e630a15 | [
"CNRI-Jython"
] | null | null | null |
def testing():
return "spam"
#print "test" | 9.6 | 17 | 0.604167 | 6 | 48 | 4.833333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.229167 | 48 | 5 | 18 | 9.6 | 0.783784 | 0.25 | 0 | 0 | 0 | 0 | 0.114286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
da53ad91ec1f1716ce4f3e5241d55fa16b8b8621 | 40 | py | Python | examples/phobos/tests/test_std_meta.py | kinke/autowrap | 2f042df3f292aa39b1da0b9607fbe3424f56ff4a | [
"BSD-3-Clause"
] | 47 | 2019-07-16T10:38:07.000Z | 2022-03-30T16:34:24.000Z | examples/phobos/tests/test_std_meta.py | kinke/autowrap | 2f042df3f292aa39b1da0b9607fbe3424f56ff4a | [
"BSD-3-Clause"
] | 199 | 2019-06-17T23:24:40.000Z | 2021-06-16T16:41:36.000Z | examples/phobos/tests/test_std_meta.py | kinke/autowrap | 2f042df3f292aa39b1da0b9607fbe3424f56ff4a | [
"BSD-3-Clause"
] | 7 | 2019-09-13T18:03:49.000Z | 2022-01-17T03:53:00.000Z | def test_import():
import std_meta
| 10 | 19 | 0.7 | 6 | 40 | 4.333333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225 | 40 | 3 | 20 | 13.333333 | 0.83871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 1 | 0 | 1.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
da8e0e684d12d543c4d5645c07cd6c313b525ab9 | 15,695 | py | Python | kubekeep/backup.py | rudradevpal/kubekeep | 78774d9e9690a031300e42b04a5d5c27aaca22f5 | [
"MIT"
] | null | null | null | kubekeep/backup.py | rudradevpal/kubekeep | 78774d9e9690a031300e42b04a5d5c27aaca22f5 | [
"MIT"
] | null | null | null | kubekeep/backup.py | rudradevpal/kubekeep | 78774d9e9690a031300e42b04a5d5c27aaca22f5 | [
"MIT"
] | null | null | null | import requests
import logging
import urllib3
import json
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
logger = logging.getLogger(__name__)
log_formatter = logging.Formatter('%(asctime)-15s [%(levelname)s] %(message)s')
stream_handler = logging.StreamHandler()
stream_handler.setFormatter(log_formatter)
logger.addHandler(stream_handler)
# file_handler = logging.FileHandler("/var/log/backup/" + str(backupDict["script_Name"]) + ".log", mode='w')
# file_handler.setFormatter(log_formatter)
# file_handler.setLevel(logging.WARNING)
# logger.addHandler(file_handler)
logger.setLevel(logging.INFO)
logging.getLogger("requests").setLevel(logging.ERROR)
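# Illustrative sketch (assumption: this helper and its name are not part of the
# original module): every namespaced GET endpoint requested by the Backup class
# below follows the same URL shape, made explicit here.
def _namespaced_url(base_url, api_prefix, namespace, resource):
    """Build a namespaced Kubernetes API URL, e.g.
    _namespaced_url('https://k8s:6443', '/api/v1', 'demo', 'secrets')
    -> 'https://k8s:6443/api/v1/namespaces/demo/secrets'
    """
    return str(base_url) + str(api_prefix) + '/namespaces/' + str(namespace) + '/' + str(resource)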
class Backup:
    """Fetches Kubernetes resource definitions over the cluster's REST API so they can be stored as backups."""
    def __init__(self, URL, token):
        # API server base URL and a service-account bearer token used for all requests
        self.__kube_url = str(URL)
        self.__kube_token = str(token)
def get_all_namespaces(self):
logger.info("Fetching all Namespaces")
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/api/v1/namespaces', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
namespaces = list()
for i in response.json()["items"]:
if i["metadata"]["name"] not in ('default', 'kube-public', 'kube-system'):
namespaces.append(i["metadata"]["name"])
return {"result": namespaces, "code": res_code}
else:
logger.error("Unable to fetch Kubernetes Namespaces - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
        except Exception as e:
logger.error("Unable to fetch Kubernetes Namespaces - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_all_storageclasses(self):
logger.info("Fetching all StorageClasses")
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/apis/storage.k8s.io/v1/storageclasses', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch Kubernetes StorageClasses - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
        except Exception as e:
logger.error("Unable to fetch Kubernetes StorageClasses - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_all_clusterroles(self):
logger.info("Fetching all ClusterRoles")
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/apis/rbac.authorization.k8s.io/v1/clusterroles', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch Kubernetes ClusterRoles - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
        except Exception as e:
logger.error("Unable to fetch Kubernetes ClusterRoles - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_all_clusterrolebindings(self):
logger.info("Fetching all ClusterRoleBindings")
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/apis/rbac.authorization.k8s.io/v1/clusterrolebindings', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch Kubernetes ClusterRoleBindings - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
        except Exception as e:
logger.error("Unable to fetch Kubernetes ClusterRoleBindings - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_deployments(self, namespace):
logger.info("Fetching Deployments for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/apis/apps/v1/namespaces/' + str(namespace) + '/deployments', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch Deployments for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
        except Exception as e:
logger.error("Unable to fetch Deployments for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_configmaps(self, namespace):
logger.info("Fetching ConfigMaps for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/api/v1/namespaces/' + str(namespace) + '/configmaps', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch ConfigMaps for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
        except Exception as e:
logger.error("Unable to fetch ConfigMaps for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_secrets(self, namespace):
logger.info("Fetching Secrets for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/api/v1/namespaces/' + str(namespace) + '/secrets', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch Secrets for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
except Exception, e:
logger.error("Unable to fetch Secrets for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_pvc(self, namespace):
logger.info("Fetching PVC for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/api/v1/namespaces/' + str(namespace) + '/persistentvolumeclaims', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch PVC for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
except Exception, e:
logger.error("Unable to fetch PVC for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_ingresses(self, namespace):
logger.info("Fetching Ingresses for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/apis/extensions/v1beta1/namespaces/' + str(namespace) + '/ingresses', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch Ingresses for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
except Exception, e:
logger.error("Unable to fetch Ingresses for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_services(self, namespace):
logger.info("Fetching Services for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/api/v1/namespaces/' + str(namespace) + '/services', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch Services for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
except Exception, e:
logger.error("Unable to fetch Services for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_replicationcontrollers(self, namespace):
logger.info("Fetching ReplicationControllers for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/api/v1/namespaces/' + str(namespace) + '/replicationcontrollers', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch ReplicationControllers for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
except Exception, e:
logger.error("Unable to fetch ReplicationControllers for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_daemonsets(self, namespace):
logger.info("Fetching DaemonSets for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/apis/apps/v1/namespaces/' + str(namespace) + '/daemonsets', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch DaemonSets for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
except Exception, e:
logger.error("Unable to fetch DaemonSets for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_networkpolicies(self, namespace):
logger.info("Fetching NetworkPolicies for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/apis/networking.k8s.io/v1/namespaces/' + str(namespace) + '/networkpolicies', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch NetworkPolicies for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
except Exception, e:
logger.error("Unable to fetch NetworkPolicies for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_statefulsets(self, namespace):
logger.info("Fetching StatefulSets for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/apis/apps/v1/namespaces/' + str(namespace) + '/statefulsets', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch StatefulSets for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
except Exception, e:
logger.error("Unable to fetch StatefulSets for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
def get_cronjobs(self, namespace):
logger.info("Fetching CronJobs for Namespace " + str(namespace))
try:
headers = {'Authorization': 'Bearer ' + str(self.__kube_token)}
response = requests.get(str(self.__kube_url) + '/apis/batch/v1beta1/namespaces/' + str(namespace) + '/cronjobs', headers=headers, verify=False)
res_code = response.status_code
if res_code == 200:
json_data = response.json()
return {"result": json.dumps(json_data, indent=4), "code": res_code}
else:
logger.error("Unable to fetch CronJobs for namespace - " + str(namespace) + " - Error Code " + str(res_code))
return {"result": str(response.json()["message"]), "code": res_code}
except Exception, e:
logger.error("Unable to fetch CronJobs for namespace - " + str(namespace) + " - " + str(e))
return {"result": "Internal Error - " + str(e), "code": 500}
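The eleven `get_*` methods above repeat one request/parse/log pattern and differ only in the API group prefix and the resource name. As a sketch (the helper name `resource_url` and the `RESOURCE_API_PREFIXES` table are illustrative, not part of the original class), the varying part can be captured in a dispatch table:

```python
# Illustrative sketch only: RESOURCE_API_PREFIXES and resource_url are
# hypothetical names, not part of the original class.
RESOURCE_API_PREFIXES = {
    "deployments": "/apis/apps/v1",
    "configmaps": "/api/v1",
    "secrets": "/api/v1",
    "persistentvolumeclaims": "/api/v1",
    "ingresses": "/apis/extensions/v1beta1",
    "services": "/api/v1",
    "replicationcontrollers": "/api/v1",
    "daemonsets": "/apis/apps/v1",
    "networkpolicies": "/apis/networking.k8s.io/v1",
    "statefulsets": "/apis/batch/v1" if False else "/apis/apps/v1",
    "cronjobs": "/apis/batch/v1beta1",
}


def resource_url(kube_url, namespace, resource):
    """Build the GET URL for a namespaced Kubernetes resource."""
    prefix = RESOURCE_API_PREFIXES[resource]
    return "%s%s/namespaces/%s/%s" % (kube_url, prefix, namespace, resource)
```

Each method could then delegate to one shared implementation that takes the resource name, keeping a single copy of the request and error handling.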
# ============================================================
# File: nicos/devices/vendor/caress/absdev_idl.py
# Repo: jkrueger1/nicos (licenses: CC-BY-3.0, Apache-2.0, CC-BY-4.0)
# ============================================================
#pylint: skip-file
# Python stubs generated by omniidl from absdev.idl
# DO NOT EDIT THIS FILE!
import _omnipy
import omniORB
from omniORB import CORBA, PortableServer
_0_CORBA = CORBA
_omnipy.checkVersion(4,2, __file__, 1)
try:
    property
except NameError:
    def property(*args):
        return None
#
# Start of module "_GlobalIDL"
#
__name__ = "_GlobalIDL"
_0__GlobalIDL = omniORB.openModule("_GlobalIDL", r"absdev.idl")
_0__GlobalIDL__POA = omniORB.openModule("_GlobalIDL__POA", r"absdev.idl")
_0__GlobalIDL.MAX_ITEMS = 4096
# typedef ... module_info_seq_t
class module_info_seq_t:
    _NP_RepositoryId = "IDL:module_info_seq_t:1.0"
    def __init__(self, *args, **kw):
        raise RuntimeError("Cannot construct objects of this type.")
_0__GlobalIDL.module_info_seq_t = module_info_seq_t
_0__GlobalIDL._d_module_info_seq_t = (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_any, 0)
_0__GlobalIDL._ad_module_info_seq_t = (omniORB.tcInternal.tv_alias, module_info_seq_t._NP_RepositoryId, "module_info_seq_t", (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_any, 0))
_0__GlobalIDL._tc_module_info_seq_t = omniORB.tcInternal.createTypeCode(_0__GlobalIDL._ad_module_info_seq_t)
omniORB.registerType(module_info_seq_t._NP_RepositoryId, _0__GlobalIDL._ad_module_info_seq_t, _0__GlobalIDL._tc_module_info_seq_t)
del module_info_seq_t
# typedef ... char_data_seq_t
class char_data_seq_t:
    _NP_RepositoryId = "IDL:char_data_seq_t:1.0"
    def __init__(self, *args, **kw):
        raise RuntimeError("Cannot construct objects of this type.")
_0__GlobalIDL.char_data_seq_t = char_data_seq_t
_0__GlobalIDL._d_char_data_seq_t = (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_char, 0)
_0__GlobalIDL._ad_char_data_seq_t = (omniORB.tcInternal.tv_alias, char_data_seq_t._NP_RepositoryId, "char_data_seq_t", (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_char, 0))
_0__GlobalIDL._tc_char_data_seq_t = omniORB.tcInternal.createTypeCode(_0__GlobalIDL._ad_char_data_seq_t)
omniORB.registerType(char_data_seq_t._NP_RepositoryId, _0__GlobalIDL._ad_char_data_seq_t, _0__GlobalIDL._tc_char_data_seq_t)
del char_data_seq_t
# typedef ... short_data_seq_t
class short_data_seq_t:
    _NP_RepositoryId = "IDL:short_data_seq_t:1.0"
    def __init__(self, *args, **kw):
        raise RuntimeError("Cannot construct objects of this type.")
_0__GlobalIDL.short_data_seq_t = short_data_seq_t
_0__GlobalIDL._d_short_data_seq_t = (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_short, 0)
_0__GlobalIDL._ad_short_data_seq_t = (omniORB.tcInternal.tv_alias, short_data_seq_t._NP_RepositoryId, "short_data_seq_t", (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_short, 0))
_0__GlobalIDL._tc_short_data_seq_t = omniORB.tcInternal.createTypeCode(_0__GlobalIDL._ad_short_data_seq_t)
omniORB.registerType(short_data_seq_t._NP_RepositoryId, _0__GlobalIDL._ad_short_data_seq_t, _0__GlobalIDL._tc_short_data_seq_t)
del short_data_seq_t
# typedef ... int_data_seq_t
class int_data_seq_t:
    _NP_RepositoryId = "IDL:int_data_seq_t:1.0"
    def __init__(self, *args, **kw):
        raise RuntimeError("Cannot construct objects of this type.")
_0__GlobalIDL.int_data_seq_t = int_data_seq_t
_0__GlobalIDL._d_int_data_seq_t = (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_long, 0)
_0__GlobalIDL._ad_int_data_seq_t = (omniORB.tcInternal.tv_alias, int_data_seq_t._NP_RepositoryId, "int_data_seq_t", (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_long, 0))
_0__GlobalIDL._tc_int_data_seq_t = omniORB.tcInternal.createTypeCode(_0__GlobalIDL._ad_int_data_seq_t)
omniORB.registerType(int_data_seq_t._NP_RepositoryId, _0__GlobalIDL._ad_int_data_seq_t, _0__GlobalIDL._tc_int_data_seq_t)
del int_data_seq_t
# typedef ... float_data_seq_t
class float_data_seq_t:
    _NP_RepositoryId = "IDL:float_data_seq_t:1.0"
    def __init__(self, *args, **kw):
        raise RuntimeError("Cannot construct objects of this type.")
_0__GlobalIDL.float_data_seq_t = float_data_seq_t
_0__GlobalIDL._d_float_data_seq_t = (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_float, 0)
_0__GlobalIDL._ad_float_data_seq_t = (omniORB.tcInternal.tv_alias, float_data_seq_t._NP_RepositoryId, "float_data_seq_t", (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_float, 0))
_0__GlobalIDL._tc_float_data_seq_t = omniORB.tcInternal.createTypeCode(_0__GlobalIDL._ad_float_data_seq_t)
omniORB.registerType(float_data_seq_t._NP_RepositoryId, _0__GlobalIDL._ad_float_data_seq_t, _0__GlobalIDL._tc_float_data_seq_t)
del float_data_seq_t
# typedef ... int64_data_seq_t
class int64_data_seq_t:
    _NP_RepositoryId = "IDL:int64_data_seq_t:1.0"
    def __init__(self, *args, **kw):
        raise RuntimeError("Cannot construct objects of this type.")
_0__GlobalIDL.int64_data_seq_t = int64_data_seq_t
_0__GlobalIDL._d_int64_data_seq_t = (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_longlong, 0)
_0__GlobalIDL._ad_int64_data_seq_t = (omniORB.tcInternal.tv_alias, int64_data_seq_t._NP_RepositoryId, "int64_data_seq_t", (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_longlong, 0))
_0__GlobalIDL._tc_int64_data_seq_t = omniORB.tcInternal.createTypeCode(_0__GlobalIDL._ad_int64_data_seq_t)
omniORB.registerType(int64_data_seq_t._NP_RepositoryId, _0__GlobalIDL._ad_int64_data_seq_t, _0__GlobalIDL._tc_int64_data_seq_t)
del int64_data_seq_t
# typedef ... double_data_seq_t
class double_data_seq_t:
    _NP_RepositoryId = "IDL:double_data_seq_t:1.0"
    def __init__(self, *args, **kw):
        raise RuntimeError("Cannot construct objects of this type.")
_0__GlobalIDL.double_data_seq_t = double_data_seq_t
_0__GlobalIDL._d_double_data_seq_t = (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_double, 0)
_0__GlobalIDL._ad_double_data_seq_t = (omniORB.tcInternal.tv_alias, double_data_seq_t._NP_RepositoryId, "double_data_seq_t", (omniORB.tcInternal.tv_sequence, omniORB.tcInternal.tv_double, 0))
_0__GlobalIDL._tc_double_data_seq_t = omniORB.tcInternal.createTypeCode(_0__GlobalIDL._ad_double_data_seq_t)
omniORB.registerType(double_data_seq_t._NP_RepositoryId, _0__GlobalIDL._ad_double_data_seq_t, _0__GlobalIDL._tc_double_data_seq_t)
del double_data_seq_t
# interface absdev
_0__GlobalIDL._d_absdev = (omniORB.tcInternal.tv_objref, "IDL:absdev:1.0", "absdev")
omniORB.typeMapping["IDL:absdev:1.0"] = _0__GlobalIDL._d_absdev
_0__GlobalIDL.absdev = omniORB.newEmptyClass()
class absdev :
    _NP_RepositoryId = _0__GlobalIDL._d_absdev[1]
    def __init__(self, *args, **kw):
        raise RuntimeError("Cannot construct objects of this type.")
    _nil = CORBA.Object._nil
_0__GlobalIDL.absdev = absdev
_0__GlobalIDL._tc_absdev = omniORB.tcInternal.createTypeCode(_0__GlobalIDL._d_absdev)
omniORB.registerType(absdev._NP_RepositoryId, _0__GlobalIDL._d_absdev, _0__GlobalIDL._tc_absdev)
# absdev operations and attributes
absdev._d_init_system_orb = ((omniORB.tcInternal.tv_long, ), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_release_system_orb = ((omniORB.tcInternal.tv_long, ), (omniORB.tcInternal.tv_long, ), None)
absdev._d_init_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, (omniORB.tcInternal.tv_string,0)), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_read_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:module_info_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:module_info_seq_t:1.0"]), None)
absdev._d_drive_module_orb = ((omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:module_info_seq_t:1.0"], omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_load_module_orb = ((omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:module_info_seq_t:1.0"], omniORB.tcInternal.tv_long), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_stop_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_stop_all_orb = ((omniORB.tcInternal.tv_long, ), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_start_acquisition_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_stop_acquisition_orb = ((omniORB.tcInternal.tv_long, ), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_readblock_params_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_char_readblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:char_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:char_data_seq_t:1.0"]), None)
absdev._d_short_readblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:short_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:short_data_seq_t:1.0"]), None)
absdev._d_int_readblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:int_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:int_data_seq_t:1.0"]), None)
absdev._d_int64_readblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:int64_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:int64_data_seq_t:1.0"]), None)
absdev._d_float_readblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:float_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:float_data_seq_t:1.0"]), None)
absdev._d_double_readblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:double_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:double_data_seq_t:1.0"]), None)
absdev._d_char_loadblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:char_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_short_loadblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:short_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_int_loadblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:int_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_int64_loadblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:int64_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_float_loadblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:float_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_double_loadblock_module_orb = ((omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:double_data_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.tcInternal.tv_long), None)
absdev._d_read_allmodules_orb = ((omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:module_info_seq_t:1.0"]), (omniORB.tcInternal.tv_long, omniORB.typeMapping["IDL:module_info_seq_t:1.0"]), None)
# absdev object reference
class _objref_absdev (CORBA.Object):
    _NP_RepositoryId = absdev._NP_RepositoryId

    def __init__(self, obj):
        CORBA.Object.__init__(self, obj)

    def init_system_orb(self, *args):
        return self._obj.invoke("init_system_orb", _0__GlobalIDL.absdev._d_init_system_orb, args)

    def release_system_orb(self, *args):
        return self._obj.invoke("release_system_orb", _0__GlobalIDL.absdev._d_release_system_orb, args)

    def init_module_orb(self, *args):
        return self._obj.invoke("init_module_orb", _0__GlobalIDL.absdev._d_init_module_orb, args)

    def read_module_orb(self, *args):
        return self._obj.invoke("read_module_orb", _0__GlobalIDL.absdev._d_read_module_orb, args)

    def drive_module_orb(self, *args):
        return self._obj.invoke("drive_module_orb", _0__GlobalIDL.absdev._d_drive_module_orb, args)

    def load_module_orb(self, *args):
        return self._obj.invoke("load_module_orb", _0__GlobalIDL.absdev._d_load_module_orb, args)

    def stop_module_orb(self, *args):
        return self._obj.invoke("stop_module_orb", _0__GlobalIDL.absdev._d_stop_module_orb, args)

    def stop_all_orb(self, *args):
        return self._obj.invoke("stop_all_orb", _0__GlobalIDL.absdev._d_stop_all_orb, args)

    def start_acquisition_orb(self, *args):
        return self._obj.invoke("start_acquisition_orb", _0__GlobalIDL.absdev._d_start_acquisition_orb, args)

    def stop_acquisition_orb(self, *args):
        return self._obj.invoke("stop_acquisition_orb", _0__GlobalIDL.absdev._d_stop_acquisition_orb, args)

    def readblock_params_orb(self, *args):
        return self._obj.invoke("readblock_params_orb", _0__GlobalIDL.absdev._d_readblock_params_orb, args)

    def char_readblock_module_orb(self, *args):
        return self._obj.invoke("char_readblock_module_orb", _0__GlobalIDL.absdev._d_char_readblock_module_orb, args)

    def short_readblock_module_orb(self, *args):
        return self._obj.invoke("short_readblock_module_orb", _0__GlobalIDL.absdev._d_short_readblock_module_orb, args)

    def int_readblock_module_orb(self, *args):
        return self._obj.invoke("int_readblock_module_orb", _0__GlobalIDL.absdev._d_int_readblock_module_orb, args)

    def int64_readblock_module_orb(self, *args):
        return self._obj.invoke("int64_readblock_module_orb", _0__GlobalIDL.absdev._d_int64_readblock_module_orb, args)

    def float_readblock_module_orb(self, *args):
        return self._obj.invoke("float_readblock_module_orb", _0__GlobalIDL.absdev._d_float_readblock_module_orb, args)

    def double_readblock_module_orb(self, *args):
        return self._obj.invoke("double_readblock_module_orb", _0__GlobalIDL.absdev._d_double_readblock_module_orb, args)

    def char_loadblock_module_orb(self, *args):
        return self._obj.invoke("char_loadblock_module_orb", _0__GlobalIDL.absdev._d_char_loadblock_module_orb, args)

    def short_loadblock_module_orb(self, *args):
        return self._obj.invoke("short_loadblock_module_orb", _0__GlobalIDL.absdev._d_short_loadblock_module_orb, args)

    def int_loadblock_module_orb(self, *args):
        return self._obj.invoke("int_loadblock_module_orb", _0__GlobalIDL.absdev._d_int_loadblock_module_orb, args)

    def int64_loadblock_module_orb(self, *args):
        return self._obj.invoke("int64_loadblock_module_orb", _0__GlobalIDL.absdev._d_int64_loadblock_module_orb, args)

    def float_loadblock_module_orb(self, *args):
        return self._obj.invoke("float_loadblock_module_orb", _0__GlobalIDL.absdev._d_float_loadblock_module_orb, args)

    def double_loadblock_module_orb(self, *args):
        return self._obj.invoke("double_loadblock_module_orb", _0__GlobalIDL.absdev._d_double_loadblock_module_orb, args)

    def read_allmodules_orb(self, *args):
        return self._obj.invoke("read_allmodules_orb", _0__GlobalIDL.absdev._d_read_allmodules_orb, args)
omniORB.registerObjref(absdev._NP_RepositoryId, _objref_absdev)
_0__GlobalIDL._objref_absdev = _objref_absdev
del absdev, _objref_absdev
# absdev skeleton
__name__ = "_GlobalIDL__POA"
class absdev (PortableServer.Servant):
    _NP_RepositoryId = _0__GlobalIDL.absdev._NP_RepositoryId

    _omni_op_d = {
        "init_system_orb": _0__GlobalIDL.absdev._d_init_system_orb,
        "release_system_orb": _0__GlobalIDL.absdev._d_release_system_orb,
        "init_module_orb": _0__GlobalIDL.absdev._d_init_module_orb,
        "read_module_orb": _0__GlobalIDL.absdev._d_read_module_orb,
        "drive_module_orb": _0__GlobalIDL.absdev._d_drive_module_orb,
        "load_module_orb": _0__GlobalIDL.absdev._d_load_module_orb,
        "stop_module_orb": _0__GlobalIDL.absdev._d_stop_module_orb,
        "stop_all_orb": _0__GlobalIDL.absdev._d_stop_all_orb,
        "start_acquisition_orb": _0__GlobalIDL.absdev._d_start_acquisition_orb,
        "stop_acquisition_orb": _0__GlobalIDL.absdev._d_stop_acquisition_orb,
        "readblock_params_orb": _0__GlobalIDL.absdev._d_readblock_params_orb,
        "char_readblock_module_orb": _0__GlobalIDL.absdev._d_char_readblock_module_orb,
        "short_readblock_module_orb": _0__GlobalIDL.absdev._d_short_readblock_module_orb,
        "int_readblock_module_orb": _0__GlobalIDL.absdev._d_int_readblock_module_orb,
        "int64_readblock_module_orb": _0__GlobalIDL.absdev._d_int64_readblock_module_orb,
        "float_readblock_module_orb": _0__GlobalIDL.absdev._d_float_readblock_module_orb,
        "double_readblock_module_orb": _0__GlobalIDL.absdev._d_double_readblock_module_orb,
        "char_loadblock_module_orb": _0__GlobalIDL.absdev._d_char_loadblock_module_orb,
        "short_loadblock_module_orb": _0__GlobalIDL.absdev._d_short_loadblock_module_orb,
        "int_loadblock_module_orb": _0__GlobalIDL.absdev._d_int_loadblock_module_orb,
        "int64_loadblock_module_orb": _0__GlobalIDL.absdev._d_int64_loadblock_module_orb,
        "float_loadblock_module_orb": _0__GlobalIDL.absdev._d_float_loadblock_module_orb,
        "double_loadblock_module_orb": _0__GlobalIDL.absdev._d_double_loadblock_module_orb,
        "read_allmodules_orb": _0__GlobalIDL.absdev._d_read_allmodules_orb,
    }
absdev._omni_skeleton = absdev
_0__GlobalIDL__POA.absdev = absdev
omniORB.registerSkeleton(absdev._NP_RepositoryId, absdev)
del absdev
__name__ = "_GlobalIDL"
#
# End of module "_GlobalIDL"
#
__name__ = "nicos.devices.vendor.caress.absdev_idl"
_exported_modules = ( "_GlobalIDL", )
# The end.
# ============================================================
# File: ignite/contrib/metrics/__init__.py
# Repo: nzare/ignite (license: BSD-3-Clause)
# ============================================================
import ignite.contrib.metrics.regression
from ignite.contrib.metrics.average_precision import AveragePrecision
from ignite.contrib.metrics.gpu_info import GpuInfo
from ignite.contrib.metrics.precision_recall_curve import PrecisionRecallCurve
from ignite.contrib.metrics.roc_auc import ROC_AUC, RocCurve
# ============================================================
# File: codes/clase_16/reglas.py (Jupyter notebook JSON)
# Repo: mlares/computacion2020 (license: MIT)
# ============================================================
{
"cells": [
{
"cell_type": "code",
"execution_count": 59,
"id": "fatty-container",
"metadata": {},
"outputs": [],
"source": [
"def cuad_pmedio(a, b, f):\n",
" \"\"\"Implementación de la regla del punto medio\n",
" \n",
" Parameters\n",
" ----------\n",
" f: La función a integrar\n",
" a: Límite inferior del intervalo\n",
" b: Límite superior del intervalo\n",
" \n",
" Returns\n",
" -------\n",
" aprox: Aproximación de la integral por la regla del punto medio\n",
" \n",
" Notes\n",
" -----\n",
" Este código es parte del curso \"Computación\", Famaf\n",
" \"\"\"\n",
" if a > b:\n",
" raise ValueError(\"Oops! Debe ser a<b\")\n",
" return None\n",
" try:\n",
" x0 = (a+b)/2\n",
" h = f(x0)\n",
" aprox = h*(b-a)\n",
" except:\n",
" print('Error: no fue posible calcular la función')\n",
" return aprox\n",
"\n",
"def cuad_trapecio(a, b, f):\n",
" \"\"\"Implementación de la regla del trapecio\n",
" \n",
" Parameters\n",
" ----------\n",
" f: La función a integrar\n",
" a: Límite inferior del intervalo\n",
" b: Límite superior del intervalo\n",
" \n",
" Returns\n",
" -------\n",
" aprox: Aproximación de la integral por la regla del trapecio\n",
" \n",
" Notes\n",
" -----\n",
" Este código es parte del curso \"Computación\", Famaf\n",
" \"\"\"\n",
" if a > b:\n",
" raise ValueError(\"Oops! Debe ser a<b\")\n",
" return None\n",
" try:\n",
" h = f(a) + f(b)\n",
" aprox = (b-a)/2*h\n",
" except:\n",
" print('Error: no fue posible calcular la función')\n",
" return aprox\n",
"\n",
"def cuad_simpson(a, b, f):\n",
" \"\"\"Implementación de la regla de Simpson\n",
" \n",
" Parameters\n",
" ----------\n",
" f: La función a integrar\n",
" a: Límite inferior del intervalo\n",
" b: Límite superior del intervalo\n",
" \n",
" Returns\n",
" -------\n",
" aprox: Aproximación de la integral por la regla de Simpson\n",
" \n",
" Notes\n",
" -----\n",
" Este código es parte del curso \"Computación\", Famaf\n",
" \"\"\"\n",
" if a > b:\n",
" raise ValueError(\"Oops! Debe ser a<b\")\n",
" return None\n",
" try:\n",
" x0 = (a+b)/2\n",
" h = f(a) + f(b) + 4*f(x0)\n",
" aprox = (b-a)/6*h\n",
" except:\n",
" print('Error: no fue posible calcular la función')\n",
" return aprox\n",
"\n",
"def f(x):\n",
" return x**2\n",
"\n",
"I = cuad_pmedio(1, 2, f)\n",
"print(f'La regla del punto medio da como resultado: {I}')\n",
"\n",
"I = cuad_trapecio(1, 2, f)\n",
"print(f'La regla del trapecio da como resultado: {I}')\n",
"\n",
"I = cuad_simpson(1, 2, f)\n",
"print(f'La regla de simpson da como resultado: {I}')\n",
"\n",
"\n",
"\n",
"def cuad_pmedio(a, b, y0):\n",
"    \"\"\"Implementation of the midpoint rule\n",
"    \n",
"    Parameters\n",
"    ----------\n",
"    a: Lower limit of the interval\n",
"    b: Upper limit of the interval\n",
"    y0: The value of the function at the midpoint of the interval\n",
"    \n",
"    Returns\n",
"    -------\n",
"    aprox: Approximation of the integral by the midpoint rule\n",
"    \n",
"    Notes\n",
"    -----\n",
"    This code is part of the course \"Computación\", Famaf\n",
"    \"\"\"\n",
"    aprox = (b-a)*y0\n",
"    return aprox\n",
"\n",
"cuad_pmedio(0, 1, 0.5)\n",
"\n",
"def cuad_pmedio(a, b, f=None, y0=None):\n",
"    \"\"\"Implementation of the midpoint rule\n",
"    \n",
"    Parameters\n",
"    ----------\n",
"    a: float\n",
"        Lower limit of the interval\n",
"    b: float\n",
"        Upper limit of the interval\n",
"    f: function (1 parameter)\n",
"        The function to integrate\n",
"    y0: float\n",
"        The value of y at the midpoint.\n",
"    \n",
"    Returns\n",
"    -------\n",
"    aprox: Approximation of the integral by the midpoint rule\n",
"    \n",
"    Notes\n",
"    -----\n",
"    This code is part of the course \"Computación\", Famaf\n",
"    \"\"\"\n",
"    if a > b:\n",
"        raise ValueError(\"Oops! It must hold that a < b\")\n",
"\n",
"    x0 = (a+b)/2\n",
"    if (f is None) and (y0 is not None):\n",
"        aprox = (b-a)*y0\n",
"    elif (f is not None) and (y0 is None):\n",
"        try:\n",
"            h = f(x0)\n",
"        except Exception:\n",
"            print(('Error: the function could not be evaluated.'\n",
"                   ' To pass a data value instead, use y0='))\n",
"            return None\n",
"        aprox = h*(b-a)\n",
"\n",
"    else:\n",
"        raise ValueError(\"You must provide either the function or the data!\")\n",
"    \n",
"    return aprox\n",
"\n",
"cuad_pmedio(0, 1, y0=0.5)\n",
"\n",
"def cuad_trapecio(x0, x1, f=None, y0=None, y1=None):\n",
"    \"\"\"Implementation of the trapezoidal rule\n",
"    \n",
"    Parameters\n",
"    ----------\n",
"    x0: float\n",
"        Lower limit of the interval\n",
"    x1: float\n",
"        Upper limit of the interval\n",
"    f: function (1 parameter)\n",
"        The function to integrate\n",
"    y0: float\n",
"        The value of y at x0.\n",
"    y1: float\n",
"        The value of y at x1.\n",
"    \n",
"    Returns\n",
"    -------\n",
"    aprox: Approximation of the integral by the trapezoidal rule\n",
"    \n",
"    Notes\n",
"    -----\n",
"    This code is part of the course \"Computación\", Famaf\n",
"    Usage: \n",
"        cuad_trapecio(x0, x1, f=f)\n",
"        cuad_trapecio(x0, x1, y0=f(x0), y1=f(x1))\n",
"    \"\"\"\n",
"    if x0 > x1:\n",
"        raise ValueError(\"Oops! It must hold that x0 < x1\")\n",
"\n",
"    if (f is None) and (y0 is not None) and (y1 is not None):\n",
"        aprox = (x1-x0)*(y0+y1)/2\n",
"    elif (f is not None) and (y0 is None):\n",
"        try:\n",
"            y0 = f(x0)\n",
"            y1 = f(x1)\n",
"        except Exception:\n",
"            print(('Error: the function could not be evaluated.'\n",
"                   ' To pass data values instead, use y0= and y1='))\n",
"            return None\n",
"        aprox = (x1-x0)*(y0+y1)/2\n",
"\n",
"    else:\n",
"        raise ValueError(\"You must provide either the function or the data!\")\n",
"    \n",
"    return aprox\n",
"\n",
"cuad_trapecio(0, 1, f)\n",
"\n",
"cuad_trapecio(0, 1, y0=f(0), y1=f(1))\n",
"\n",
"# Decorator that counts how many arguments the caller actually passed\n",
"# and forwards that count to the wrapped function as nargs_in\n",
"def contar_argumentos(func):\n",
"    def inner(*args, **kwargs):\n",
"        nargs_in = len(args) + len(kwargs)\n",
"        return func(*args, **kwargs, nargs_in=nargs_in)\n",
"    return inner\n",
"\n",
"\n",
"\n",
"@contar_argumentos\n",
"def cuad_trapecio(x0, x1, f=None, y0=None, y1=None, nargs_in=None):\n",
"    \"\"\"Implementation of the trapezoidal rule\n",
"    \n",
"    Parameters\n",
"    ----------\n",
"    x0: float\n",
"        Lower limit of the interval\n",
"    x1: float\n",
"        Upper limit of the interval\n",
"    f: function (1 parameter)\n",
"        The function to integrate\n",
"    y0: float\n",
"        The value of y at x0.\n",
"    y1: float\n",
"        The value of y at x1.\n",
"    \n",
"    Returns\n",
"    -------\n",
"    aprox: Approximation of the integral by the trapezoidal rule\n",
"    \n",
"    Notes\n",
"    -----\n",
"    This code is part of the course \"Computación\", Famaf\n",
"    Usage: \n",
"        cuad_trapecio(x0, x1, f)\n",
"        cuad_trapecio(x0, x1, y0, y1)\n",
"    \"\"\"\n",
"    if nargs_in==4:\n",
"        # four arguments: the last two are data values, shift them\n",
"        y1=y0\n",
"        y0=f\n",
"        f = None\n",
"    elif nargs_in==3:\n",
"        if isinstance(f, float):\n",
"            raise ValueError(\"Check the arguments\")\n",
"    else:\n",
"        raise ValueError(\"Check the number of arguments\")\n",
"    \n",
"    if x0 > x1:\n",
"        raise ValueError(\"Oops! It must hold that x0 < x1\")\n",
"\n",
"    if (f is None) and (y0 is not None) and (y1 is not None):\n",
"        aprox = (x1-x0)*(y0+y1)/2\n",
"    elif (f is not None) and (y0 is None):\n",
"        try:\n",
"            y0 = f(x0)\n",
"            y1 = f(x1)\n",
"        except Exception:\n",
"            print(('Error: the function could not be evaluated.'\n",
"                   ' To pass data values instead, use y0= and y1='))\n",
"            return None\n",
"        aprox = (x1-x0)*(y0+y1)/2\n",
"\n",
"    else:\n",
"        raise ValueError(\"You must provide either the function or the data!\")\n",
"    \n",
"    return aprox\n",
"\n",
"cuad_trapecio(0, 1, f)\n",
"\n",
"cuad_trapecio(0, 1, f(0), f(1))\n",
"\n",
"@contar_argumentos\n",
"def cuad_simpson(x0, x2, f=None, y0=None, y1=None, y2=None, nargs_in=None):\n",
"    \"\"\"Implementation of Simpson's rule\n",
"    \n",
"    Parameters\n",
"    ----------\n",
"    x0: float\n",
"        Lower limit of the interval\n",
"    x2: float\n",
"        Upper limit of the interval\n",
"    f: function (1 parameter)\n",
"        The function to integrate\n",
"    y0: float\n",
"        The value of y at x0.\n",
"    y1: float\n",
"        The value of y at the midpoint x1 = (x0+x2)/2.\n",
"    y2: float\n",
"        The value of y at x2.\n",
"    \n",
"    Returns\n",
"    -------\n",
"    aprox: Approximation of the integral by Simpson's rule\n",
"    \n",
"    Notes\n",
"    -----\n",
"    This code is part of the course \"Computación\", Famaf\n",
"    Usage: \n",
"        cuad_simpson(x0, x2, f)\n",
"        cuad_simpson(x0, x2, y0, y1, y2)\n",
"    \"\"\"\n",
"    \n",
"    if nargs_in==5:\n",
"        # five arguments: the last three are data values, shift them\n",
"        y2=y1\n",
"        y1=y0\n",
"        y0=f\n",
"        f = None\n",
"    elif nargs_in==3:\n",
"        if isinstance(f, float):\n",
"            raise ValueError(\"Check the arguments\")\n",
"    else:\n",
"        raise ValueError(\"Check the number of arguments\")\n",
"    \n",
"    if x0 > x2:\n",
"        raise ValueError(\"Oops! It must hold that x0 < x2\")\n",
"    \n",
"    x1 = (x0+x2)/2\n",
"\n",
"    if (f is None) and (y0 is not None) and (y1 is not None):\n",
"        aprox = (x2-x0)/6 * (y0 + 4*y1 + y2)\n",
"    elif (f is not None) and (y0 is None):\n",
"        try:\n",
"            y0 = f(x0)\n",
"            y1 = f(x1)\n",
"            y2 = f(x2)\n",
"        except Exception:\n",
"            print(('Error: the function could not be evaluated.'\n",
"                   ' To pass data values instead, use y0=, y1= and y2='))\n",
"            return None\n",
"        aprox = (x2-x0)/6 * (y0 + 4*y1 + y2)\n",
"\n",
"    else:\n",
"        raise ValueError(\"You must provide either the function or the data!\")\n",
"    \n",
"    return aprox\n",
"\n",
"cuad_simpson(0, 1, f)\n",
"\n",
"cuad_simpson(0, 1, f(0), f(0.5), f(1))\n",
"\n",
"\n",
"\n",
"import numpy as np\n",
"x = np.linspace(0, 10, 11)\n",
"\n",
"x\n",
"\n",
"np.diff(x)\n",
"\n",
"def cuad_simpson_compuesta(x, f=None, y=None):\n",
"    \"\"\"Implementation of the composite Simpson rule\n",
"    \n",
"    Parameters\n",
"    ----------\n",
"    x: list or array\n",
"        List of x values\n",
"    f: function (1 parameter)\n",
"        The function to integrate\n",
"    y: list or array\n",
"        The list of y values\n",
"    \n",
"    Returns\n",
"    -------\n",
"    aprox: Approximation of the integral by the composite Simpson rule\n",
"    \n",
"    Notes\n",
"    -----\n",
"    This code is part of the course \"Computación\", Famaf\n",
"    Usage: \n",
"        cuad_simpson_compuesta(x, y=y)\n",
"        cuad_simpson_compuesta(x, f=f)\n",
"    \"\"\"\n",
"    import numpy as np\n",
"\n",
"    # Sort x in increasing order (and keep y aligned with it)\n",
"    x = np.asarray(x)\n",
"    order = np.argsort(x)\n",
"    x = x[order]\n",
"    if y is not None:\n",
"        y = np.asarray(y)[order]\n",
"\n",
"    # First check whether the partition is uniform\n",
"    H = (max(x) - min(x))/(len(x) - 1)\n",
"    equiesp = np.std(np.diff(x)) < H*1.e-6\n",
"    \n",
"    # Compute the y values (if a function was given)\n",
"    if (y is None) and (f is not None):\n",
"        y = f(x)\n",
"    \n",
"    n = len(x)\n",
"    \n",
"    if equiesp: \n",
"        # composite Simpson weights (assumes an even number of subintervals)\n",
"        impares = y[1:-1:2].sum()\n",
"        pares = y[2:-1:2].sum() \n",
"        H = y[0] + 2*pares + 4*impares + y[-1] \n",
"        H = H / (3*(n - 1))\n",
"        aprox = (x[-1]-x[0])*H\n",
"    else:\n",
"        # Apply simple Simpson on consecutive triples of points\n",
"        # (with an even number of points the last interval is skipped)\n",
"        aprox = 0\n",
"        for i in range(0, len(x)-2, 2):\n",
"            aprox += cuad_simpson(x[i], x[i+2], y[i], y[i+1], y[i+2])\n",
"    \n",
"    return aprox\n",
"\n",
"def f(x):\n",
" return x**2\n",
"\n",
"x = np.linspace(0, 1, 999)\n",
"xr = np.random.uniform(0, 1, 1000)\n",
"y = f(x)\n",
"yr = f(xr)\n",
"\n",
"cuad_simpson_compuesta(x, y=y)\n",
"\n",
"cuad_simpson_compuesta(xr, y=yr)\n",
"\n",
"cuad_simpson_compuesta(x, f=f)\n",
"\n",
"from scipy import integrate\n",
"\n",
"integrate.quad(f, 0, 1)\n",
"\n",
"##### Another option would be to give the interval and the function, and keep shrinking the norm of the partition until the error is smaller than some given value.\n",
"\n",
"def cuad_simpson_compuesta_II(f, I, eps):\n",
"    \"\"\"Implementation of the composite Simpson rule\n",
"    \n",
"    Parameters\n",
"    ----------\n",
"    f: function (1 parameter)\n",
"        The function to integrate\n",
"    I: list\n",
"        Integration interval, given as a two-element list\n",
"    eps: float\n",
"        Tolerance: refinement stops once successive approximations\n",
"        differ by less than eps\n",
"    \n",
"    Returns\n",
"    -------\n",
"    aprox: Approximation of the integral by the composite Simpson rule\n",
"    \n",
"    Notes\n",
"    -----\n",
"    This code is part of the course \"Computación\", Famaf\n",
"    Usage: \n",
"        cuad_simpson_compuesta_II(f, I, eps)\n",
"    \"\"\"\n",
"    import numpy as np\n",
"\n",
"    delta = 2*eps\n",
"    n = 3  # odd, so the Simpson point pairs cover the whole interval\n",
"    aprox_old = (I[1]-I[0])*f((I[1]+I[0])/2)\n",
"\n",
"    while delta > eps:\n",
"        x = np.linspace(I[0], I[1], n)\n",
"        aprox = cuad_simpson_compuesta(x, f=f)\n",
"        delta = abs(aprox - aprox_old)\n",
"        aprox_old = aprox\n",
"        n += 10\n",
"        if n>5000:\n",
"            break\n",
"\n",
"    return aprox\n",
"\n",
"I = cuad_simpson_compuesta_II(f, [0, 1], 1.e-6)\n",
"\n",
"I"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.5"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
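The notebook above builds up the midpoint, trapezoidal and Simpson rules. As a quick standalone cross-check against the exact value of the integral of x² over [0, 1] (which is 1/3), the three single-interval rules can be sketched in plain Python; the function names below are illustrative and not part of the notebook:

```python
def midpoint(a, b, f):
    # Midpoint rule: (b - a) * f((a + b)/2)
    return (b - a) * f((a + b) / 2)

def trapezoid(a, b, f):
    # Trapezoidal rule: (b - a)/2 * (f(a) + f(b))
    return (b - a) / 2 * (f(a) + f(b))

def simpson(a, b, f):
    # Simpson's rule: (b - a)/6 * (f(a) + 4*f(midpoint) + f(b))
    m = (a + b) / 2
    return (b - a) / 6 * (f(a) + 4 * f(m) + f(b))

f = lambda x: x**2  # exact integral over [0, 1] is 1/3

print(midpoint(0, 1, f))   # 0.25
print(trapezoid(0, 1, f))  # 0.5
print(simpson(0, 1, f))    # 0.3333333333333333
```

Simpson's rule is exact for polynomials of degree up to 3, which is why the single-interval result already matches 1/3, while the midpoint and trapezoidal rules under- and over-estimate a convex integrand.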
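The notebook's `contar_argumentos` decorator dispatches on how many arguments the caller passed. The same pattern in isolation, as a minimal sketch (`count_args` and `report` are illustrative names, not from the notebook):

```python
import functools

def count_args(func):
    # Forward the number of arguments the caller actually supplied,
    # mirroring the notebook's contar_argumentos decorator
    @functools.wraps(func)
    def inner(*args, **kwargs):
        return func(*args, **kwargs, nargs_in=len(args) + len(kwargs))
    return inner

@count_args
def report(a, b=None, c=None, nargs_in=None):
    return nargs_in

print(report(1))       # 1
print(report(1, 2))    # 2
print(report(1, c=3))  # 2
```

Because keyword arguments are counted too, dispatching on `nargs_in` only distinguishes call styles reliably when the data values are passed positionally.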
e53e00889d9b721d960324b40c736638b1863f32 | 7311 | py | Python | assets/footer.py | ForestMars/Coda.to | 55e99a8fb1867738e0bb2292461fa2bf3a7770f7 | ["DOC", "MIT"]
# footer.py - module to dynamically generate a static footer.
__version__ = '0.1'
__all__ = ['get_footer', 'get_footer_links', 'get_footer_tos']
import dash_html_components as html
import dash_dangerously_set_inner_html
def get_footer():
footer = html.Div([
dash_dangerously_set_inner_html.DangerouslySetInnerHTML('''
<!--=== Footer Version 1 ===-->
<div class="footer-v1">
<div class="footer">
<div class="container">
<div class="row">
<!-- About -->
<div class="col-md-3 md-margin-bottom-40">
<a href="/">
<!-- img id="logo-footer" class="footer-logo" src="logo.png" alt="" width="200px" -->
<br />
<!-- img id="logo-footer" class="footer-logo" src="logo.png" alt="" width="200px" -->
<div id="footer-logo-svg" style="text-align:center;">
<img height="100%" id="logo-footer" class="footer-logo" src="assets/img/covid_circle.jpg" title="Covid Data Tools" alt="Covid Data Tools" />
</div>
</a>
</div><!--/col-md-3-->
<!-- End About -->
<!-- Link List -->
<div class="col-md-3">
</div><!--/col-md-3-->
<!-- Link List -->
<div class="col-md-3">
<div class="headline"><h2>Links</h2></div>
<ul class="list-unstyled link-list">
<li><a href="/about">About Us</a><i class="fa fa-angle-right"></i></li>
<!-- li><a href="/scope-and-sequence">Contribute</a><i class="fa fa-angle-right"></i></li -->
<li><a href="terms-of-service">Terms of Use</a><i class="fa fa-angle-right"></i></li>
<li><a href="privacy-policy">Privacy Policy</a><i class="fa fa-angle-right"></i></li>
<li><a href="/sitemap">Site Map</a><i class="fa fa-angle-right"></i></li>
</ul>
</div><!--/col-md-3-->
<!-- End Link List -->
<!-- Address -->
<div class="col-md-3 map-img md-margin-bottom-40">
<div class="headline"><h2>Contact</h2></div>
<address class="md-margin-bottom-40">
<a href="tel:3476887501" title="347-688-7501" class="">Phone</a><br />
<a href="mailto:coronapocalypse@gmail.com" class="">Email</a><br />
<a href="https://github.com/ForestMars/Coda.to" class="">Github</a><br />
<a href="http://twitter.com/codatatools" class="">Social</a>
</address>
</div><!--/col-md-3-->
<!-- End Address -->
</div>
</div>
</div><!--/footer-->
<div class="copyright">
<div class="container">
<div class="row">
<div class="col-md-18">
<p color="#ff0000"><br /><small><i>
© 2020 Coda.to, All Rights Reserved. “Coda.to” is a registered trademark of Coda Computing LLC. Some Patents Pending.<a name="ngss-tm"></a> Covid Data Tools are free to use; however, no warranty whatsoever is intended or implied. Data sources are explicitly stated and no responsibility for the accuracy of the data is assumed. These tools are provided for collaboration and insight.
</i></small></p>
</div>
</div>
</div>
</div><!--/copyright-->
</div>
<!--=== End Footer Version 1 ===-->
'''),
],
)
return footer
def get_footer_links():
footer = html.Div([
dash_dangerously_set_inner_html.DangerouslySetInnerHTML('''
<!--=== Footer Version 1 ===-->
<div class="footer-v1">
<div class="footer">
<div class="container">
<div class="row">
<!-- About -->
<div class="col-md-3 md-margin-bottom-40">
<a href="/">
<!-- img id="logo-footer" class="footer-logo" src="logo.png" alt="" width="200px" -->
<div id="footer-logo-svg">
<img height="100%" id="logo-footer" class="footer-logo" src="dk-bkgrnd.svg" alt="">
</div>
</a>
</div><!--/col-md-3-->
<!-- End About -->
<!-- Link List -->
<div class="col-md-3 md-margin-bottom-40">
<div class="headline"><h2>Links</h2></div>
<ul class="list-unstyled link-list">
<li><a href="/about">About Us</a><i class="fa fa-angle-right"></i></li>
<!-- li><a href="/scope-and-sequence">Contribute</a><i class="fa fa-angle-right"></i></li -->
<li><a href="terms-of-service">Terms of Use</a><i class="fa fa-angle-right"></i></li>
<li><a href="privacy-policy">Privacy Policy</a><i class="fa fa-angle-right"></i></li>
<li><a href="/sitemap">Site Map</a><i class="fa fa-angle-right"></i></li>
</ul>
</div><!--/col-md-3-->
<!-- End Link List -->
<!-- Address -->
<div class="col-md-3 map-img md-margin-bottom-40">
<div class="headline"><h2>Contact</h2></div>
<address class="md-margin-bottom-40">
<a href="tel:3476887501" title="347-688-7501" class="">Phone</a><br />
<a href="mailto:coronapocalypse@gmail.com" class="">Email</a><br />
<a href="https://github.com/ForestMars/Coda.to" class="">Github</a><br />
<a href="http://twitter.com/codatatools" class="">Social</a>
</address>
</div><!--/col-md-3-->
<!-- End Address -->
</div>
</div>
</div><!--/footer-->
<!--=== Link Footer above /// TOS Footer below ===-->
'''),
],
)
return footer
def get_footer_tos():
footer = html.Div([
dash_dangerously_set_inner_html.DangerouslySetInnerHTML('''
<div class="copyright">
<div class="container">
<div class="row">
<div class="col-md-18">
<p color="#ff0000"><br /><small><i>
© 2020 Coda.to, All Rights Reserved. “Coda.to” is a registered trademark of Coda Computing LLC. Some Patents Pending.<a name="ngss-tm"></a> Covid Data Tools are free to use; however, no warranty whatsoever is intended or implied. Data sources are explicitly stated and no responsibility for the accuracy of the data is assumed. These tools are provided for collaboration and insight.
</i></small></p>
</div>
</div>
</div>
</div><!--/copyright-->
</div>
<!--=== End Footer Version 1 ===-->
'''),
],
)
return footer
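The Links/Contact markup above is repeated nearly verbatim across the three footer builders. A sketch of generating the contact anchors from data instead (`contact_links` is a hypothetical helper, not part of this module):

```python
CONTACTS = [
    ("tel:3476887501", "Phone"),
    ("mailto:coronapocalypse@gmail.com", "Email"),
    ("https://github.com/ForestMars/Coda.to", "Github"),
    ("http://twitter.com/codatatools", "Social"),
]

def contact_links(contacts):
    # One anchor per contact, separated by <br /> as in the footer markup
    return "<br />\n".join(
        '<a href="{0}" class="">{1}</a>'.format(href, label)
        for href, label in contacts
    )

print(contact_links(CONTACTS).count("<a href="))  # 4
```

Each footer variant could then interpolate the generated string into its HTML template, keeping the contact list in one place.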
e54be1e6a29867303e0f6984d21ed633b7a0ad9e | 10108 | py | Python | JigsawPuzzles/Lozenge_variableCuts_SinglePass.py | Sequynth/Lasercuts | 88bdf91f60f51592f63328a08c7adf6f74618718 | ["MIT"]
# Johannes Fischer
# 28.12.2017
import math
import SVGtools
import numpy as np
def DrawLine(x1,y1,x2,y2,c):
global id
f.write('<path id="path_{0:d}" style="fill:none;fill-rule:evenodd;stroke:#{5:s};stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" d="M {1:.2f},{2:.2f} {3:.2f},{4:.2f}"/>\n'.format(id,x1,y1,x2,y2,c));
id = id+1;
return;
def RoundedRect(x0,y0,w,h,r,c):
# w: width
# h: height
# r: radius in all 4 corners
# c: color of rectangle
# start with top left corner
global id
f.write('<path id="path_{0:d}" style="fill:none;fill-rule:evenodd;stroke:#{1:s};stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" \
d="M {2:f},{3:f} H {4:f} A {5:f} {5:f} 0 0 1 {6:f} {7:f} V {8:f} A {5:f} {5:f} 0 0 1 {9:f} {10:f} H {11:f} A {5:f} {5:f} 0 0 1 {12:f} {13:f} V {14:f} A {5:f} {5:f} 0 0 1 {15:f} {16:f}"/>'\
.format(id,c,x0+r,y0,x0+w-r,r,x0+w,y0+r,y0+h-r,x0+w-r,y0+h,x0+r,x0,y0+h-r,y0+r,x0+r,y0));
    id = id+1;
    return;
f = open('Lozenge_variableCuts_singlePass.svg','w');
# size of workbed
Y = 600;
X = 1210;
#write svg header
SVGtools.SVGheader(f,Y,X);
# cell length
a = 10;#mm
# margin
mx = 15;#mm
my = 15 + a*math.sqrt(2)/2;#mm
# number of elements along one side
N = 8;
p = np.array([[ 8, 6, 2, 2, 2, 2, 3, 3],
[ 8, 6, 2, 2, 2, 2, 3, 0],
[ 8, 6, 6, 2, 2, 2, 3, 3],
[ 8, 8, 6, 9, 2, 2, 3, 0],
[ 8, 1, 8, 6, 9, 2, 9, 3],
[ 1, 1, 8, 6, 9, 9, 3, 0],
[ 1,13, 1, 1, 6, 9, 9 ,9],
[ 1,13, 1, 1, 6, 1, 9, 0],
[ 1,13,13, 1, 6, 1, 9, 4],
[ 1,13,13, 1, 1, 9, 4, 0],
[ 7, 1,13,13, 1, 9, 4,10],
[ 7, 1,13, 7, 1, 4,10, 0],
[11, 7,13, 7, 4, 4, 4,10],
[11, 7, 7, 4, 4, 4,10, 0],
[11,11, 7, 4, 4, 4,10,10]]);
# p is used as a 15x8 array (rows step in half-cells), so no transpose is applied
id = 0;
# draw inner structure
for ii in range(1,N-1):
for jj in range(1,N-1):
# top left
if p[2*ii][jj] != p[2*ii-1][jj-1]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+jj*a*math.sqrt(2), my+ii*a*math.sqrt(2), mx+(jj+0.5)*a*math.sqrt(2), my+(ii-0.5)*a*math.sqrt(2),c);
# top right
if p[2*ii][jj] != p[2*ii-1][jj]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+(jj+0.5)*a*math.sqrt(2), my+(ii-0.5)*a*math.sqrt(2), mx+(jj+1)*a*math.sqrt(2), my+ii*a*math.sqrt(2),c);
# bottom left
if p[2*ii][jj] != p[2*ii+1][jj-1]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+jj*a*math.sqrt(2), my+ii*a*math.sqrt(2), mx+(jj+0.5)*a*math.sqrt(2), my+(ii+0.5)*a*math.sqrt(2),c);
# bottom right
if p[2*ii][jj] != p[2*ii+1][jj]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+(jj+0.5)*a*math.sqrt(2), my+(ii+0.5)*a*math.sqrt(2), mx+(jj+1)*a*math.sqrt(2), my+ii*a*math.sqrt(2),c);
jj = 0;
for ii in range(0,N):
c = '0000ff';
# top left
DrawLine(mx+jj*a*math.sqrt(2), my+ii*a*math.sqrt(2), mx+(jj+0.5)*a*math.sqrt(2), my+(ii-0.5)*a*math.sqrt(2),c);
# bottom left
DrawLine(mx+jj*a*math.sqrt(2), my+ii*a*math.sqrt(2), mx+(jj+0.5)*a*math.sqrt(2), my+(ii+0.5)*a*math.sqrt(2),c);
# top right
if p[2*ii][jj] != p[2*ii-1][jj]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+(jj+0.5)*a*math.sqrt(2), my+(ii-0.5)*a*math.sqrt(2), mx+(jj+1)*a*math.sqrt(2), my+ii*a*math.sqrt(2),c);
# bottom right
if ii == N-1 or p[2*ii][jj] != p[2*ii+1][jj]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+(jj+0.5)*a*math.sqrt(2), my+(ii+0.5)*a*math.sqrt(2), mx+(jj+1)*a*math.sqrt(2), my+ii*a*math.sqrt(2),c);
jj = N-1;
for ii in range(0,N):
c = '0000ff';
# top right
DrawLine(mx+(jj+0.5)*a*math.sqrt(2), my+(ii-0.5)*a*math.sqrt(2), mx+(jj+1)*a*math.sqrt(2), my+ii*a*math.sqrt(2),c);
# bottom right
DrawLine(mx+(jj+0.5)*a*math.sqrt(2), my+(ii+0.5)*a*math.sqrt(2), mx+(jj+1)*a*math.sqrt(2), my+(ii)*a*math.sqrt(2),c);
# top left
if p[2*ii][jj] != p[2*ii-1][jj-1]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+jj*a*math.sqrt(2), my+ii*a*math.sqrt(2), mx+(jj+0.5)*a*math.sqrt(2), my+(ii-0.5)*a*math.sqrt(2),c);
# bottom left
if ii == N-1 or p[2*ii][jj] != p[2*ii+1][jj-1]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+jj*a*math.sqrt(2), my+ii*a*math.sqrt(2), mx+(jj+0.5)*a*math.sqrt(2), my+(ii+0.5)*a*math.sqrt(2),c);
ii = 0;
for jj in range(1,N-1):
c = '0000ff';
# top left
DrawLine(mx+jj*a*math.sqrt(2), my+ii*a*math.sqrt(2), mx+(jj+0.5)*a*math.sqrt(2), my+(ii-0.5)*a*math.sqrt(2),c);
# top right
DrawLine(mx+(jj+0.5)*a*math.sqrt(2), my+(ii-0.5)*a*math.sqrt(2), mx+(jj+1)*a*math.sqrt(2), my+ii*a*math.sqrt(2),c);
# bottom left
if p[2*ii][jj] != p[2*ii+1][jj-1]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+jj*a*math.sqrt(2), my+ii*a*math.sqrt(2), mx+(jj+0.5)*a*math.sqrt(2), my+(ii+0.5)*a*math.sqrt(2),c);
# bottom right
if p[2*ii][jj] != p[2*ii+1][jj]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+(jj+0.5)*a*math.sqrt(2), my+(ii+0.5)*a*math.sqrt(2), mx+(jj+1)*a*math.sqrt(2), my+ii*a*math.sqrt(2),c);
ii = N-1;
for jj in range(1,N-1):
c = '0000ff';
# bottom left
DrawLine(mx+jj*a*math.sqrt(2), my+ii*a*math.sqrt(2), mx+(jj+0.5)*a*math.sqrt(2), my+(ii+0.5)*a*math.sqrt(2),c);
# bottom right
DrawLine(mx+(jj+0.5)*a*math.sqrt(2), my+(ii+0.5)*a*math.sqrt(2), mx+(jj+1)*a*math.sqrt(2), my+(ii)*a*math.sqrt(2),c);
# top left
if p[2*ii][jj] != p[2*ii-1][jj-1]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+jj*a*math.sqrt(2), my+ii*a*math.sqrt(2), mx+(jj+0.5)*a*math.sqrt(2), my+(ii-0.5)*a*math.sqrt(2),c);
# top right
if p[2*ii][jj] != p[2*ii-1][jj]:
c = '0000ff';
else:
c = '000000';
DrawLine(mx+(jj+0.5)*a*math.sqrt(2), my+(ii-0.5)*a*math.sqrt(2), mx+(jj+1)*a*math.sqrt(2), my+ii*a*math.sqrt(2),c);
RoundedRect(0,0,2*mx+N*a*math.sqrt(2), 2*my+(N-1)*a*math.sqrt(2),10,'ff0000');
#f.write('<path style="fill:none;fill-rule:evenodd;stroke:#0000ff;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" d="M 4,44 H 39 V 59 H 29 v -5 h 5 V 49 H 24 V 64 H 59 V 79 H 49 v -5 h 5 V 69 H 44 V 84 H 79 V 69 H 69 v 5 h 5 v 5 H 64 V 64 H 99 V 49 H 89 v 5 h 5 v 5 H 84 V 29 h 10 v 5 h -5 v 5 H 99 V 24" id="path1088"/><path style="fill:none;fill-rule:evenodd;stroke:#0000ff;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" d="M 79,84 V 99 H 69 v -5 h 5 V 89 H 64 v 30 h 10 v -5 h -5 v -5 h 10 v 30 H 69 v -5 h 5 v -5 H 64 v 30 h 10 v -5 h -5 v -5 h 10 v 30 H 69 v -5 h 5 v -5 H 64 v 30 h 10 v -5 h -5 v -5 h 10 v 30 H 69 v -5 h 5 v -5 H 64 v 30 h 10 v -5 h -5 v -5 h 10 v 15 H 44 v 15 h 10 v -5 h -5 v -5 h 10 v 30 H 49 v -5 h 5 v -5 H 44 v 30 h 10 v -5 h -5 v -5 h 10 v 15" id="path1090"/><path style="fill:none;fill-rule:evenodd;stroke:#0000ff;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" d="m 64,304 v -15 h 10 v 5 h -5 v 5 H 79 V 269 H 69 v 5 h 5 v 5 H 64 v -15 h 35 v 15 H 89 v -5 h 5 v -5 H 84 v 15 h 35 v -15 h -10 v 5 h 5 v 5 h -10 v -30 h 10 v 5 h -5 v 5 h 10 v -30 h -10 v 5 h 5 v 5 h -10 v -15 h 35 v 15 h -10 v -5 h 5 v -5 h -10 v 15 h 35 v -15 h -10 v 5 h 5 v 5 h -10 v -30 h 10 v 5 h -5 v 5 h 10 v -15 h -35 v -15 h 10 v 5 h -5 v 5 h 10 v -15 h -35 v -15 h 10 v 5 h -5 v 5 h 10 V 164 H 84 v 15 h 10 v -5 h -5 v -5 h 10 v 15 H 64" id="path1092"/><path style="fill:none;fill-rule:evenodd;stroke:#0000ff;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" d="m 184,24 v 15 h 10 v -5 h -5 v -5 h 10 v 30 h -10 v -5 h 5 v -5 h -10 v 30 h 10 v -5 h -5 v -5 h 10 V 84 H 164 V 69 h 10 v 5 h -5 v 5 h 10 V 64 h -35 v 15 h 10 v -5 h -5 v -5 h 10 v 30 h -10 v -5 h 5 v -5 h -10 v 30 h 10 v -5 h -5 v -5 h 10 v 15 h -35 v 15 h 10 v -5 h -5 v -5 h 10 v 15 h -35 v -15 h 10 v 5 h -5 v 5 h 10 V 124 H 84 v -15 h 10 v 5 h -5 v 5 H 99 V 104 H 64" id="path1094"/><path 
# style="fill:none;fill-rule:evenodd;stroke:#0000ff;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" d="m 144,224 h 35 v -15 h -10 v 5 h 5 v 5 h -10 v -15 h 35 v -15 h -10 v 5 h 5 v 5 h -10 v -30 h 10 v 5 h -5 v 5 h 10 v -30 h -10 v 5 h 5 v 5 h -10 v -15 h 35 v -15 h -10 v 5 h 5 v 5 h -10 v -30 h 10 v 5 h -5 v 5 h 10 V 104 H 184 V 89 h 10 v 5 h -5 v 5 h 10 V 84" id="path1096"/><path style="fill:none;fill-rule:evenodd;stroke:#0000ff;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" d="M 319,84 H 284 V 69 h 10 v 5 h -5 v 5 h 10 V 64 h -35 v 15 h 10 v -5 h -5 v -5 h 10 V 84 H 244 V 69 h 10 v 5 h -5 v 5 h 10 V 64 H 224 V 49 h 10 v 5 h -5 v 5 h 10 V 44 h -35 v 15 h 10 v -5 h -5 v -5 h 10 v 15 h -35" id="path1098"/><path style="fill:none;fill-rule:evenodd;stroke:#0000ff;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" d="m 219,144 v 15 h -10 v -5 h 5 v -5 h -10 v 30 h 10 v -5 h -5 v -5 h 10 v 30 h -10 v -5 h 5 v -5 h -10 v 30 h 10 v -5 h -5 v -5 h 10 v 30 h -10 v -5 h 5 v -5 h -10 v 30 h 10 v -5 h -5 v -5 h 10 v 15 h -35 v 15 h 10 v -5 h -5 v -5 h 10 v 30 h -10 v -5 h 5 v -5 h -10 v 15" id="path1100"/><path style="fill:none;fill-rule:evenodd;stroke:#0000ff;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" d="m 319,164 h -35 v -15 h 10 v 5 h -5 v 5 h 10 v -15 h -35 v -15 h 10 v 5 h -5 v 5 h 10 v -15 h -35 v 15 h 10 v -5 h -5 v -5 h 10 v 15 h -35 v -15 h 10 v 5 h -5 v 5 h 10 v -15 h -35" id="path1102"/><path style="fill:none;fill-rule:evenodd;stroke:#0000ff;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" d="m 204,204 h 35 v 15 h -10 v -5 h 5 v -5 h -10 v 15 h 35 v 15 h -10 v -5 h 5 v -5 h -10 v 15 h 35 v -15 h -10 v 5 h 5 v 5 h -10 v -15 h 35 v 15 h -10 v -5 h 5 v -5 h -10 v 15 h 35" id="path1104"/>');
f.write('</svg>');
f.close();
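The drawing loops above place each lozenge cell on a lattice with spacing a·√2 and build it from four diagonal edges around its left corner. The corner geometry can be checked in isolation; `cell_corners` below is an illustrative helper using the script's margins and cell length, not part of the script itself:

```python
import math

def cell_corners(ii, jj, a=10, mx=15, my=15 + 10 * math.sqrt(2) / 2):
    # Left, top, right and bottom corners of cell (ii, jj), using the
    # same mx/my margins and cell length a as the script
    s = a * math.sqrt(2)
    left = (mx + jj * s, my + ii * s)
    top = (mx + (jj + 0.5) * s, my + (ii - 0.5) * s)
    right = (mx + (jj + 1) * s, my + ii * s)
    bottom = (mx + (jj + 0.5) * s, my + (ii + 0.5) * s)
    return left, top, right, bottom

left, top, right, bottom = cell_corners(0, 0)
# Each edge of the lozenge has length a: hypot(s/2, s/2) = s/sqrt(2) = a
print(round(math.hypot(top[0] - left[0], top[1] - left[1]), 9))  # 10.0
```

This is why adjacent cells share edges exactly: the corner of one cell coincides with the left corner of its right-hand neighbour.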
e5992087720fb5936d1b3d28ae2da96e1d72b5d0 | 6026 | py | Python | d10-python/python-tests/test_lab.py | vstroebel/d10 | 2b2539fd728e42b1c1b126c2b90377a2c262adb0 | ["Apache-2.0", "MIT"]
import unittest
from d10 import Lab
from d10 import Lch
delta = 0.0001
class TestLab(unittest.TestCase):
def assertChannelValue(self, first, second):
self.assertAlmostEqual(first, second, delta=delta)
def test_new(self):
color = Lab(1.0, 0.666, 0.333, 0.5)
self.assertChannelValue(color.l, 1.0)
self.assertChannelValue(color.a, 0.666)
self.assertChannelValue(color.b, 0.333)
self.assertChannelValue(color.alpha, 0.5)
def test_setters(self):
color = Lab(0.1, 0.3, 0.5, 0.7)
self.assertChannelValue(color.l, 0.1)
self.assertChannelValue(color.a, 0.3)
self.assertChannelValue(color.b, 0.5)
self.assertChannelValue(color.alpha, 0.7)
color.l = 0.2
color.a = 0.4
color.b = 0.6
color.alpha = 0.8
self.assertChannelValue(color.l, 0.2)
self.assertChannelValue(color.a, 0.4)
self.assertChannelValue(color.b, 0.6)
self.assertChannelValue(color.alpha, 0.8)
def test_with_channels(self):
color = Lab(0.0, 0.0, 0.0, 0.0)
self.assertChannelValue(color.l, 0.0)
self.assertChannelValue(color.a, 0.0)
self.assertChannelValue(color.b, 0.0)
self.assertChannelValue(color.alpha, 0.0)
color = color.with_l(1.0)
self.assertChannelValue(color.l, 1.0)
self.assertChannelValue(color.a, 0.0)
self.assertChannelValue(color.b, 0.0)
self.assertChannelValue(color.alpha, 0.0)
color = color.with_a(1.0)
self.assertChannelValue(color.l, 1.0)
self.assertChannelValue(color.a, 1.0)
self.assertChannelValue(color.b, 0.0)
self.assertChannelValue(color.alpha, 0.0)
color = color.with_b(1.0)
self.assertChannelValue(color.l, 1.0)
self.assertChannelValue(color.a, 1.0)
self.assertChannelValue(color.b, 1.0)
self.assertChannelValue(color.alpha, 0.0)
color = color.with_alpha(1.0)
self.assertChannelValue(color.l, 1.0)
self.assertChannelValue(color.a, 1.0)
self.assertChannelValue(color.b, 1.0)
self.assertChannelValue(color.alpha, 1.0)
def test_conversion(self):
color = Lab(0.5, 0.6, 0.4, 0.1)
color = color.to_rgb().to_lab()
self.assertChannelValue(color.l, 0.5)
self.assertChannelValue(color.a, 0.6)
self.assertChannelValue(color.b, 0.4)
self.assertChannelValue(color.alpha, 0.1)
def test_lab_types(self):
self.assertEqual(Lab(1, 1, 1, 1).type_name, "lab<D65,O2>")
self.assertEqual(Lab(1, 1, 1, 1, 'D65', '2').type_name, "lab<D65,O2>")
self.assertEqual(Lab(1, 1, 1, 1, 'D65', '10').type_name, "lab<D65,O10>")
self.assertEqual(Lab(1, 1, 1, 1, 'D50', '2').type_name, "lab<D50,O2>")
self.assertEqual(Lab(1, 1, 1, 1, 'D50', '10').type_name, "lab<D50,O10>")
self.assertEqual(Lab(1, 1, 1, 1, 'E', '2').type_name, "lab<E,O2>")
self.assertEqual(Lab(1, 1, 1, 1, 'E', '10').type_name, "lab<E,O10>")
class TestLch(unittest.TestCase):
def assertChannelValue(self, first, second):
self.assertAlmostEqual(first, second, delta=delta)
def test_new(self):
color = Lch(1.0, 0.666, 0.333, 0.5)
self.assertChannelValue(color.l, 1.0)
self.assertChannelValue(color.c, 0.666)
self.assertChannelValue(color.h, 0.333)
self.assertChannelValue(color.alpha, 0.5)
def test_setters(self):
color = Lch(0.1, 0.3, 0.5, 0.7)
self.assertChannelValue(color.l, 0.1)
self.assertChannelValue(color.c, 0.3)
self.assertChannelValue(color.h, 0.5)
self.assertChannelValue(color.alpha, 0.7)
color.l = 0.2
color.c = 0.4
color.h = 0.6
color.alpha = 0.8
self.assertChannelValue(color.l, 0.2)
self.assertChannelValue(color.c, 0.4)
self.assertChannelValue(color.h, 0.6)
self.assertChannelValue(color.alpha, 0.8)
def test_with_channels(self):
color = Lch(0.0, 0.0, 0.0, 0.0)
self.assertChannelValue(color.l, 0.0)
self.assertChannelValue(color.c, 0.0)
self.assertChannelValue(color.h, 0.0)
self.assertChannelValue(color.alpha, 0.0)
color = color.with_l(1.0)
self.assertChannelValue(color.l, 1.0)
self.assertChannelValue(color.c, 0.0)
self.assertChannelValue(color.h, 0.0)
self.assertChannelValue(color.alpha, 0.0)
color = color.with_c(1.0)
self.assertChannelValue(color.l, 1.0)
self.assertChannelValue(color.c, 1.0)
self.assertChannelValue(color.h, 0.0)
self.assertChannelValue(color.alpha, 0.0)
color = color.with_h(1.0)
self.assertChannelValue(color.l, 1.0)
self.assertChannelValue(color.c, 1.0)
self.assertChannelValue(color.h, 1.0)
self.assertChannelValue(color.alpha, 0.0)
color = color.with_alpha(1.0)
self.assertChannelValue(color.l, 1.0)
self.assertChannelValue(color.c, 1.0)
self.assertChannelValue(color.h, 1.0)
self.assertChannelValue(color.alpha, 1.0)
def test_conversion(self):
color = Lch(0.5, 0.6, 0.4, 0.1)
color = color.to_rgb().to_lch()
self.assertChannelValue(color.l, 0.5)
self.assertChannelValue(color.c, 0.6)
self.assertChannelValue(color.h, 0.4)
self.assertChannelValue(color.alpha, 0.1)
def test_lch_types(self):
self.assertEqual(Lch(1, 1, 1, 1).type_name, "lch<D65,O2>")
self.assertEqual(Lch(1, 1, 1, 1, 'D65', '2').type_name, "lch<D65,O2>")
self.assertEqual(Lch(1, 1, 1, 1, 'D65', '10').type_name, "lch<D65,O10>")
self.assertEqual(Lch(1, 1, 1, 1, 'D50', '2').type_name, "lch<D50,O2>")
self.assertEqual(Lch(1, 1, 1, 1, 'D50', '10').type_name, "lch<D50,O10>")
self.assertEqual(Lch(1, 1, 1, 1, 'E', '2').type_name, "lch<E,O2>")
self.assertEqual(Lch(1, 1, 1, 1, 'E', '10').type_name, "lch<E,O10>")
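The seven near-identical `type_name` assertions in each `test_*_types` method could be collapsed with `unittest`'s `subTest`, which reports every failing case instead of stopping at the first. A sketch against a stand-in `type_name` function (the `d10` package is not assumed importable here):

```python
import unittest

def type_name(illuminant="D65", observer="2", space="lch"):
    # Stand-in for d10's type_name property, only to illustrate subTest
    return "{0}<{1},O{2}>".format(space, illuminant, observer)

class TestTypeNames(unittest.TestCase):
    def test_lch_types(self):
        cases = [("D65", "2"), ("D65", "10"), ("D50", "2"),
                 ("D50", "10"), ("E", "2"), ("E", "10")]
        for ill, obs in cases:
            with self.subTest(illuminant=ill, observer=obs):
                self.assertEqual(type_name(ill, obs),
                                 "lch<{0},O{1}>".format(ill, obs))

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestTypeNames)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```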
# File: python/openlattice/api/authorizations_api.py (repo: openlattice/api-clients, license: Apache-2.0)
# coding: utf-8
"""
OpenLattice API
OpenLattice API # noqa: E501
The version of the OpenAPI document: 0.0.1
Contact: support@openlattice.com
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from openlattice.api_client import ApiClient
from openlattice.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class AuthorizationsApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def check_authorizations(self, access_check, **kwargs): # noqa: E501
"""Check authorizations # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.check_authorizations(access_check, async_req=True)
>>> result = thread.get()
:param access_check: (required)
:type access_check: AccessCheck
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[Authorization]
"""
kwargs['_return_http_data_only'] = True
return self.check_authorizations_with_http_info(access_check, **kwargs) # noqa: E501
def check_authorizations_with_http_info(self, access_check, **kwargs): # noqa: E501
"""Check authorizations # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.check_authorizations_with_http_info(access_check, async_req=True)
>>> result = thread.get()
:param access_check: (required)
:type access_check: AccessCheck
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[Authorization], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'access_check'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method check_authorizations" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'access_check' is set
if self.api_client.client_side_validation and ('access_check' not in local_var_params or # noqa: E501
local_var_params['access_check'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `access_check` when calling `check_authorizations`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'access_check' in local_var_params:
body_params = local_var_params['access_check']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['http_auth', 'openlattice_auth'] # noqa: E501
return self.api_client.call_api(
'/datastore/authorizations', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[Authorization]', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def get_accessible_objects(self, **kwargs): # noqa: E501
"""Returns paged results for all authorized objects of specified objectType, that the current user has specified permission for. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_accessible_objects(async_req=True)
>>> result = thread.get()
:param object_type:
:type object_type: str
:param permission:
:type permission: str
:param paging_token:
:type paging_token: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: AuthorizedObjectsSearchResult
"""
kwargs['_return_http_data_only'] = True
return self.get_accessible_objects_with_http_info(**kwargs) # noqa: E501
def get_accessible_objects_with_http_info(self, **kwargs): # noqa: E501
"""Returns paged results for all authorized objects of specified objectType, that the current user has specified permission for. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_accessible_objects_with_http_info(async_req=True)
>>> result = thread.get()
:param object_type:
:type object_type: str
:param permission:
:type permission: str
:param paging_token:
:type paging_token: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(AuthorizedObjectsSearchResult, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'object_type',
'permission',
'paging_token'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_accessible_objects" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'object_type' in local_var_params and local_var_params['object_type'] is not None: # noqa: E501
query_params.append(('objectType', local_var_params['object_type'])) # noqa: E501
if 'permission' in local_var_params and local_var_params['permission'] is not None: # noqa: E501
query_params.append(('permission', local_var_params['permission'])) # noqa: E501
if 'paging_token' in local_var_params and local_var_params['paging_token'] is not None: # noqa: E501
query_params.append(('pagingToken', local_var_params['paging_token'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['http_auth', 'openlattice_auth'] # noqa: E501
return self.api_client.call_api(
'/datastore/authorizations', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AuthorizedObjectsSearchResult', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
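Both generated methods above repeat the same keyword-argument validation loop before building the request. An illustrative stdlib-only sketch of that pattern — `validate_kwargs` is a made-up helper name, not part of the openlattice client:

```python
# Illustrative re-implementation of the keyword-argument check that each
# generated *_with_http_info method performs before calling the API.
def validate_kwargs(method_name, kwargs, all_params):
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )
    return dict(kwargs)

# Recognized kwargs pass through unchanged; unknown ones raise immediately,
# which surfaces typos at the call site instead of silently dropping them.
params = validate_kwargs(
    "check_authorizations",
    {"async_req": True, "_request_timeout": (3, 10)},
    ["access_check", "async_req", "_return_http_data_only",
     "_preload_content", "_request_timeout", "_request_auth"],
)
```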
# File: src/__init__.py (repo: marcojob/BQ40Z50_Analyzerx, license: MIT)
from . import ev2300
from . import bq40z50
# File: replacements.py (repo: dreamPathsProjekt/yamlsub, license: MIT)
import yaml
from helpers import replace_key, replace_property, escape_url_encode
def replace_yaml(field, original_value, replacement_value, filename):
    replacement_value = escape_url_encode(replacement_value)
    with open(filename) as file:
        data = yaml.safe_load(file)
    replace_key(field, original_value, replacement_value, data)
    with open(filename, mode='w') as file:
        yaml.dump(data, file)
def replace_application_properties(field, original_value, replacement_value, filename):
    replacement_value = escape_url_encode(replacement_value)
    with open(filename) as file:
        data = file.readlines()
    new_data = replace_property(field, original_value, replacement_value, data, '=')
    with open(filename, mode='w') as file:
        file.writelines(new_data)
def replace_ini(field, original_value, replacement_value, filename):
    replacement_value = escape_url_encode(replacement_value)
    with open(filename) as file:
        data = file.readlines()
    new_data = replace_property(field, original_value, replacement_value, data, ' = ')
    with open(filename, mode='w') as file:
        file.writelines(new_data)
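`replace_property` comes from a local `helpers` module that is not shown here. A plausible stdlib-only sketch of the line-based key/value substitution these wrappers rely on — the real helper's signature and behavior may differ:

```python
# Hypothetical sketch of helpers.replace_property: rewrite the value of
# `field` only on lines where it currently equals `original_value`.
def replace_property_sketch(field, original_value, replacement_value, lines, sep):
    out = []
    for line in lines:
        key, found, value = line.rstrip("\n").partition(sep)
        if found and key == field and value == original_value:
            out.append("%s%s%s\n" % (key, sep, replacement_value))
        else:
            out.append(line)  # keep non-matching lines untouched
    return out

lines = ["db.host=localhost\n", "db.port=5432\n"]
new_lines = replace_property_sketch(
    "db.host", "localhost", "db.example.com", lines, "=")
```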
# File: RL_based_ATSC/multi-intersection/Network/map.py (repo: sue04206/traffic-signal-optimization, license: Apache-2.0)
import os
import torch
from xml.etree.ElementTree import parse
from gen_net import Network
from configs import EXP_CONFIGS
class MapNetwork(Network):
def __init__(self, configs):
super().__init__(configs)
self.configs = configs
self.tl_rl_list = list()
self.offset_list = list()
self.phase_list = list()
self.common_phase = list()
self.net_file_path = os.path.join(
self.configs['current_path'], 'Network', self.configs['load_file_name']+'.net.xml')
self.rou_file_path = os.path.join(
self.configs['current_path'], 'Network', self.configs['load_file_name']+'.rou.xml')
def get_tl_from_add_xml(self):
add_file_path = os.path.join(
self.configs['current_path'], 'Network', self.configs['load_file_name']+'.add.xml')
NET_CONFIGS = dict()
NET_CONFIGS['phase_num_actions'] = {2: [[0, 0], [1, -1]],
3: [[0, 0, 0], [1, 0, -1], [1, -1, 0], [0, 1, -1], [-1, 0, 1], [0, -1, 1], [-1, 1, 0]],
4: [[1, 0, 0, -1], [1, 0, -1, 0], [1, -1, 0, 0], [0, 1, 0, -1], [0, 1, -1, 0], [0, 0, 1, -1], [0, 0, 0, 0],
[-1, 0, 0, 1], [0, -1, 0, 1], [0, -1, 1, 0], [-1, 1, 0, 0], [-1, 0, 1, 0], [0, 0, -1, 1],
[1, 1, -1, -1], [1, -1, 1, -1], [-1, 1, 1, -1], [-1, -1, 1, 1], [-1, 1, -1, 1],[1,-1,-1,1]],
5: [[0, 0, 0, 0, 0]],
6: [[0, 0, 0, 0, 0, 0]], }
NET_CONFIGS['phase_type'] = list()
NET_CONFIGS['rate_action_space'] = dict()
        for i in range(2, 7):  # set up the rate action space
NET_CONFIGS['rate_action_space'][i] = len(
NET_CONFIGS['phase_num_actions'][i])
NET_CONFIGS['tl_period'] = list()
traffic_info = dict()
print(add_file_path)
add_net_tree = parse(add_file_path)
tlLogicList = add_net_tree.findall('tlLogic')
NET_CONFIGS['time_action_space'] = list()
        # store traffic info
for tlLogic in tlLogicList:
tl_id = tlLogic.attrib['id']
traffic_info[tl_id] = dict()
traffic_node_info = traffic_info[tl_id]
            traffic_node_info['min_phase'] = list()
            traffic_node_info['phase_duration'] = list()
            traffic_node_info['max_phase'] = list()
            # register the RL agent
            self.tl_rl_list.append(tlLogic.attrib['id'])  # traffic lights controlled by RL
            # store the offset
traffic_node_info['offset'] = int(tlLogic.attrib['offset'])
self.offset_list.append(traffic_node_info['offset'])
            # collect every phase
phaseList = tlLogic.findall('phase')
phase_state_list = list()
phase_duration_list = list()
common_phase_list = list()
phase_index_list = list()
min_duration_list = list()
max_duration_list = list()
dif_max_list = list()
dif_min_list = list()
            tl_period = 0  # total length of the phase set
            # find the duration of each phase, etc.
            num_phase = 0  # used to filter the phase count
for i, phase in enumerate(phaseList):
phase_state_list.append(phase.attrib['state'])
phase_duration_list.append(int(phase.attrib['duration']))
tl_period += int(phase.attrib['duration'])
                if int(phase.attrib['duration']) > 5:  # durations above this count as a real phase
num_phase += 1
min_duration_list.append(int(phase.attrib['minDur']))
max_duration_list.append(int(phase.attrib['maxDur']))
dif_max_list.append(
(int(phase.attrib['maxDur'])-int(phase.attrib['duration']))/100.0)
dif_min_list.append(
(int(phase.attrib['duration'])-int(phase.attrib['minDur']))/100.0)
phase_index_list.append(i)
common_phase_list.append(int(phase.attrib['duration']))
            # collect into the dictionary
traffic_node_info['phase_list'] = phase_state_list
traffic_node_info['phase_duration'] = phase_duration_list
traffic_node_info['common_phase'] = common_phase_list
traffic_node_info['phase_index'] = phase_index_list
traffic_node_info['dif_min'] = dif_min_list
traffic_node_info['dif_max'] = dif_max_list
            # period of each signal
traffic_node_info['period'] = tl_period
NET_CONFIGS['tl_period'].append(tl_period)
traffic_node_info['matrix_actions'] = NET_CONFIGS['phase_num_actions'][num_phase]
traffic_node_info['min_phase'] = min_duration_list
traffic_node_info['max_phase'] = max_duration_list
traffic_node_info['num_phase'] = num_phase
            # set the time_action_space of each tl_rl
            # NET_CONFIGS['time_action_space'].append(abs(round((torch.min(torch.tensor(traffic_node_info['max_phase'])-torch.tensor(
            # traffic_node_info['common_phase']), torch.tensor(traffic_node_info['common_phase'])-torch.tensor(traffic_node_info['min_phase']))/2).mean().item())))
            NET_CONFIGS['time_action_space'].append(4)  # fixed number of seconds
if 'grid' in self.configs['network']:
NET_CONFIGS['phase_type'].append([0, 0])
self.phase_list.append(phase_state_list)
self.common_phase.append(phase_duration_list)
if 'dunsan' in self.configs['network']:
NET_CONFIGS['phase_type'] = [[0, 0], [0, 0], [0, 1], [
1, 0], [1, 0], [1, 1], [1, 1], [1, 1], [0, 1], [1, 0]]
        # TODO: generate the node interest pair calculator in the network base
maximum = 0
for key in traffic_info.keys():
if maximum < len(traffic_info[key]['phase_duration']):
maximum = len(traffic_info[key]['phase_duration'])
NET_CONFIGS['max_phase_num'] = maximum
        # for roads
        # store edge info
        self.configs['edge_info'] = list()
        edge_list = list()  # used to check that an edge exists
net_tree = parse(self.net_file_path)
edges = net_tree.findall('edge')
for edge in edges:
if 'function' not in edge.attrib.keys():
edge_list.append({
'id': edge.attrib['id'],
'from': edge.attrib['from'],
'to': edge.attrib['to'],
})
                self.configs['edge_info'].append(edge.attrib['id'])  # store every edge
        # store node info
self.configs['node_info'] = list()
node_list = list()
# interest list
interest_list = list()
# node interest pair
node_interest_pair = dict()
junctions = net_tree.findall('junction')
        # determines the state space size
        inflow_size = 0
        # for the network
for junction in junctions:
node_id = junction.attrib['id']
if junction.attrib['type'] == "traffic_light": # 정상 node만 분리, 신호등 노드
node_list.append({
'id': node_id,
'type': junction.attrib['type'],
})
                if node_id in self.tl_rl_list:  # store only the traffic lights being trained
i = 0
interests = list()
for edge in edge_list:
interest = dict()
if edge['to'] == node_id: # inflow
interest['id'] = node_id+'_{}'.format(i)
interest['inflow'] = edge['id']
for target_edge in edge_list:
if target_edge['from'] == edge['to'] and target_edge['to'] == edge['from']:
interest['outflow'] = target_edge['id']
break
else:
interest['outflow'] = None
interests.append(interest)
                            i += 1  # used for indexing
elif edge['from'] == node_id:
interest['id'] = node_id+'_{}'.format(i)
interest['outflow'] = edge['id']
for target_edge in edge_list:
if target_edge['from'] == edge['to'] and target_edge['to'] == edge['from']:
interest['inflow'] = target_edge['id']
break
else:
interest['inflow'] = None
interests.append(interest)
                            i += 1  # used for indexing
                    # check for duplicates, then insert into the list
no_dup_outflow_list = list()
no_dup_interest_list = list()
for interest_comp in interests:
if interest_comp['outflow'] not in no_dup_outflow_list:
no_dup_outflow_list.append(
interest_comp['outflow'])
no_dup_interest_list.append(interest_comp)
interest_list.append(no_dup_interest_list)
node_interest_pair[node_id] = no_dup_interest_list
if inflow_size < len(no_dup_interest_list):
inflow_size = len(no_dup_interest_list)
            # ordinary node
            elif junction.attrib['type'] == "priority":  # keep only valid nodes
node_list.append({
'id': node_id,
'type': junction.attrib['type'],
})
else:
pass
self.configs['node_info'].append({
'id': node_id,
'type': junction.attrib['type'],
})
        # wrap up
NET_CONFIGS['edge_info'] = self.configs['edge_info']
NET_CONFIGS['node_info'] = self.configs['node_info']
NET_CONFIGS['traffic_node_info'] = traffic_info
NET_CONFIGS['interest_list'] = interest_list
NET_CONFIGS['node_interest_pair'] = node_interest_pair
NET_CONFIGS['tl_rl_list'] = self.tl_rl_list
NET_CONFIGS['offset'] = self.offset_list
NET_CONFIGS['phase_list'] = self.phase_list
NET_CONFIGS['common_phase'] = self.common_phase
        NET_CONFIGS['state_space'] = inflow_size*2 + \
            2+2  # left turn + straight; 2 for the phase-set shape, 2 for phase duration margins (min, max)
print("Agent Num:{}, Traffic Num:{}".format(
len(self.tl_rl_list), len(node_list)))
return NET_CONFIGS
def get_tl_from_xml(self):
if os.path.exists(os.path.join(self.configs['current_path'], 'Network', self.configs['load_file_name']+'.add.xml')):
print("additional exists")
return self.get_tl_from_add_xml()
else:
NET_CONFIGS = dict()
NET_CONFIGS['phase_type'] = list()
NET_CONFIGS['phase_num_actions'] = {2: [[0, 0], [1, -1], [-1, 1]],
3: [[0, 0, 0], [1, 0, -1], [1, -1, 0], [0, 1, -1], [-1, 0, 1], [0, -1, 1], [-1, 1, 0]],
4: [[1, 0, 0, -1], [1, 0, -1, 0], [1, -1, 0, 0], [0, 1, 0, -1], [0, 1, -1, 0], [0, 0, 1, -1], [0, 0, 0, 0],
[-1, 0, 0, 1], [0, -1, 0, 1], [0, -1, 1, 0], [-1, 1, 0, 0], [-1, 0, 1, 0], [0, 0, -1, 1],
[1, 1, -1, -1], [1, -1, 1, -1], [-1, 1, 1, -1], [-1, -1, 1, 1], [-1, 1, -1, 1],[1,-1,-1,1]],
# 5: [[0, 0, 0, 0, 0]],
# 6: [[0, 0, 0, 0, 0, 0]],
}
NET_CONFIGS['rate_action_space'] = dict()
            for i in NET_CONFIGS['phase_num_actions'].keys():  # set up the rate action space
NET_CONFIGS['rate_action_space'][i] = len(
NET_CONFIGS['phase_num_actions'][i])
NET_CONFIGS['tl_period'] = list()
traffic_info = dict()
net_tree = parse(self.net_file_path)
tlLogicList = net_tree.findall('tlLogic')
NET_CONFIGS['time_action_space'] = list()
if 'dunsan' == self.configs['network']:
NET_CONFIGS['phase_type'] = [[0, 0], [0, 0], [0, 1], [
1, 0], [1, 0], [1, 1], [1, 1], [1, 1], [0, 1], [1, 0]]
            # store traffic info
for tlLogic in tlLogicList:
if 'grid' in self.configs['network']:
NET_CONFIGS['phase_type'].append([0, 0])
tl_id = tlLogic.attrib['id']
traffic_info[tl_id] = dict()
traffic_node_info = traffic_info[tl_id]
                traffic_node_info['min_phase'] = list()
                traffic_node_info['phase_duration'] = list()
                traffic_node_info['max_phase'] = list()
                # register the RL agent
                self.tl_rl_list.append(tlLogic.attrib['id'])  # traffic lights controlled by RL
                # store the offset
traffic_node_info['offset'] = int(tlLogic.attrib['offset'])
self.offset_list.append(traffic_node_info['offset'])
                # collect every phase
phaseList = tlLogic.findall('phase')
phase_state_list = list()
phase_duration_list = list()
common_phase_list = list()
phase_index_list = list()
min_duration_list = list()
max_duration_list = list()
dif_min_list = list()
dif_max_list = list()
                tl_period = 0  # total length of the phase set
                # find the duration of each phase, etc.
                num_phase = 0  # used to filter the phase count
for i, phase in enumerate(phaseList):
phase_state_list.append(phase.attrib['state'])
this_phase_dur = phase.attrib['duration']
phase_duration_list.append(int(this_phase_dur))
tl_period += int(this_phase_dur)
                    # durations above this count as a real phase
if int(this_phase_dur) > 5 and 'minDur' in phase.attrib.keys() and 'maxDur' in phase.attrib.keys():
num_phase += 1
min_duration_list.append(
int(phase.attrib['minDur']))
max_duration_list.append(
int(phase.attrib['maxDur']))
dif_max_list.append(
(int(phase.attrib['maxDur'])-int(phase.attrib['duration']))/100.0)
dif_min_list.append(
(int(phase.attrib['duration'])-int(phase.attrib['minDur']))/100.0)
phase_index_list.append(i)
common_phase_list.append(int(this_phase_dur))
elif int(this_phase_dur) > 5:
num_phase += 1
min_duration_list.append(
int(this_phase_dur)-5)
max_duration_list.append(
int(this_phase_dur)+5)
dif_max_list.append(5)
dif_min_list.append(5)
phase_index_list.append(i)
common_phase_list.append(int(this_phase_dur))
                # collect into the dictionary
traffic_node_info['phase_list'] = phase_state_list
traffic_node_info['phase_duration'] = phase_duration_list
traffic_node_info['common_phase'] = common_phase_list
traffic_node_info['phase_index'] = phase_index_list
                traffic_node_info['dif_max'] = dif_max_list  # difference from max duration
                traffic_node_info['dif_min'] = dif_min_list  # difference from min duration
                # period of each signal
traffic_node_info['period'] = tl_period
NET_CONFIGS['tl_period'].append(tl_period)
traffic_node_info['matrix_actions'] = NET_CONFIGS['phase_num_actions'][num_phase]
traffic_node_info['min_phase'] = min_duration_list
traffic_node_info['max_phase'] = max_duration_list
traffic_node_info['num_phase'] = num_phase
                # set the time_action_space of each tl_rl
# NET_CONFIGS['time_action_space'].append(abs(round((torch.min(torch.tensor(traffic_node_info['max_phase'])-torch.tensor(
# traffic_node_info['common_phase']), torch.tensor(traffic_node_info['common_phase'])-torch.tensor(traffic_node_info['min_phase'])).float()).mean().item())))
NET_CONFIGS['time_action_space'].append(4)
self.phase_list.append(phase_state_list)
self.common_phase.append(phase_duration_list)
            # TODO: generate the node interest pair calculator in the network base
maximum = 0
for key in traffic_info.keys():
if maximum < len(traffic_info[key]['phase_duration']):
maximum = len(traffic_info[key]['phase_duration'])
NET_CONFIGS['max_phase_num'] = maximum
            # for roads
            # store edge info
self.configs['edge_info'] = list()
edges = net_tree.findall('edge')
for edge in edges:
if 'function' not in edge.attrib.keys():
self.configs['edge_info'].append({
'id': edge.attrib['id'],
'from': edge.attrib['from'],
'to': edge.attrib['to'],
})
            # store node info
self.configs['node_info'] = list()
node_list = list()
# interest list
interest_list = list()
# node interest pair
node_interest_pair = dict()
junctions = net_tree.findall('junction')
            # determines the state space size
            inflow_size = 0
            # for the network
for junction in junctions:
node_id = junction.attrib['id']
if junction.attrib['type'] == "traffic_light": # 정상 node만 분리, 신호등 노드
node_list.append({
'id': node_id,
'type': junction.attrib['type'],
})
                    # node selection done
                    # now the edges
i = 0
interests = list()
for edge in self.configs['edge_info']:
interest = dict()
if edge['to'] == node_id: # inflow
interest['id'] = node_id+'_{}'.format(i)
interest['inflow'] = edge['id']
for tmpEdge in self.configs['edge_info']: # outflow
if tmpEdge['from'] == node_id and edge['from'] == tmpEdge['to']:
interest['outflow'] = tmpEdge['id']
break
else:
interest['outflow'] = None
# tmp_edge=str(-int(edge['id']))
# if tmp_edge in edge_list:
# interest['outflow']=tmp_edge
# else:
# interest['outflow']=None
interests.append(interest)
                            i += 1  # used for indexing
elif edge['from'] == node_id:
interest['id'] = node_id+'_{}'.format(i)
interest['outflow'] = edge['id']
for tmpEdge in self.configs['edge_info']: # outflow
if tmpEdge['to'] == node_id and edge['to'] == tmpEdge['from']:
interest['inflow'] = tmpEdge['id']
break
else:
interest['inflow'] = None
# tmp_edge=str(-int(edge['id']))
# if tmp_edge in edge_list:
# interest['inflow']=tmp_edge
# else:
# interest['inflow']=None
interests.append(interest)
                            i += 1  # used for indexing
                    # check for duplicates, then insert into the list
no_dup_outflow_list = list()
no_dup_interest_list = list()
for interest_comp in interests:
if interest_comp['outflow'] not in no_dup_outflow_list:
no_dup_outflow_list.append(
interest_comp['outflow'])
no_dup_interest_list.append(interest_comp)
interest_list.append(no_dup_interest_list)
node_interest_pair[node_id] = no_dup_interest_list
if inflow_size < len(no_dup_interest_list):
inflow_size = len(no_dup_interest_list)
                # ordinary node
                elif junction.attrib['type'] == "priority":  # keep only valid nodes
node_list.append({
'id': node_id,
'type': junction.attrib['type'],
})
else:
pass
self.configs['node_info'].append({
'id': node_id,
'type': junction.attrib['type'],
})
            # wrap up
NET_CONFIGS['node_info'] = self.configs['node_info']
NET_CONFIGS['edge_info'] = self.configs['edge_info']
NET_CONFIGS['traffic_node_info'] = traffic_info
NET_CONFIGS['interest_list'] = interest_list
NET_CONFIGS['node_interest_pair'] = node_interest_pair
NET_CONFIGS['tl_rl_list'] = self.tl_rl_list
NET_CONFIGS['offset'] = self.offset_list
NET_CONFIGS['phase_list'] = self.phase_list
NET_CONFIGS['common_phase'] = self.common_phase
            # left turn + straight; 2 for phase type (one-hot), 2 for phase duration margins (min, max)
NET_CONFIGS['state_space'] = inflow_size*2+2+2
return NET_CONFIGS
def gen_net_from_xml(self):
net_tree = parse(self.net_file_path)
if self.configs['mode'] == 'train' or self.configs['mode'] == 'test':
gen_file_name = str(os.path.join(self.configs['current_path'], 'training_data',
self.configs['time_data'], 'net_data', self.configs['time_data']+'.net.xml'))
net_tree.write(gen_file_name, encoding='UTF-8',
xml_declaration=True)
else: # simulate
gen_file_name = str(os.path.join(
self.configs['current_path'], 'Net_data', self.configs['time_data']+'.net.xml'))
net_tree.write(gen_file_name, encoding='UTF-8',
xml_declaration=True)
def gen_rou_from_xml(self):
net_tree = parse(self.rou_file_path)
if self.configs['mode'] == 'train' or self.configs['mode'] == 'test':
gen_file_name = str(os.path.join(self.configs['current_path'], 'training_data',
self.configs['time_data'], 'net_data', self.configs['time_data']+'.rou.xml'))
net_tree.write(gen_file_name, encoding='UTF-8',
xml_declaration=True)
else:
gen_file_name = str(os.path.join(self.configs['current_path'], 'Net_data',
self.configs['time_data']+'.rou.xml'))
net_tree.write(gen_file_name, encoding='UTF-8',
xml_declaration=True)
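The tlLogic traversal in `get_tl_from_xml` can be exercised on an inline snippet. This is a reduced sketch of the same ElementTree pattern — the attribute names (`id`, `offset`, `duration`) match the code above, but the sample network and the `periods` structure are made up for illustration:

```python
import xml.etree.ElementTree as ET

# A tiny SUMO-style network fragment (fabricated for this example).
snippet = """<net>
  <tlLogic id="tl0" offset="5">
    <phase duration="30" state="GGrr"/>
    <phase duration="3" state="yyrr"/>
    <phase duration="30" state="rrGG"/>
  </tlLogic>
</net>"""

root = ET.fromstring(snippet)
periods = {}
for tl in root.findall("tlLogic"):
    durations = [int(p.attrib["duration"]) for p in tl.findall("phase")]
    # As in the parser above, phases of 5 s or less are treated as
    # transition (yellow) phases and not counted as real phases.
    periods[tl.attrib["id"]] = {
        "offset": int(tl.attrib["offset"]),
        "period": sum(durations),
        "num_phase": sum(1 for d in durations if d > 5),
    }
```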
| 50.134021 | 177 | 0.48394 | 2,698 | 24,315 | 4.088213 | 0.075982 | 0.017044 | 0.017951 | 0.019946 | 0.899365 | 0.882593 | 0.863917 | 0.836718 | 0.825657 | 0.802629 | 0 | 0.023027 | 0.39453 | 24,315 | 484 | 178 | 50.237603 | 0.726192 | 0.080197 | 0 | 0.792839 | 0 | 0 | 0.099614 | 0 | 0 | 0 | 0 | 0.002066 | 0 | 1 | 0.012788 | false | 0.005115 | 0.012788 | 0 | 0.035806 | 0.007673 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# tests/rules/test_truthy.py (xavierhardy/yamlfix, Apache-2.0)
import unittest
from yamllint.config import YamlLintConfig
from tests.utils import LoggingTester
from yamlfix.formatting import read_and_format_text
class TruthyRuleTest(LoggingTester):
"""truthy"""
def test_no_config(self):
expected = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
"off": "FALSE"
"""
content = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
off: FALSE
"""
output = read_and_format_text(content)
self.assertEqual(expected, output)
def test_enabled(self):
config_content = '{extends: default, rules: {truthy: "enable"}}'
expected = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
"off": "FALSE"
"""
content = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
off: FALSE
"""
output = read_and_format_text(content, YamlLintConfig(content=config_content))
self.assertEqual(expected, output)
def test_single_quote(self):
config_content = (
"{extends: default,"
" rules: {truthy: enable, quoted-strings: {quote-type: single, required: false}}}"
)
expected = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
'off': 'FALSE'
"""
content = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
off: FALSE
"""
output = read_and_format_text(content, YamlLintConfig(content=config_content))
self.assertEqual(expected, output)
def test_double_quote(self):
config_content = (
"{extends: default,"
" rules: {truthy: enable, quoted-strings: {quote-type: double, required: false}}}"
)
expected = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: "TRuE"
"off": "FALSE"
"""
content = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
off: FALSE
"""
output = read_and_format_text(content, YamlLintConfig(content=config_content))
self.assertEqual(expected, output)
def test_any_quote(self):
config_content = (
"{extends: default,"
" rules: {truthy: enable, quoted-strings: {quote-type: any, required: false}}}"
)
expected = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
'off': "FALSE"
"""
content = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
'off': FALSE
"""
output = read_and_format_text(content, YamlLintConfig(content=config_content))
self.assertEqual(expected, output)
def test_allowed_values(self):
config_content = "{extends: default, rules: {truthy: {allowed-values: ['no']}}}"
expected = """---
test42: {something469: "true"}
test43:
noquote: no
hey879: 'TRuE'
"off": "FALSE"
"""
content = """---
test42: {something469: true}
test43:
noquote: no
hey879: 'TRuE'
off: FALSE
"""
output = read_and_format_text(content, YamlLintConfig(content=config_content))
self.assertEqual(expected, output)
def test_check_keys(self):
config_content = "{extends: default, rules: {truthy: {check-keys: false}}}"
expected = """---
test42: {something469: true}
test43:
noquote: "no"
hey879: 'TRuE'
off: "FALSE"
"""
content = """---
test42: {something469: true}
test43:
noquote: no
hey879: 'TRuE'
off: FALSE
"""
output = read_and_format_text(content, YamlLintConfig(content=config_content))
self.assertEqual(expected, output)
def test_disabled(self):
config_content = "{extends: default, rules: {truthy: disable}}"
expected = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
off: FALSE
"""
content = """---
test42: {something469: true}
test43:
noquote: TRuE
hey879: 'TRuE'
off: FALSE
"""
output = read_and_format_text(content, YamlLintConfig(content=config_content))
self.assertEqual(expected, output)
if __name__ == "__main__":
unittest.main()
# StanCode Projects/boggle_game_solver/boggle.py (chengti-wang/stanCode-Projects, MIT)
# DICTIONARY VERSION
# """
# File: boggle.py
# Name:
# ----------------------------------------
# TODO:
# """
#
# import time
# from sys import exit
#
# # This is the file name of the dictionary txt file
# # we will be checking if a word exists by searching through it
# FILE = 'dictionary.txt'
# ROWS = 4
#
# word_dict = {}
#
# class TrieNode:
# def __init__(self):
# self.children = {}
# self.end = False
#
#
# class Trie:
# def __init__(self):
# self.root = TrieNode()
#
# def insert(self, word):
# cur = self.root
# for ch in word:
# if ch not in cur.children:
# cur.children[ch] = TrieNode()
# cur = cur.children[ch]
# else:
# cur = cur.children[ch]
#
# def search(self, word):
# cur = self.root
# for ch in word:
# if ch not in cur.children:
# return False
# cur = cur.children[ch]
# return cur.end
#
#
# def main():
# """
# TODO:
# """
#
# #input
# letters = []
# for i in range(ROWS):
# string = input(f"{i+1} row of letters: ").replace(" ", "")
# if len(string) != ROWS:
# print("Illegal input")
# exit(0)
# letters.append(string)
#
# start = time.time()
# d = {}
# read_dictionary(d)
# for i in range(ROWS):
# for j in range(ROWS):
# find_words(letters, letters[i][j], i, j, -1, -1, d)
# print(f"There are {len(word_dict)} in total")
# end = time.time()
# print('----------------------------------')
# print(f'The speed of your boggle algorithm: {end - start} seconds.')
#
#
# def read_dictionary(d):
# """
# This function reads file "dictionary.txt" stored in FILE
# and appends words in each line into a Python list
# """
# with open(FILE, "r") as f:
# for line in f:
# line = line.strip()
# if len(line) >= 4:
# if line[0] in d:
# d[line[0]].append(line)
# else:
# d[line[0]] = [line]
#
#
#
# def find_words(letters, current_s, row, col, prev_row, prev_col, d):
# # print(current_s)
# if len(current_s) >= 4 and current_s in d[current_s[0]]:
# if current_s not in word_dict:
# print(f"Found: {current_s}")
# word_dict[current_s] = 0
# if not has_prefix(current_s, d):
# return 0
# else:
# for i in [-1, 0, 1]:
# for j in [-1, 0, 1]:
# if 0 <= row+i < ROWS and 0 <= col+j < ROWS and not (i == 0 and j == 0) and not (row+i == prev_row and col+j ==prev_col):
# current_s += letters[row+i][col+j]
# find_words(letters, current_s, row+i, col+j, row, col, d)
# current_s = current_s[:-1]
#
#
# def has_prefix(sub_s, d):
# """
# :param sub_s: (str) A substring that is constructed by neighboring letters on a 4x4 square grid
# :return: (bool) If there is any words with prefix stored in sub_s
# """
# for key in d[sub_s[0]]:
# if key.startswith(sub_s):
# return True
#
#
#
# if __name__ == '__main__':
# main()
#
#
# """
# f y c l
# i o m g
# o r i l
# h j h u
# """
# TRIE VERSION
"""
File: boggle.py
Name:
----------------------------------------
TODO:
"""
import time
from sys import exit
# This is the file name of the dictionary txt file
# we will be checking if a word exists by searching through it
FILE = 'dictionary.txt'
ROWS = 4
word_dict = {}
class TrieNode:
def __init__(self):
self.children = {}
self.end = False
class Trie:
def __init__(self):
self.root = TrieNode()
def insert(self, word):
cur = self.root
for ch in word:
if ch not in cur.children:
cur.children[ch] = TrieNode()
cur = cur.children[ch]
else:
cur = cur.children[ch]
cur.end = True
def search(self, word):
cur = self.root
for ch in word:
if ch not in cur.children:
return False
cur = cur.children[ch]
return cur.end
def startswith(self, word):
cur = self.root
for ch in word:
if ch not in cur.children:
return False
cur = cur.children[ch]
return True
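# Illustrative semantics of the trie above (a sketch, not called anywhere):
#   t = Trie(); t.insert("fire"); t.insert("firm")
#   t.search("fire")     -> True   (complete inserted word, cur.end is True)
#   t.search("fir")      -> False  (prefix only, cur.end is False)
#   t.startswith("fir")  -> True   (valid prefix of "fire" and "firm")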
def main():
"""
TODO:
"""
#input
letters = []
for i in range(ROWS):
string = input(f"{i+1} row of letters: ").replace(" ", "")
if len(string) != ROWS:
print("Illegal input")
exit(0)
letters.append(string)
start = time.time()
d = Trie()
read_dictionary(d)
for i in range(ROWS):
for j in range(ROWS):
find_words(letters, letters[i][j], i, j, -1, -1, d)
print(f"There are {len(word_dict)} in total")
end = time.time()
print('----------------------------------')
print(f'The speed of your boggle algorithm: {end - start} seconds.')
def read_dictionary(d):
"""
This function reads file "dictionary.txt" stored in FILE
and appends words in each line into a Python list
"""
with open(FILE, "r") as f:
for line in f:
line = line.strip()
if len(line) >= 4:
d.insert(line)
def find_words(letters, current_s, row, col, prev_row, prev_col, d):
# print(current_s)
if len(current_s) >= 4 and d.search(current_s):
if current_s not in word_dict:
print(f"Found: {current_s}")
word_dict[current_s] = 0
if not d.startswith(current_s):
return 0
else:
for i in [-1, 0, 1]:
for j in [-1, 0, 1]:
if 0 <= row+i < ROWS and 0 <= col+j < ROWS and not (i == 0 and j == 0) and not (row+i == prev_row and col+j ==prev_col):
current_s += letters[row+i][col+j]
find_words(letters, current_s, row+i, col+j, row, col, d)
current_s = current_s[:-1]
def has_prefix(sub_s, d):
    """
    :param sub_s: (str) A substring that is constructed by neighboring letters on a 4x4 square grid
    :param d: (Trie) Trie built from the dictionary words
    :return: (bool) If there is any word with the prefix stored in sub_s
    """
    # In this Trie version, the dictionary-style lookup d[sub_s[0]] from the
    # old version would raise a TypeError; delegate to the trie's prefix search.
    return d.startswith(sub_s)
if __name__ == '__main__':
main()
"""
f y c l
i o m g
o r i l
h j h u
""" | 21.206897 | 126 | 0.593135 | 901 | 5,535 | 3.537181 | 0.138735 | 0.062755 | 0.036712 | 0.035143 | 0.950737 | 0.950737 | 0.950737 | 0.950737 | 0.950737 | 0.950737 | 0 | 0.011696 | 0.227642 | 5,535 | 261 | 127 | 21.206897 | 0.733801 | 0.55393 | 0 | 0.291139 | 0 | 0 | 0.090748 | 0.015051 | 0 | 0 | 0 | 0.011494 | 0 | 1 | 0.113924 | false | 0 | 0.025316 | 0 | 0.240506 | 0.063291 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
975439075345c5ec82bf7b86fc37e8fe1aeb74a0 | 70 | py | Python | day1/day1-08 operators precedence.py | hajin-kim/2020-HighSchool-Python-Tutoring | 352025a954bff37d21cc3d59e7d5e0f0269a1f17 | [
"MIT"
] | null | null | null | day1/day1-08 operators precedence.py | hajin-kim/2020-HighSchool-Python-Tutoring | 352025a954bff37d21cc3d59e7d5e0f0269a1f17 | [
"MIT"
] | null | null | null | day1/day1-08 operators precedence.py | hajin-kim/2020-HighSchool-Python-Tutoring | 352025a954bff37d21cc3d59e7d5e0f0269a1f17 | [
"MIT"
] | null | null | null | a = 5 * -1 / 5 + 4 - 3
b = 5 * -1 / (5 + 4) - 3    # parentheses force (5 + 4) first: -5 / 9 - 3
print(a)    # 0.0: * and / bind tighter than + and -, so ((5 * -1) / 5) + 4 - 3
print(b)    # about -3.556
# code/python/FactSetGeoRev/v1/fds/sdk/FactSetGeoRev/api/regions_api.py (factset/enterprise-sdk, Apache-2.0)
"""
FactSet GeoRev API
FactSet Revere Geographic Revenue (\"GeoRev\") Exposure data provides a highly structured and normalized display of companies’ revenues by geography. Using a four level taxonomy structure, understand the companies' Super-Region-->Region-->Area-->Country revenue breakdowns. Quickly understand a company’s revenue exposure in countries impacted by geopolitical, macroeconomic, and market risk. Understand the geographic footprint of a company based on sources of revenue versus country of domicile, and analyze global revenue exposures at the company, index, or portfolio level.<p> Geographic revenue has historically been difficult to analyze due to companies’ non-standard and incomplete reporting. Investors relying solely on this as-reported data are limited in their ability to compare, aggregate or screen exposures across a universe or portfolio of companies. To achieve normalization, FactSet GeoRev captures data through a proprietary four-level geographic classification structure. An estimation algorithm based on GDP weighting and accounting logic is then applied to solve for any non-explicit disclosures. The result is a consistent, accurate, and flexible dataset that can take a company’s revenues and break them down into any geographic country or region categories.</p><p>As markets become more integrated and companies expand operations beyond their domestic markets, GeoRev provides a new and valuable country factor to help investors discover alpha, model risk exposure, optimize portfolio weighting, and improve fund administration and reporting.</p><p>Data Frequency - Annual; Update Frequency - Daily. 49,000+ Publically Listed Companies. All Russell 3000 and MSCI ACWI Index Consituents. U.S. Data is available from 2003, with Non-US data from 2007. For more details, visit [OA 17555](https://my.apps.factset.com/oa/pages/17555)</p> # noqa: E501
The version of the OpenAPI document: 1.0.1
Contact: api@factset.com
Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
import sys # noqa: F401
from multiprocessing.pool import ApplyResult
import typing
from fds.sdk.FactSetGeoRev.api_client import ApiClient, Endpoint as _Endpoint
from fds.sdk.FactSetGeoRev.model_utils import ( # noqa: F401
check_allowed_values,
check_validations,
date,
datetime,
file_type,
none_type,
validate_and_convert_types
)
from fds.sdk.FactSetGeoRev.exceptions import ApiException
from fds.sdk.FactSetGeoRev.model.error_response import ErrorResponse
from fds.sdk.FactSetGeoRev.model.region_request import RegionRequest
from fds.sdk.FactSetGeoRev.model.region_response import RegionResponse
class RegionsApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
self.get_regions_endpoint = _Endpoint(
settings={
'response_type': (
{ 200: (RegionResponse,), 400: (ErrorResponse,), 401: (ErrorResponse,), 403: (ErrorResponse,), 415: (ErrorResponse,), 500: (ErrorResponse,), },
None
),
'auth': [
'FactSetApiKey',
'FactSetOAuth2'
],
'endpoint_path': '/factset-georev/v1/regions',
'operation_id': 'get_regions',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'ids',
'region_ids',
'start_date',
'end_date',
'frequency',
'currency',
],
'required': [
'ids',
],
'nullable': [
],
'enum': [
'frequency',
],
'validation': [
'ids',
'region_ids',
]
},
root_map={
'validations': {
('ids',): {
'max_items': 300,
'min_items': 1,
},
('region_ids',): {
'max_items': 15,
'min_items': 1,
},
},
'allowed_values': {
('frequency',): {
"D": "D",
"W": "W",
"M": "M",
"AM": "AM",
"CQ": "CQ",
"FQ": "FQ",
"AY": "AY",
"CY": "CY",
"FY": "FY",
"EMPTY": ""
},
},
'openapi_types': {
'ids':
([str],),
'region_ids':
([str],),
'start_date':
(str,),
'end_date':
(str,),
'frequency':
(str,),
'currency':
(str,),
},
'attribute_map': {
'ids': 'ids',
'region_ids': 'regionIds',
'start_date': 'startDate',
'end_date': 'endDate',
'frequency': 'frequency',
'currency': 'currency',
},
'location_map': {
'ids': 'query',
'region_ids': 'query',
'start_date': 'query',
'end_date': 'query',
'frequency': 'query',
'currency': 'query',
},
'collection_format_map': {
'ids': 'csv',
'region_ids': 'csv',
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client
)
self.get_regions_for_list_endpoint = _Endpoint(
settings={
'response_type': (
{ 200: (RegionResponse,), 400: (ErrorResponse,), 401: (ErrorResponse,), 403: (ErrorResponse,), 415: (ErrorResponse,), 500: (ErrorResponse,), },
None
),
'auth': [
'FactSetApiKey',
'FactSetOAuth2'
],
'endpoint_path': '/factset-georev/v1/regions',
'operation_id': 'get_regions_for_list',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'region_request',
],
'required': [
'region_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'region_request':
(RegionRequest,),
},
'attribute_map': {
},
'location_map': {
'region_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client
)
@staticmethod
def apply_kwargs_defaults(kwargs, return_http_data_only, async_req):
kwargs["async_req"] = async_req
kwargs["_return_http_data_only"] = return_http_data_only
kwargs["_preload_content"] = kwargs.get("_preload_content", True)
kwargs["_request_timeout"] = kwargs.get("_request_timeout", None)
kwargs["_check_input_type"] = kwargs.get("_check_input_type", True)
kwargs["_check_return_type"] = kwargs.get("_check_return_type", True)
kwargs["_spec_property_naming"] = kwargs.get("_spec_property_naming", False)
kwargs["_content_type"] = kwargs.get("_content_type")
kwargs["_host_index"] = kwargs.get("_host_index")
def get_regions(
self,
ids,
**kwargs
) -> RegionResponse:
"""Gets the revenue details for the requested Regions # noqa: E501
Gets the Geographic Revenue, Percents, Confidence, and Ranks for a requested list of ids and Regions, for a given start-date and end-date. Regions represent a taxonomy of Super Regions, Regions, and Areas, with Super Regions being the largest collection. *Country specific revenue can be requested in the /countries endpoint.* # noqa: E501
This method makes a synchronous HTTP request. Returns the http data only
Args:
ids ([str]): Security or Entity identifiers. FactSet Identifiers, tickers, CUSIP and SEDOL are accepted input. <p>***ids limit** = 300 per request*</p> *<p>Make note, GET Method URL request lines are also limited to a total length of 8192 bytes (8KB). In cases where the service allows for thousands of ids, which may lead to exceeding this request line limit of 8KB, its advised for any requests with large request lines to be requested through the respective \"POST\" method.</p>*
Keyword Args:
region_ids ([str]): The Regional Identifier or Regional Group. Groups include \"SUPER_REGIONS\", \"REGIONS\", and \"AREAS\". When requesting a group, all regionIds within that group will be requested. To limit or specify select regions returned in the response, provide a comma-separated list of the below regionIds. |Regional Group|regionId|Descriptions| |---|---|---| |Group|SUPER_REGIONS|Fetchs all regionIds for Super Regions| |Group|REGIONS|Fetchs all regionIds for Regions| |Group|AREAS|Fetchs all regionIds for Areas| |Level|regionId|Parent|Description|Level|regionId|Parent|Description| |---|---|---|---|---|---|---|---| |__**Super Region**__||||__**Area**__|||| |Super Region|R1|R100|Africa and Middle East|Area|R3|R2|Eastern Africa| |Super Region|R101|R100|Americas|Area|R18|R2|Southern Africa| |Super Region|R170|R100|Asia/Pacific|Area|R45|R2|Western Africa| |Super Region|R274|R100|Europe|Area|R69|R68|Central Middle East| |Super Region|R349|R100|Non-Disclosed Revenue|Area|R87|R68|Eastern Middle East| |Super Region|R354|R100|No Operations|Area|R97|R68|Western Middle East| |Super Region|R393|R100|Non-Geographic Revenue|Area|R103|R102|Caribbean| |__**Region**__||||Area|R135|R102|Central America| |Region|R2|R1|Africa|Area|R145|R102|South America| |Region|R68|R1|Middle East|Area|R165|R164|Other North America| |Region|R102|R101|Latin America|Area|R167|R164|United States and Canada| |Region|R164|R101|North America|Area|R172|R171|Far East| |Region|R171|R170|Asia|Area|R219|R171|Indian Region| |Region|R233|R170|Oceania|Area|R234|R233|Australia and New Zealand| |Region|R275|R274|European Union|Area|R237|R233|Pacific Islands| |Region|R314|R274|Non-European Union|Area|R276|R275|Eastern European Union| |Region|R350|R349|Region Unspecified|Area|R286|R275|Northern European Union| |Region|R355|R354|Non-Operating Region|Area|R292|R275|Southern European Union| |Region|R394|R393|Non-Geographic Revenue Region|Area|R298|R275|Western European Union| |Region|R398|R1|Africa and Middle East 
Unallocated Region|Area|R315|R314|Eastern Non-European Union| |Region|R401|R170|Asia/Pacific Unallocated Region|Area|R323|R314|Northern Non-European Union| |Region|R404|R101|Americas Unallocated Revenue Region|Area|R328|R314|Southern Non-European Union| |Region|R407|R274|Europe Unallocated Region|Area|R340|R314|Western Non-European Union| |||||Area|R351|R350|Area Unspecified| |||||Area|R356|R355|Non-Operating Area| |||||Area|R395|R394|Non-Geographic Revenue Area| |||||Area|R399|R398|Africa and Middle East Unallocated Area| . [optional] if omitted the server will use the default value of ["SUPER_REGIONS"]
start_date (str): The start date requested for a given date range in **YYYY-MM-DD** format. Data is available on a Fiscal Annual periodicity and updated Daily. If left blank, the API will default to latest available fiscal period. Future dates (T+1) are not accepted in this endpoint. . [optional]
end_date (str): The end date requested for a given date range in **YYYY-MM-DD** format. Data is available on a Fiscal Annual periodicity and updated daily. If left blank, the API will default to latest available fiscal period. Future dates (T+1) are not accepted in this endpoint. . [optional]
frequency (str): Controls the display frequency of the data returned. * **D** = Daily * **W** = Weekly, based on the last day of the week of the start date. * **M** = Monthly, based on the last trading day of the month. * **AM** = Monthly, based on the start date (e.g., if the start date is June 16, data is displayed for June 16, May 16, April 16 etc.). * **CQ** = Quarterly based on the last trading day of the calendar quarter (March, June, September, or December). * **FQ** = Fiscal Quarter of the company. * **AY** = Actual Annual, based on the start date. * **CY** = Calendar Annual, based on the last trading day of the calendar year. * **FY** = Fiscal Annual, based on the last trading day of the company's fiscal year. . [optional] if omitted the server will use the default value of "FY"
currency (str): Currency code for adjusting the data. For a list of currency ISO codes, visit [Online Assistant Page #1470](https://oa.apps.factset.com/pages/1470).. [optional]
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
Returns:
RegionResponse
Response Object
"""
self.apply_kwargs_defaults(kwargs=kwargs, return_http_data_only=True, async_req=False)
kwargs['ids'] = \
ids
return self.get_regions_endpoint.call_with_http_info(**kwargs)
def get_regions_with_http_info(
self,
ids,
**kwargs
) -> typing.Tuple[RegionResponse, int, typing.MutableMapping]:
"""Gets the revenue details for the requested Regions # noqa: E501
Gets the Geographic Revenue, Percents, Confidence, and Ranks for a requested list of ids and Regions, for a given start-date and end-date. Regions represent a taxonomy of Super Regions, Regions, and Areas, with Super Regions being the largest collection. *Country specific revenue can be requested in the /countries endpoint.* # noqa: E501
This method makes a synchronous HTTP request. Returns http data, http status and headers
Args:
ids ([str]): Security or Entity identifiers. FactSet Identifiers, tickers, CUSIP and SEDOL are accepted input. <p>***ids limit** = 300 per request*</p> *<p>Make note, GET Method URL request lines are also limited to a total length of 8192 bytes (8KB). In cases where the service allows for thousands of ids, which may lead to exceeding this request line limit of 8KB, its advised for any requests with large request lines to be requested through the respective \"POST\" method.</p>*
Keyword Args:
region_ids ([str]): The Regional Identifier or Regional Group. Groups include \"SUPER_REGIONS\", \"REGIONS\", and \"AREAS\". When requesting a group, all regionIds within that group will be requested. To limit or specify select regions returned in the response, provide a comma-separated list of the below regionIds. |Regional Group|regionId|Descriptions| |---|---|---| |Group|SUPER_REGIONS|Fetchs all regionIds for Super Regions| |Group|REGIONS|Fetchs all regionIds for Regions| |Group|AREAS|Fetchs all regionIds for Areas| |Level|regionId|Parent|Description|Level|regionId|Parent|Description| |---|---|---|---|---|---|---|---| |__**Super Region**__||||__**Area**__|||| |Super Region|R1|R100|Africa and Middle East|Area|R3|R2|Eastern Africa| |Super Region|R101|R100|Americas|Area|R18|R2|Southern Africa| |Super Region|R170|R100|Asia/Pacific|Area|R45|R2|Western Africa| |Super Region|R274|R100|Europe|Area|R69|R68|Central Middle East| |Super Region|R349|R100|Non-Disclosed Revenue|Area|R87|R68|Eastern Middle East| |Super Region|R354|R100|No Operations|Area|R97|R68|Western Middle East| |Super Region|R393|R100|Non-Geographic Revenue|Area|R103|R102|Caribbean| |__**Region**__||||Area|R135|R102|Central America| |Region|R2|R1|Africa|Area|R145|R102|South America| |Region|R68|R1|Middle East|Area|R165|R164|Other North America| |Region|R102|R101|Latin America|Area|R167|R164|United States and Canada| |Region|R164|R101|North America|Area|R172|R171|Far East| |Region|R171|R170|Asia|Area|R219|R171|Indian Region| |Region|R233|R170|Oceania|Area|R234|R233|Australia and New Zealand| |Region|R275|R274|European Union|Area|R237|R233|Pacific Islands| |Region|R314|R274|Non-European Union|Area|R276|R275|Eastern European Union| |Region|R350|R349|Region Unspecified|Area|R286|R275|Northern European Union| |Region|R355|R354|Non-Operating Region|Area|R292|R275|Southern European Union| |Region|R394|R393|Non-Geographic Revenue Region|Area|R298|R275|Western European Union| |Region|R398|R1|Africa and Middle East 
Unallocated Region|Area|R315|R314|Eastern Non-European Union| |Region|R401|R170|Asia/Pacific Unallocated Region|Area|R323|R314|Northern Non-European Union| |Region|R404|R101|Americas Unallocated Revenue Region|Area|R328|R314|Southern Non-European Union| |Region|R407|R274|Europe Unallocated Region|Area|R340|R314|Western Non-European Union| |||||Area|R351|R350|Area Unspecified| |||||Area|R356|R355|Non-Operating Area| |||||Area|R395|R394|Non-Geographic Revenue Area| |||||Area|R399|R398|Africa and Middle East Unallocated Area| . [optional] if omitted the server will use the default value of ["SUPER_REGIONS"]
start_date (str): The start date requested for a given date range in **YYYY-MM-DD** format. Data is available on a Fiscal Annual periodicity and updated Daily. If left blank, the API will default to latest available fiscal period. Future dates (T+1) are not accepted in this endpoint. . [optional]
end_date (str): The end date requested for a given date range in **YYYY-MM-DD** format. Data is available on a Fiscal Annual periodicity and updated daily. If left blank, the API will default to latest available fiscal period. Future dates (T+1) are not accepted in this endpoint. . [optional]
frequency (str): Controls the display frequency of the data returned. * **D** = Daily * **W** = Weekly, based on the last day of the week of the start date. * **M** = Monthly, based on the last trading day of the month. * **AM** = Monthly, based on the start date (e.g., if the start date is June 16, data is displayed for June 16, May 16, April 16 etc.). * **CQ** = Quarterly based on the last trading day of the calendar quarter (March, June, September, or December). * **FQ** = Fiscal Quarter of the company. * **AY** = Actual Annual, based on the start date. * **CY** = Calendar Annual, based on the last trading day of the calendar year. * **FY** = Fiscal Annual, based on the last trading day of the company's fiscal year. . [optional] if omitted the server will use the default value of "FY"
currency (str): Currency code for adjusting the data. For a list of currency ISO codes, visit [Online Assistant Page #1470](https://oa.apps.factset.com/pages/1470).. [optional]
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
Returns:
RegionResponse
Response Object
int
Http Status Code
dict
Dictionary of the response headers
"""
self.apply_kwargs_defaults(kwargs=kwargs, return_http_data_only=False, async_req=False)
kwargs['ids'] = \
ids
return self.get_regions_endpoint.call_with_http_info(**kwargs)
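    # Hypothetical usage sketch (not part of the generated client; the
    # identifier "IBM-US" and the default-credential ApiClient are
    # illustrative only):
    #
    #     api = RegionsApi(ApiClient())   # ApiClient() picks up the default Configuration
    #     response = api.get_regions(ids=["IBM-US"], region_ids=["SUPER_REGIONS"])
    #     print(response)                 # RegionResponse with revenue by super region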
def get_regions_async(
self,
ids,
**kwargs
) -> "ApplyResult[RegionResponse]":
"""Gets the revenue details for the requested Regions # noqa: E501
Gets the Geographic Revenue, Percents, Confidence, and Ranks for a requested list of ids and Regions, for a given start-date and end-date. Regions represent a taxonomy of Super Regions, Regions, and Areas, with Super Regions being the largest collection. *Country specific revenue can be requested in the /countries endpoint.* # noqa: E501
This method makes an asynchronous HTTP request. Returns the http data, wrapped in ApplyResult
Args:
ids ([str]): Security or Entity identifiers. FactSet Identifiers, tickers, CUSIP, and SEDOL are accepted input. <p>***ids limit** = 300 per request*</p> *<p>Note that GET method URL request lines are also limited to a total length of 8192 bytes (8KB). In cases where the service allows for thousands of ids, which may exceed this 8KB request-line limit, it is advised to send large requests through the respective \"POST\" method.</p>*
Keyword Args:
region_ids ([str]): The Regional Identifier or Regional Group. Groups include \"SUPER_REGIONS\", \"REGIONS\", and \"AREAS\". When requesting a group, all regionIds within that group will be requested. To limit or specify select regions returned in the response, provide a comma-separated list of the below regionIds.

|Regional Group|regionId|Descriptions|
|---|---|---|
|Group|SUPER_REGIONS|Fetches all regionIds for Super Regions|
|Group|REGIONS|Fetches all regionIds for Regions|
|Group|AREAS|Fetches all regionIds for Areas|

|Level|regionId|Parent|Description|Level|regionId|Parent|Description|
|---|---|---|---|---|---|---|---|
|__**Super Region**__||||__**Area**__||||
|Super Region|R1|R100|Africa and Middle East|Area|R3|R2|Eastern Africa|
|Super Region|R101|R100|Americas|Area|R18|R2|Southern Africa|
|Super Region|R170|R100|Asia/Pacific|Area|R45|R2|Western Africa|
|Super Region|R274|R100|Europe|Area|R69|R68|Central Middle East|
|Super Region|R349|R100|Non-Disclosed Revenue|Area|R87|R68|Eastern Middle East|
|Super Region|R354|R100|No Operations|Area|R97|R68|Western Middle East|
|Super Region|R393|R100|Non-Geographic Revenue|Area|R103|R102|Caribbean|
|__**Region**__||||Area|R135|R102|Central America|
|Region|R2|R1|Africa|Area|R145|R102|South America|
|Region|R68|R1|Middle East|Area|R165|R164|Other North America|
|Region|R102|R101|Latin America|Area|R167|R164|United States and Canada|
|Region|R164|R101|North America|Area|R172|R171|Far East|
|Region|R171|R170|Asia|Area|R219|R171|Indian Region|
|Region|R233|R170|Oceania|Area|R234|R233|Australia and New Zealand|
|Region|R275|R274|European Union|Area|R237|R233|Pacific Islands|
|Region|R314|R274|Non-European Union|Area|R276|R275|Eastern European Union|
|Region|R350|R349|Region Unspecified|Area|R286|R275|Northern European Union|
|Region|R355|R354|Non-Operating Region|Area|R292|R275|Southern European Union|
|Region|R394|R393|Non-Geographic Revenue Region|Area|R298|R275|Western European Union|
|Region|R398|R1|Africa and Middle East Unallocated Region|Area|R315|R314|Eastern Non-European Union|
|Region|R401|R170|Asia/Pacific Unallocated Region|Area|R323|R314|Northern Non-European Union|
|Region|R404|R101|Americas Unallocated Revenue Region|Area|R328|R314|Southern Non-European Union|
|Region|R407|R274|Europe Unallocated Region|Area|R340|R314|Western Non-European Union|
|||||Area|R351|R350|Area Unspecified|
|||||Area|R356|R355|Non-Operating Area|
|||||Area|R395|R394|Non-Geographic Revenue Area|
|||||Area|R399|R398|Africa and Middle East Unallocated Area|

[optional] if omitted the server will use the default value of ["SUPER_REGIONS"]
start_date (str): The start date requested for a given date range in **YYYY-MM-DD** format. Data is available on a Fiscal Annual periodicity and updated daily. If left blank, the API will default to the latest available fiscal period. Future dates (T+1) are not accepted in this endpoint. [optional]
end_date (str): The end date requested for a given date range in **YYYY-MM-DD** format. Data is available on a Fiscal Annual periodicity and updated daily. If left blank, the API will default to the latest available fiscal period. Future dates (T+1) are not accepted in this endpoint. [optional]
frequency (str): Controls the display frequency of the data returned. * **D** = Daily * **W** = Weekly, based on the last day of the week of the start date. * **M** = Monthly, based on the last trading day of the month. * **AM** = Monthly, based on the start date (e.g., if the start date is June 16, data is displayed for June 16, May 16, April 16, etc.). * **CQ** = Quarterly, based on the last trading day of the calendar quarter (March, June, September, or December). * **FQ** = Fiscal Quarter of the company. * **AY** = Actual Annual, based on the start date. * **CY** = Calendar Annual, based on the last trading day of the calendar year. * **FY** = Fiscal Annual, based on the last trading day of the company's fiscal year. [optional] if omitted the server will use the default value of "FY"
currency (str): Currency code for adjusting the data. For a list of currency ISO codes, visit [Online Assistant Page #1470](https://oa.apps.factset.com/pages/1470). [optional]
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
Returns:
ApplyResult[RegionResponse]
"""
self.apply_kwargs_defaults(kwargs=kwargs, return_http_data_only=True, async_req=True)
kwargs['ids'] = ids
return self.get_regions_endpoint.call_with_http_info(**kwargs)
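`get_regions_async` returns a `multiprocessing.pool.ApplyResult`, so the caller only blocks when it calls `.get()`. A dependency-free sketch of that pattern — `fake_get_regions` is a stand-in for the real endpoint call, not SDK code:

```python
from multiprocessing.pool import ThreadPool

def fake_get_regions(ids):
    # Stand-in for the SDK's endpoint call; the real method dispatches the
    # HTTP request on a worker thread and returns an ApplyResult immediately.
    return {"ids": ids, "status": "ok"}

pool = ThreadPool(processes=1)
result = pool.apply_async(fake_get_regions, (["IBM-US"],))  # an ApplyResult
response = result.get(timeout=10)  # blocks here until the worker finishes
pool.close()
pool.join()
```

Work can be scheduled between `apply_async` and `.get()`, which is the point of the async variants.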
def get_regions_with_http_info_async(
self,
ids,
**kwargs
) -> "ApplyResult[typing.Tuple[RegionResponse, int, typing.MutableMapping]]":
"""Gets the revenue details for the requested Regions # noqa: E501
Gets the Geographic Revenue, Percents, Confidence, and Ranks for a requested list of ids and Regions, for a given start-date and end-date. Regions represent a taxonomy of Super Regions, Regions, and Areas, with Super Regions being the largest collection. *Country specific revenue can be requested in the /countries endpoint.* # noqa: E501
This method makes an asynchronous HTTP request. Returns http data, http status and headers, wrapped in ApplyResult
Args:
ids ([str]): Security or Entity identifiers. FactSet Identifiers, tickers, CUSIP, and SEDOL are accepted input. <p>***ids limit** = 300 per request*</p> *<p>Note that GET method URL request lines are also limited to a total length of 8192 bytes (8KB). In cases where the service allows for thousands of ids, which may exceed this 8KB request-line limit, it is advised to send large requests through the respective \"POST\" method.</p>*
Keyword Args:
region_ids ([str]): The Regional Identifier or Regional Group. Groups include \"SUPER_REGIONS\", \"REGIONS\", and \"AREAS\". When requesting a group, all regionIds within that group will be requested. To limit or specify select regions returned in the response, provide a comma-separated list of the below regionIds.

|Regional Group|regionId|Descriptions|
|---|---|---|
|Group|SUPER_REGIONS|Fetches all regionIds for Super Regions|
|Group|REGIONS|Fetches all regionIds for Regions|
|Group|AREAS|Fetches all regionIds for Areas|

|Level|regionId|Parent|Description|Level|regionId|Parent|Description|
|---|---|---|---|---|---|---|---|
|__**Super Region**__||||__**Area**__||||
|Super Region|R1|R100|Africa and Middle East|Area|R3|R2|Eastern Africa|
|Super Region|R101|R100|Americas|Area|R18|R2|Southern Africa|
|Super Region|R170|R100|Asia/Pacific|Area|R45|R2|Western Africa|
|Super Region|R274|R100|Europe|Area|R69|R68|Central Middle East|
|Super Region|R349|R100|Non-Disclosed Revenue|Area|R87|R68|Eastern Middle East|
|Super Region|R354|R100|No Operations|Area|R97|R68|Western Middle East|
|Super Region|R393|R100|Non-Geographic Revenue|Area|R103|R102|Caribbean|
|__**Region**__||||Area|R135|R102|Central America|
|Region|R2|R1|Africa|Area|R145|R102|South America|
|Region|R68|R1|Middle East|Area|R165|R164|Other North America|
|Region|R102|R101|Latin America|Area|R167|R164|United States and Canada|
|Region|R164|R101|North America|Area|R172|R171|Far East|
|Region|R171|R170|Asia|Area|R219|R171|Indian Region|
|Region|R233|R170|Oceania|Area|R234|R233|Australia and New Zealand|
|Region|R275|R274|European Union|Area|R237|R233|Pacific Islands|
|Region|R314|R274|Non-European Union|Area|R276|R275|Eastern European Union|
|Region|R350|R349|Region Unspecified|Area|R286|R275|Northern European Union|
|Region|R355|R354|Non-Operating Region|Area|R292|R275|Southern European Union|
|Region|R394|R393|Non-Geographic Revenue Region|Area|R298|R275|Western European Union|
|Region|R398|R1|Africa and Middle East Unallocated Region|Area|R315|R314|Eastern Non-European Union|
|Region|R401|R170|Asia/Pacific Unallocated Region|Area|R323|R314|Northern Non-European Union|
|Region|R404|R101|Americas Unallocated Revenue Region|Area|R328|R314|Southern Non-European Union|
|Region|R407|R274|Europe Unallocated Region|Area|R340|R314|Western Non-European Union|
|||||Area|R351|R350|Area Unspecified|
|||||Area|R356|R355|Non-Operating Area|
|||||Area|R395|R394|Non-Geographic Revenue Area|
|||||Area|R399|R398|Africa and Middle East Unallocated Area|

[optional] if omitted the server will use the default value of ["SUPER_REGIONS"]
start_date (str): The start date requested for a given date range in **YYYY-MM-DD** format. Data is available on a Fiscal Annual periodicity and updated daily. If left blank, the API will default to the latest available fiscal period. Future dates (T+1) are not accepted in this endpoint. [optional]
end_date (str): The end date requested for a given date range in **YYYY-MM-DD** format. Data is available on a Fiscal Annual periodicity and updated daily. If left blank, the API will default to the latest available fiscal period. Future dates (T+1) are not accepted in this endpoint. [optional]
frequency (str): Controls the display frequency of the data returned. * **D** = Daily * **W** = Weekly, based on the last day of the week of the start date. * **M** = Monthly, based on the last trading day of the month. * **AM** = Monthly, based on the start date (e.g., if the start date is June 16, data is displayed for June 16, May 16, April 16, etc.). * **CQ** = Quarterly, based on the last trading day of the calendar quarter (March, June, September, or December). * **FQ** = Fiscal Quarter of the company. * **AY** = Actual Annual, based on the start date. * **CY** = Calendar Annual, based on the last trading day of the calendar year. * **FY** = Fiscal Annual, based on the last trading day of the company's fiscal year. [optional] if omitted the server will use the default value of "FY"
currency (str): Currency code for adjusting the data. For a list of currency ISO codes, visit [Online Assistant Page #1470](https://oa.apps.factset.com/pages/1470). [optional]
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
Returns:
ApplyResult[(RegionResponse, int, typing.Dict)]
"""
self.apply_kwargs_defaults(kwargs=kwargs, return_http_data_only=False, async_req=True)
kwargs['ids'] = ids
return self.get_regions_endpoint.call_with_http_info(**kwargs)
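The `ids` docstrings above cap GET requests at 300 identifiers and an 8192-byte request line, advising POST beyond that. A client-side batching guard can enforce both limits before dispatch. This is an illustrative sketch; the `overhead` allowance for the path and other query parameters is a guessed value, not an SDK constant:

```python
def chunk_ids(ids, max_ids=300, max_bytes=8192, overhead=200):
    """Split an id list into GET-safe batches.

    max_ids mirrors the documented 300-id limit; max_bytes mirrors the
    documented 8192-byte (8KB) request-line limit.
    """
    batches, current, size = [], [], overhead
    for ident in ids:
        cost = len(ident) + 1  # +1 for the comma separator
        if current and (len(current) >= max_ids or size + cost > max_bytes):
            batches.append(current)
            current, size = [], overhead
        current.append(ident)
        size += cost
    if current:
        batches.append(current)
    return batches
```

Each batch can then go to `get_regions`, or the whole list can be sent at once via the `/regions` POST variant below.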
def get_regions_for_list(
self,
region_request,
**kwargs
) -> RegionResponse:
"""Gets the revenue details for the requested Regions. Use for large lists of company ids. # noqa: E501
Gets the Geographic Revenue, Percents, Confidence, and Ranks for a requested list of ids and Regions, for a given start-date and end-date. Regions represent a taxonomy of Super Regions, Regions, and Areas, with Super Regions being the largest collection. *Country specific revenue can be requested in the /countries endpoint.* # noqa: E501
This method makes a synchronous HTTP request. Returns the http data only
Args:
region_request (RegionRequest): The Region request body, allowing the user to specify a list of ids, time range, and regionIds.
Keyword Args:
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
Returns:
RegionResponse
Response Object
"""
self.apply_kwargs_defaults(kwargs=kwargs, return_http_data_only=True, async_req=False)
kwargs['region_request'] = region_request
return self.get_regions_for_list_endpoint.call_with_http_info(**kwargs)
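The `frequency` docstring above enumerates a closed set of codes with `"FY"` as the server default. A small client-side validator built from that list can fail fast before an HTTP round trip — an illustrative sketch, not part of the generated SDK:

```python
# Display-frequency codes documented for this endpoint.
FREQUENCIES = {
    "D": "Daily",
    "W": "Weekly (last day of the week of the start date)",
    "M": "Monthly (last trading day of the month)",
    "AM": "Monthly, anchored to the start date",
    "CQ": "Quarterly (last trading day of the calendar quarter)",
    "FQ": "Fiscal Quarter of the company",
    "AY": "Actual Annual, based on the start date",
    "CY": "Calendar Annual (last trading day of the calendar year)",
    "FY": "Fiscal Annual (last trading day of the fiscal year)",
}

def validate_frequency(freq="FY"):
    """Reject unknown codes locally instead of waiting for a 400 response."""
    if freq not in FREQUENCIES:
        raise ValueError(f"frequency must be one of {sorted(FREQUENCIES)}, got {freq!r}")
    return freq
```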
def get_regions_for_list_with_http_info(
self,
region_request,
**kwargs
) -> typing.Tuple[RegionResponse, int, typing.MutableMapping]:
"""Gets the revenue details for the requested Regions. Use for large lists of company ids. # noqa: E501
Gets the Geographic Revenue, Percents, Confidence, and Ranks for a requested list of ids and Regions, for a given start-date and end-date. Regions represent a taxonomy of Super Regions, Regions, and Areas, with Super Regions being the largest collection. *Country specific revenue can be requested in the /countries endpoint.* # noqa: E501
This method makes a synchronous HTTP request. Returns http data, http status and headers
Args:
region_request (RegionRequest): The Region request body, allowing the user to specify a list of ids, time range, and regionIds.
Keyword Args:
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
Returns:
RegionResponse
Response Object
int
HTTP Status Code
dict
Dictionary of the response headers
"""
self.apply_kwargs_defaults(kwargs=kwargs, return_http_data_only=False, async_req=False)
kwargs['region_request'] = region_request
return self.get_regions_for_list_endpoint.call_with_http_info(**kwargs)
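The `*_with_http_info` variants return a `(data, HTTP status, headers)` tuple per the Returns section above. A sketch of the usual unpacking pattern — the handler name and error policy are illustrative, not SDK conventions:

```python
def handle_http_info(result):
    """Unpack the (data, status, headers) tuple returned by the
    *_with_http_info methods, failing loudly on non-2xx statuses."""
    data, status, headers = result
    if 200 <= status < 300:
        return data
    raise RuntimeError(f"unexpected HTTP status {status}; headers: {headers}")
```

This keeps status/header inspection (e.g. rate-limit headers) in one place while the rest of the code sees only the payload.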
def get_regions_for_list_async(
self,
region_request,
**kwargs
) -> "ApplyResult[RegionResponse]":
"""Gets the revenue details for the requested Regions. Use for large lists of company ids. # noqa: E501
Gets the Geographic Revenue, Percents, Confidence, and Ranks for a requested list of ids and Regions, for a given start-date and end-date. Regions represent a taxonomy of Super Regions, Regions, and Areas, with Super Regions being the largest collection. *Country specific revenue can be requested in the /countries endpoint.* # noqa: E501
This method makes an asynchronous HTTP request. Returns the http data, wrapped in ApplyResult
Args:
region_request (RegionRequest): The Region request body, allowing the user to specify a list of ids, time range, and regionIds.
Keyword Args:
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
Returns:
ApplyResult[RegionResponse]
"""
self.apply_kwargs_defaults(kwargs=kwargs, return_http_data_only=True, async_req=True)
kwargs['region_request'] = region_request
return self.get_regions_for_list_endpoint.call_with_http_info(**kwargs)
def get_regions_for_list_with_http_info_async(
self,
region_request,
**kwargs
) -> "ApplyResult[typing.Tuple[RegionResponse, int, typing.MutableMapping]]":
"""Gets the revenue details for the requested Regions. Use for large lists of company ids. # noqa: E501
Gets the Geographic Revenue, Percents, Confidence, and Ranks for a requested list of ids and Regions, for a given start-date and end-date. Regions represent a taxonomy of Super Regions, Regions, and Areas, with Super Regions being the largest collection. *Country specific revenue can be requested in the /countries endpoint.* # noqa: E501
This method makes an asynchronous HTTP request. Returns http data, http status and headers, wrapped in ApplyResult
Args:
region_request (RegionRequest): The Region request body, allowing the user to specify a list of ids, time range, and regionIds.
Keyword Args:
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
Returns:
ApplyResult[(RegionResponse, int, typing.Dict)]
"""
self.apply_kwargs_defaults(kwargs=kwargs, return_http_data_only=False, async_req=True)
kwargs['region_request'] = region_request
return self.get_regions_for_list_endpoint.call_with_http_info(**kwargs)
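The `region_ids` docstring documents that a group name expands to every regionId inside that group. A client-side sketch of that expansion, using only the Super Region ids listed in the docstring table (the `REGIONS`/`AREAS` lists are omitted here for brevity — this is illustrative, not server logic):

```python
# Group-name -> regionId expansion inferred from the docstring table.
REGION_GROUPS = {
    "SUPER_REGIONS": ["R1", "R101", "R170", "R274", "R349", "R354", "R393"],
    # "REGIONS" and "AREAS" would list their own ids from the same table.
}

def expand_region_ids(requested):
    """Replace any group name with its member regionIds; pass plain ids through."""
    expanded = []
    for rid in requested:
        expanded.extend(REGION_GROUPS.get(rid, [rid]))
    return expanded
```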
# --- pcstac/tests/resources/test_queryables.py (gadomski/planetary-computer-apis, MIT) ---
import pytest
@pytest.mark.asyncio
async def test_queryables(app_client):
resp = await app_client.get("/queryables")
assert resp.status_code == 200
properties = resp.json()["properties"]
assert "id" in properties
assert "datetime" in properties
assert "naip:year" in properties
assert "naip:state" in properties
@pytest.mark.asyncio
async def test_queryables_io_lulc(app_client):
resp = await app_client.get("/queryables")
assert resp.status_code == 200
properties = resp.json()["properties"]
assert "id" in properties
assert "datetime" in properties
assert "naip:year" in properties
assert "naip:state" in properties
@pytest.mark.asyncio
async def test_collection_queryables_io_lulc(app_client):
resp = await app_client.get("/collections/io-lulc/queryables")
assert resp.status_code == 200
properties = resp.json()["properties"]
assert "id" in properties
assert "datetime" in properties
assert "io:supercell_id" in properties
@pytest.mark.asyncio
async def test_collection_queryables_naip(app_client):
resp = await app_client.get("/collections/naip/queryables")
assert resp.status_code == 200
properties = resp.json()["properties"]
assert "id" in properties
assert "datetime" in properties
assert "naip:year" in properties
assert "naip:state" in properties
@pytest.mark.asyncio
async def test_collection_queryables_404(app_client):
resp = await app_client.get("/collections/does-not-exist/queryables")
assert resp.status_code == 404
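The assertions above imply that a `/queryables` response carries a JSON-Schema-style `properties` mapping. A hypothetical minimal document consistent with these tests (the titles and types are guesses, not taken from the pcstac source):

```python
# Hypothetical shape of a /queryables payload that would satisfy the tests.
queryables = {
    "$schema": "https://json-schema.org/draft/2019-09/schema",
    "type": "object",
    "properties": {
        "id": {"title": "Item ID", "type": "string"},
        "datetime": {"title": "Datetime", "type": "string"},
        "naip:year": {"title": "Year", "type": "string"},
        "naip:state": {"title": "State", "type": "string"},
    },
}

# The same membership checks the tests perform:
for key in ("id", "datetime", "naip:year", "naip:state"):
    assert key in queryables["properties"]
```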
# --- functional_tests/test_login_page.py (HelloMelanieC/FiveUp, MIT) ---
from fuauth.models import User
from .utils import SeleniumTestCase
class LoginTest(SeleniumTestCase):
def setUp(self):
User.objects.create_user(
"Melanie",
"6192222222",
User.ATT,
User.HAWAII,
email="test@gmail.com",
password="testpants",
)
def test_successful_login(self):
with self.wait_for_page_load():
self.browser.get(self.live_server_url + "/login/")
self.browser.find_element_by_name("username").send_keys("test@gmail.com")
self.browser.find_element_by_name("password").send_keys("testpants")
with self.wait_for_page_load():
self.browser.find_element_by_css_selector("*[type=submit]").click()
text = self.browser.find_element_by_tag_name("body").text
self.assertIn("Hi " + "Melanie" + ".", text)
def test_unsuccessful_login(self):
self.browser.get(self.live_server_url + "/login/")
self.browser.find_element_by_name("username").send_keys("test@gmail.com")
self.browser.find_element_by_name("password").send_keys("notgonnawork")
with self.wait_for_page_load():
self.browser.find_element_by_css_selector("*[type=submit]").click()
text = self.browser.find_element_by_tag_name("body").text
self.assertIn("Please enter a correct your email address and password", text)
class ForgotPasswordTest(SeleniumTestCase):
def setUp(self):
User.objects.create_user(
"Melanie",
"6192222222",
User.ATT,
User.HAWAII,
email="test@gmail.com",
password="testpants",
)
def test_existing_user(self):
with self.wait_for_page_load():
self.browser.get(self.live_server_url + "/password_reset/")
self.browser.find_element_by_name("email").send_keys("test@gmail.com")
with self.wait_for_page_load():
self.browser.find_element_by_css_selector("*[type=submit]").click()
text = self.browser.find_element_by_tag_name("body").text
self.assertIn("Password reset sent", text)
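Each test wraps navigation in `self.wait_for_page_load()`. The real helper on `SeleniumTestCase` likely wraps `WebDriverWait` with a `staleness_of(old_body)` check; this is a generic, dependency-free sketch of that context-manager shape, not the actual implementation:

```python
import time
from contextlib import contextmanager

@contextmanager
def wait_for_page_load(check_done, timeout=10.0, poll=0.01):
    """Perform the navigation inside the `with` block, then poll until the
    new page is ready. check_done stands in for Selenium's staleness check."""
    yield  # the caller clicks or submits here
    deadline = time.monotonic() + timeout
    while not check_done():
        if time.monotonic() > deadline:
            raise TimeoutError("page did not load in time")
        time.sleep(poll)

state = {"loaded": False}
with wait_for_page_load(lambda: state["loaded"]):
    state["loaded"] = True  # stand-in for a click that triggers a page load
```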
# --- biobb_analysis/test/unitests/test_ambertools/test_cpptraj_rmsf.py (bioexcel/biobb_analysis, Apache-2.0) ---
from biobb_common.tools import test_fixtures as fx
from biobb_analysis.ambertools.cpptraj_rmsf import cpptraj_rmsf
class TestCpptrajRmsfFirst():
def setUp(self):
fx.test_setup(self,'cpptraj_rmsf_first')
def tearDown(self):
fx.test_teardown(self)
pass
def test_rmsf_first(self):
cpptraj_rmsf(properties=self.properties, **self.paths)
assert fx.not_empty(self.paths['output_cpptraj_path'])
assert fx.equal(self.paths['output_cpptraj_path'], self.paths['ref_output_cpptraj_path'])
class TestCpptrajRmsfAverage():
def setUp(self):
fx.test_setup(self,'cpptraj_rmsf_average')
def tearDown(self):
fx.test_teardown(self)
pass
def test_rmsf_average(self):
cpptraj_rmsf(properties=self.properties, **self.paths)
assert fx.not_empty(self.paths['output_cpptraj_path'])
assert fx.equal(self.paths['output_cpptraj_path'], self.paths['ref_output_cpptraj_path'])
class TestCpptrajRmsfExperimental():
def setUp(self):
fx.test_setup(self,'cpptraj_rmsf_experimental')
def tearDown(self):
fx.test_teardown(self)
pass
def test_rmsf_experimental(self):
cpptraj_rmsf(properties=self.properties, **self.paths)
assert fx.not_empty(self.paths['output_cpptraj_path'])
assert fx.equal(self.paths['output_cpptraj_path'], self.paths['ref_output_cpptraj_path'])
# --- mocat/src/tests/test_kernels.py (SamDuffield/mocat, MIT) ---
########################################################################################################################
# Module: tests/test_kernels.py
# Description: Tests for kernels
#
# Web: https://github.com/SamDuffield/mocat
########################################################################################################################
import unittest
import jax.numpy as jnp
import numpy.testing as npt
from mocat.src import kernels
class TestGaussianKernel(unittest.TestCase):
kernel = kernels.Gaussian()
def test_call(self):
npt.assert_array_almost_equal(self.kernel(jnp.zeros(5), jnp.zeros(5)), 1.)
npt.assert_array_almost_equal(self.kernel(jnp.zeros(5), jnp.ones(5)), 0.082085006)
def test_grad_x(self):
npt.assert_array_almost_equal(self.kernel.grad_x(jnp.zeros(5), jnp.zeros(5)), jnp.zeros(5))
npt.assert_array_almost_equal(self.kernel.grad_x(jnp.zeros(5), jnp.ones(5)), jnp.ones(5) * 0.082085006)
def test_grad_y(self):
npt.assert_array_almost_equal(self.kernel.grad_y(jnp.zeros(5), jnp.zeros(5)), jnp.zeros(5))
npt.assert_array_almost_equal(self.kernel.grad_y(jnp.zeros(5), jnp.ones(5)), jnp.ones(5) * -0.082085006)
class TestIMQKernel(unittest.TestCase):
kernel = kernels.IMQ()
def test_call(self):
npt.assert_array_almost_equal(self.kernel(jnp.zeros(5), jnp.zeros(5)), 1.)
npt.assert_array_almost_equal(self.kernel(jnp.zeros(5), jnp.ones(5)), 0.5345225)
def test_grad_x(self):
npt.assert_array_almost_equal(self.kernel.grad_x(jnp.zeros(5), jnp.zeros(5)), jnp.zeros(5))
npt.assert_array_almost_equal(self.kernel.grad_x(jnp.zeros(5), jnp.ones(5)), jnp.ones(5) * 0.07636035)
def test_grad_y(self):
npt.assert_array_almost_equal(self.kernel.grad_y(jnp.zeros(5), jnp.zeros(5)), jnp.zeros(5))
npt.assert_array_almost_equal(self.kernel.grad_y(jnp.zeros(5), jnp.ones(5)), jnp.ones(5) * -0.07636035)
if __name__ == '__main__':
unittest.main()
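For reference, the expected constants in the tests above are consistent with the textbook forms of these kernels: a Gaussian RBF with unit bandwidth and an inverse multi-quadric with c = 1, beta = -1/2. This is a stand-alone sketch; the parameter names are illustrative, not mocat's API:

```python
import math

def gaussian_kernel(x, y, bandwidth=1.0):
    # RBF kernel exp(-||x - y||^2 / (2 * bandwidth^2)); bandwidth=1 is an assumption
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq_dist / (2.0 * bandwidth ** 2))

def imq_kernel(x, y, c=1.0, beta=-0.5):
    # inverse multi-quadric (c^2 + ||x - y||^2 / 2) ** beta; c and beta are assumptions
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return (c ** 2 + sq_dist / 2.0) ** beta

zeros, ones = [0.0] * 5, [1.0] * 5
# gaussian_kernel(zeros, ones) = exp(-2.5), matching 0.082085006 to 6 decimals
# imq_kernel(zeros, ones) = 3.5 ** -0.5, matching 0.5345225 to 6 decimals
```

With these parameter choices the gradients also work out: grad_x of the Gaussian at (0, 1) is (y - x) * k = 0.082085 per component, as asserted above.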
| 40.18 | 120 | 0.621702 | 289 | 2,009 | 4.103806 | 0.176471 | 0.148398 | 0.166948 | 0.161889 | 0.744519 | 0.744519 | 0.744519 | 0.744519 | 0.744519 | 0.736931 | 0 | 0.051049 | 0.122449 | 2,009 | 49 | 121 | 41 | 0.621668 | 0.050772 | 0 | 0.428571 | 0 | 0 | 0.004813 | 0 | 0 | 0 | 0 | 0 | 0.428571 | 1 | 0.214286 | false | 0 | 0.142857 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
e70f134b953da5997faf1ed7d728c9075acfea8d | 6,279 | py | Python | codes/utils/CNN3Ds.py | FesianXu/LipNet_ChineseWordsClassification | e75a3093d7999f4efd6fec0aebf0111dd0d7d1a6 | [
"Apache-2.0"
] | 39 | 2019-11-17T11:31:26.000Z | 2022-01-11T12:53:51.000Z | codes/utils/CNN3Ds.py | FesianXu/LipNet_ChineseWordsClassification | e75a3093d7999f4efd6fec0aebf0111dd0d7d1a6 | [
"Apache-2.0"
] | null | null | null | codes/utils/CNN3Ds.py | FesianXu/LipNet_ChineseWordsClassification | e75a3093d7999f4efd6fec0aebf0111dd0d7d1a6 | [
"Apache-2.0"
] | 6 | 2019-12-09T14:07:30.000Z | 2021-08-01T02:12:26.000Z |
import torch
import torch.nn as nn
import torch.nn.functional as F
from collections import OrderedDict
class Naive3DCNN(nn.Module):
def __init__(self, cnnDropout=0.5):
super().__init__()
self.features = nn.Sequential(OrderedDict([
('conv', nn.Conv3d(3, 32, kernel_size=(3, 5,5), stride=(1, 2,2), padding=(1,2,2))),
('norm', nn.BatchNorm3d(32)),
('relu', nn.ReLU(inplace=True)),
('pool', nn.AvgPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))),
('dropout', nn.Dropout(p=cnnDropout))
]))
self.features2 = nn.Sequential(OrderedDict([
('conv', nn.Conv3d(32, 64, kernel_size=(3, 5,5), stride=(1, 1, 1), padding=(1,2,2))),
('norm', nn.BatchNorm3d(64)),
('relu', nn.ReLU(inplace=True)),
('pool', nn.AvgPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))),
('dropout', nn.Dropout(p=cnnDropout))
]))
self.features3 = nn.Sequential(OrderedDict([
('conv', nn.Conv3d(64, 96, kernel_size=(3, 3, 3), stride=(1, 1, 1), padding=(1,1,1))),
('norm', nn.BatchNorm3d(96)),
('relu', nn.ReLU(inplace=True)),
('pool', nn.AvgPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))),
('dropout', nn.Dropout(p=cnnDropout))
]))
def forward(self, inputv):
cnn = self.features(inputv)
cnn = self.features2(cnn)
cnn = self.features3(cnn)
cnn = cnn.permute(0, 2, 1, 3, 4).contiguous()
batch, seq, channel, height, width = cnn.size()
cnn = cnn.view(batch, seq, -1)
return cnn
@staticmethod
def get_outshape():
return 96*3*7
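The hard-coded `96*3*7` follows from the standard conv/pool output-size formula. A stand-alone sketch, assuming the 60x120 input crop used in the `__main__` smoke test at the bottom of this file:

```python
def out_len(n, k, s, p):
    # standard output length for a conv / pooling layer along one dimension
    return (n + 2 * p - k) // s + 1

h, w = 60, 120  # input crop size, as in torch.rand(size=(8, 3, 24, 60, 120))
# features: Conv3d k=5 s=2 p=2, then AvgPool3d k=2 s=2 (spatial dims only)
h, w = out_len(h, 5, 2, 2), out_len(w, 5, 2, 2)   # -> 30, 60
h, w = out_len(h, 2, 2, 0), out_len(w, 2, 2, 0)   # -> 15, 30
# features2: Conv3d k=5 s=1 p=2 is size-preserving, then pool k=2 s=2
h, w = out_len(h, 2, 2, 0), out_len(w, 2, 2, 0)   # -> 7, 15
# features3: Conv3d k=3 s=1 p=1 is size-preserving, then pool k=2 s=2
h, w = out_len(h, 2, 2, 0), out_len(w, 2, 2, 0)   # -> 3, 7
print(96 * h * w)  # 2016 == 96 * 3 * 7
```

The same arithmetic applies to ShringkedNaiveCNN below, except its features2 pool of (2, 2, 2) also halves the temporal dimension (24 frames -> 12).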
class ST_splitted_CNN(nn.Module):
def __init__(self, cnnDropout=0.5):
super().__init__()
self.features = nn.Sequential(OrderedDict([
('s_conv', nn.Conv3d(3, 32, kernel_size=(3, 3,3), stride=(1, 1,1), padding=(1,1,1))),
('norm', nn.BatchNorm3d(32)),
('relu', nn.ReLU(inplace=True)),
('pool', nn.AvgPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))),
('dropout', nn.Dropout(p=cnnDropout))
]))
        self.features2 = nn.Sequential(OrderedDict([
            # keys must be unique: duplicate names in an OrderedDict overwrite
            # earlier entries, silently dropping layers from the Sequential
            ('s_conv', nn.Conv3d(32, 64, kernel_size=(3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))),
            ('s_norm', nn.BatchNorm3d(64)),
            ('s_relu', nn.ReLU(inplace=True)),
            ('s_dropout', nn.Dropout(p=cnnDropout)),
            ('t_conv', nn.Conv3d(64, 64, kernel_size=(3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))),
            ('t_norm', nn.BatchNorm3d(64)),
            ('t_relu', nn.ReLU(inplace=True)),
            ('pool', nn.AvgPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))),
            ('dropout', nn.Dropout(p=cnnDropout))
        ]))
self.features3 = nn.Sequential(OrderedDict([
('conv', nn.Conv3d(64, 96, kernel_size=(3, 3, 3), stride=(1, 1, 1), padding=(1,1,1))),
('norm', nn.BatchNorm3d(96)),
('relu', nn.ReLU(inplace=True)),
('pool', nn.AvgPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))),
('dropout', nn.Dropout(p=cnnDropout))
]))
self.features4 = nn.Sequential(OrderedDict([
('conv', nn.Conv3d(96, 128, kernel_size=(3, 3, 3), stride=(1, 1, 1), padding=(1,1,1))),
('norm', nn.BatchNorm3d(128)),
('relu', nn.ReLU(inplace=True)),
('pool', nn.AvgPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))),
('dropout', nn.Dropout(p=cnnDropout))
]))
def forward(self, inputv):
cnn = self.features(inputv)
cnn = self.features2(cnn)
cnn = self.features3(cnn)
cnn = self.features4(cnn)
cnn = cnn.permute(0, 2, 1, 3, 4).contiguous()
batch, seq, channel, height, width = cnn.size()
cnn = cnn.view(batch, seq, -1)
return cnn
@staticmethod
def get_outshape():
return 2688
class ShringkedNaiveCNN(nn.Module):
def __init__(self, cnnDropout=0.5):
super().__init__()
self.features = nn.Sequential(OrderedDict([
('conv', nn.Conv3d(3, 32, kernel_size=(3, 5,5), stride=(1, 2,2), padding=(1,2,2))),
('norm', nn.BatchNorm3d(32)),
('relu', nn.ReLU(inplace=True)),
('pool', nn.AvgPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))),
('dropout', nn.Dropout(p=cnnDropout))
]))
self.features2 = nn.Sequential(OrderedDict([
('conv', nn.Conv3d(32, 64, kernel_size=(3, 5,5), stride=(1, 1, 1), padding=(1,2,2))),
('norm', nn.BatchNorm3d(64)),
('relu', nn.ReLU(inplace=True)),
('pool', nn.AvgPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 2))),
('dropout', nn.Dropout(p=cnnDropout))
]))
self.features3 = nn.Sequential(OrderedDict([
('conv', nn.Conv3d(64, 96, kernel_size=(3, 3, 3), stride=(1, 1, 1), padding=(1,1,1))),
('norm', nn.BatchNorm3d(96)),
('relu', nn.ReLU(inplace=True)),
('pool', nn.AvgPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))),
('dropout', nn.Dropout(p=cnnDropout))
]))
def forward(self, inputv):
cnn = self.features(inputv)
cnn = self.features2(cnn)
cnn = self.features3(cnn)
cnn = cnn.permute(0, 2, 1, 3, 4).contiguous()
batch, seq, channel, height, width = cnn.size()
cnn = cnn.view(batch, seq, -1)
return cnn
@staticmethod
def get_outshape():
return 96*3*7
def naive_3dcnn(**kwargs):
model = Naive3DCNN(**kwargs)
return {
'model': model,
'output_size': Naive3DCNN.get_outshape()
}
def st_splitted_cnn(**kwargs):
model = ST_splitted_CNN(**kwargs)
return {
'model': model,
'output_size': ST_splitted_CNN.get_outshape()
}
def shrinked_naive_cnn(**kwargs):
Model = ShringkedNaiveCNN
model = Model(**kwargs)
return {
'model': model,
'output_size': Model.get_outshape()
}
if __name__ == '__main__':
model = ShringkedNaiveCNN(cnnDropout=0.3)
inputv = torch.rand(size=(8, 3, 24, 60, 120))
out = model(inputv)
print(out.shape) | 36.935294 | 99 | 0.533365 | 802 | 6,279 | 4.081047 | 0.107232 | 0.019554 | 0.021998 | 0.030247 | 0.849985 | 0.841735 | 0.801711 | 0.8011 | 0.79224 | 0.79224 | 0 | 0.070643 | 0.276318 | 6,279 | 170 | 100 | 36.935294 | 0.649648 | 0 | 0 | 0.758621 | 0 | 0 | 0.04953 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.082759 | false | 0 | 0.027586 | 0.02069 | 0.193103 | 0.006897 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e75a0b948acc45276a069618f25a51b9ba4ee8e7 | 20,015 | py | Python | retrieve.py | vcc-LG/brachy-spreadsheet-audit | 39ea2d45a6592d4623f2d0a4ad8a9bc47244154e | [
"MIT"
] | 1 | 2018-05-03T09:34:54.000Z | 2018-05-03T09:34:54.000Z | retrieve.py | vcc-LG/brachy-spreadsheet-audit | 39ea2d45a6592d4623f2d0a4ad8a9bc47244154e | [
"MIT"
] | null | null | null | retrieve.py | vcc-LG/brachy-spreadsheet-audit | 39ea2d45a6592d4623f2d0a4ad8a9bc47244154e | [
"MIT"
] | null | null | null | """
A tool to gather data from the Brachytherapy Vienna spreadsheet, insert it into a database, and run some
basic queries and visualisations. For research purposes only.
"""
from pymongo import MongoClient
from bokeh.plotting import figure, output_file, show
from bokeh.models import Span, Label, DatetimeTickFormatter, FixedTicker
import numpy as np
import re
import selenium.webdriver
from datetime import datetime
import matplotlib.pyplot as plt
def get_quantity(quantity_name):
"""fetches a given quantity from the collection"""
data_list = []
db_string = 'insertions.'+quantity_name
for patient in db.patients.find({}, {db_string: 1, '_id': 0}):
for insertion in patient['insertions']:
try:
if insertion[quantity_name]:
data_list.append(insertion[quantity_name])
except KeyError:
pass
data_list_clean = []
for el in data_list:
if isinstance(el, str):
try:
                data_list_clean.append(float(re.findall(r"\d+\.\d+", el)[0]))
except IndexError:
pass
else:
data_list_clean.append(el)
return data_list_clean
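The string-cleaning loop in `get_quantity` pulls the first decimal number out of free-text spreadsheet cells; the same step in isolation:

```python
import re

def clean_values(raw):
    """Keep numeric entries; extract the first decimal number from strings.

    Entries with no parseable number (e.g. 'n/a') are dropped, mirroring
    the IndexError handling in get_quantity.
    """
    cleaned = []
    for el in raw:
        if isinstance(el, str):
            match = re.findall(r"\d+\.\d+", el)
            if match:
                cleaned.append(float(match[0]))
        else:
            cleaned.append(el)
    return cleaned

print(clean_values([7.1, "7.4 Gy", "n/a", "dose: 6.25 Gy (plan B)"]))
# [7.1, 7.4, 6.25]
```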
def run_query(field1, field2):
    """fetches a pair of insertion quantities from the collection"""
    db_string1 = 'insertions.' + field1
    db_string2 = 'insertions.' + field2
    results = []
    for patient in db.patients.find({}, {db_string1: 1, db_string2: 1, '_id': 0}):
        for insertion in patient['insertions']:
            try:
                results.append((insertion[field1], insertion[field2]))
            except KeyError:
                pass
    return results
#connect to mongodb
client = MongoClient()
db = client.patient_database
patients_data = db.patients
quantity = 'mean_point_a'
data_out = get_quantity(quantity)
output_file("output.html")
TOOLS = 'box_zoom,box_select,resize,reset'
p = figure(plot_width=1200, plot_height=800,
title='IGBT audit: '+quantity,
x_axis_label='patient #',
y_axis_label='Mean Point A dose (Gy)',
title_text_font_size='40pt',
tools=TOOLS)
p.xaxis.axis_label_text_font_size = "32pt"
p.yaxis.axis_label_text_font_size = "32pt"
p.xaxis.major_label_text_font_size = "24pt"
p.yaxis.major_label_text_font_size = "24pt"
# add a scatter (circle) renderer for the per-patient values
p.circle(range(len(data_out)), data_out, fill_color="red", line_color="red", size=6)
hline = Span(location=np.mean(data_out), dimension='width', line_color='green', line_width=1)
p.renderers.extend([hline])
hline = Span(location=np.mean(data_out)+np.std(data_out), dimension='width', line_color='blue', line_width=1,line_dash='dashed')
p.renderers.extend([hline])
hline = Span(location=np.mean(data_out)-np.std(data_out), dimension='width', line_color='blue', line_width=1,line_dash='dashed')
p.renderers.extend([hline])
mean_str = 'Mean = '+str(round(np.mean(data_out),2))
citation = Label(x=260, y=np.mean(data_out),
text=mean_str, render_mode='css',text_color = 'green')
p.add_layout(citation)
# show the results
show(p)
driver = selenium.webdriver.PhantomJS(executable_path=r'C:\Users\le165208\Apps\PhantomJS\phantomjs-2.1.1-windows\bin\phantomjs')
driver.get('file:///C:/Users/le165208/githubprojects/brachy_dose_audit/output.html')
save_str = 'screens/'+quantity+'.png'
driver.save_screenshot(save_str)
print(np.mean(data_out))
print(np.std(data_out))
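The scatter plots in this script all overlay the mean and mean +/- 1 SD as horizontal guide lines. The limits reduce to the following (note that `np.std` defaults to the population SD, ddof=0, which `statistics.pstdev` matches):

```python
import statistics

def control_limits(values):
    """Mean and mean +/- one population SD (ddof=0, matching np.std's default)."""
    mu = statistics.fmean(values)
    sd = statistics.pstdev(values)
    return mu - sd, mu, mu + sd

lo, mid, hi = control_limits([78.0, 80.0, 82.0, 84.0, 86.0])
print(mid)  # 82.0
```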
date_list = []
d90_list = []
A = db.patients.find({'insertions.hr_ctv_d90_gy':{'$exists': True}},{'insertions.hr_ctv_d90_gy':1, 'insertions.insertion_date':1})
for patient in A:
for insertion in patient['insertions']:
try:
d90_list.append(float(insertion['hr_ctv_d90_gy']))
date_list.append(datetime.strptime(insertion['insertion_date'], '%Y-%m-%d'))
        except (KeyError, TypeError, ValueError):
pass
output_file("output.html")
TOOLS = 'box_zoom,box_select,resize,reset'
p = figure(plot_width=1200, plot_height=800,
title='IGBT audit: HR-CTV D90',
x_axis_label='Date',
y_axis_label='HR-CTV D90 (Gy)',
x_axis_type="datetime",
title_text_font_size='40pt',
tools=TOOLS)
p.xaxis.formatter = DatetimeTickFormatter(
formats=dict(
months=["%m/%Y"],
years=["%m/%Y"],
)
)
p.xaxis.axis_label_text_font_size = "32pt"
p.yaxis.axis_label_text_font_size = "32pt"
p.xaxis.major_label_text_font_size = "20pt"
p.yaxis.major_label_text_font_size = "24pt"
p.circle(date_list, d90_list, fill_color="red", line_color="red", size=6)
hline = Span(location=np.mean(d90_list), dimension='width', line_color='green', line_width=1)
p.renderers.extend([hline])
hline = Span(location=np.mean(d90_list) + np.std(d90_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.renderers.extend([hline])
hline = Span(location=np.mean(d90_list) - np.std(d90_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.renderers.extend([hline])
mean_str = 'Mean = ' + str(round(np.mean(d90_list), 2))
citation = Label(x=260, y=np.mean(d90_list),
text=mean_str, render_mode='css', text_color='green')
p.add_layout(citation)
show(p)
driver = selenium.webdriver.PhantomJS(
executable_path=r'C:\Users\le165208\Apps\PhantomJS\phantomjs-2.1.1-windows\bin\phantomjs')
driver.get('file:///C:/Users/le165208/githubprojects/brachy_dose_audit/output.html')
save_str = 'screens/' + 'hr_ctv_d90' + '.png'
driver.save_screenshot(save_str)
volume_list = []
d90_list = []
A = db.patients.find({'insertions.hr_ctv_d90_gy':{'$exists': True}},{'insertions.hr_ctv_d90_gy':1, 'insertions.hr_ctv_volume_cm3':1})
for patient in A:
for insertion in patient['insertions']:
try:
d90_list.append(float(insertion['hr_ctv_d90_gy']))
volume_list.append(float(insertion['hr_ctv_volume_cm3']))
        except (KeyError, TypeError, ValueError):
pass
output_file("output.html")
TOOLS = 'box_zoom,box_select,resize,reset'
p = figure(plot_width=1200, plot_height=800,
title='IGBT audit: HR-CTV D90',
x_axis_label='HR-CTV volume (cm3)',
y_axis_label='HR-CTV D90 (Gy)',
# x_axis_type="datetime",
title_text_font_size='40pt',
tools=TOOLS)
p.xaxis.axis_label_text_font_size = "32pt"
p.yaxis.axis_label_text_font_size = "32pt"
p.xaxis.major_label_text_font_size = "24pt"
p.yaxis.major_label_text_font_size = "24pt"
p.circle(volume_list, d90_list, fill_color="red", line_color="red", size=6)
hline = Span(location=np.mean(d90_list), dimension='width', line_color='green', line_width=1)
p.renderers.extend([hline])
hline = Span(location=np.mean(d90_list) + np.std(d90_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.renderers.extend([hline])
hline = Span(location=np.mean(d90_list) - np.std(d90_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.renderers.extend([hline])
mean_str = 'Mean = ' + str(round(np.mean(d90_list), 2))
citation = Label(x=260, y=np.mean(d90_list),
text=mean_str, render_mode='css', text_color='green')
p.add_layout(citation)
show(p)
driver = selenium.webdriver.PhantomJS(
executable_path=r'C:\Users\le165208\Apps\PhantomJS\phantomjs-2.1.1-windows\bin\phantomjs')
driver.get('file:///C:/Users/le165208/githubprojects/brachy_dose_audit/output.html')
save_str = 'screens/' + 'hr_ctv_d90_volume2' + '.png'
driver.save_screenshot(save_str)
client = MongoClient()
db = client.patient_database
volume_list = []
bladder_2cc_list = []
A = db.patients.find({'insertions.bladder_d2cc_gy':{'$exists': True}},{'insertions.bladder_d2cc_gy':1, 'insertions.bladder_volume_cm3':1})
for patient in A:
for insertion in patient['insertions']:
try:
bladder_2cc_list.append(float(insertion['bladder_d2cc_gy']))
volume_list.append(float(insertion['bladder_volume_cm3']))
        except (KeyError, TypeError, ValueError):
pass
output_file("output.html")
TOOLS = 'box_zoom,box_select,resize,reset'
p = figure(plot_width=1200, plot_height=800,
title='IGBT audit: Bladder D2cc',
x_axis_label='Bladder volume (cm3)',
y_axis_label='Bladder D2cc (Gy)',
# x_axis_type="datetime",
title_text_font_size='40pt',
tools=TOOLS)
p.xaxis.axis_label_text_font_size = "32pt"
p.yaxis.axis_label_text_font_size = "32pt"
p.xaxis.major_label_text_font_size = "24pt"
p.yaxis.major_label_text_font_size = "24pt"
p.circle(volume_list, bladder_2cc_list, fill_color="red", line_color="red", size=6)
hline = Span(location=np.mean(bladder_2cc_list), dimension='width', line_color='green', line_width=1)
p.renderers.extend([hline])
hline = Span(location=np.mean(bladder_2cc_list) + np.std(bladder_2cc_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.renderers.extend([hline])
hline = Span(location=np.mean(bladder_2cc_list) - np.std(bladder_2cc_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.renderers.extend([hline])
mean_str = 'Mean = ' + str(round(np.mean(bladder_2cc_list), 2))
citation = Label(x=260, y=np.mean(bladder_2cc_list),
text=mean_str, render_mode='css', text_color='green')
p.add_layout(citation)
show(p)
driver = selenium.webdriver.PhantomJS(
executable_path=r'C:\Users\le165208\Apps\PhantomJS\phantomjs-2.1.1-windows\bin\phantomjs')
driver.get('file:///C:/Users/le165208/githubprojects/brachy_dose_audit/output.html')
save_str = 'screens/' + 'bladder_volume_2cc' + '.png'
driver.save_screenshot(save_str)
client = MongoClient()
db = client.patient_database
bowel_d2cc_list = []
rectum_d2cc_list = []
A = db.patients.find({'insertions.rectum_d2cc_gy':{'$exists': True}},{'insertions.rectum_d2cc_gy':1, 'insertions.bowel_d2cc_gy':1})
for patient in A:
for insertion in patient['insertions']:
try:
rectum_d2cc_list.append(float(insertion['rectum_d2cc_gy']))
bowel_d2cc_list.append(float(insertion['bowel_d2cc_gy']))
        except (KeyError, TypeError, ValueError):
pass
output_file("output.html")
TOOLS = 'box_zoom,box_select,resize,reset'
p = figure(plot_width=1200, plot_height=800,
title='IGBT audit: bowel D2cc',
x_axis_label='bowel D2cc (Gy)',
y_axis_label='rectum D2cc (Gy)',
# x_axis_type="datetime",
title_text_font_size='40pt',
tools=TOOLS)
p.xaxis.axis_label_text_font_size = "32pt"
p.yaxis.axis_label_text_font_size = "32pt"
p.xaxis.major_label_text_font_size = "24pt"
p.yaxis.major_label_text_font_size = "24pt"
p.circle(bowel_d2cc_list, rectum_d2cc_list, fill_color="red", line_color="red", size=6)
hline = Span(location=np.mean(rectum_d2cc_list), dimension='width', line_color='green', line_width=1)
p.renderers.extend([hline])
hline = Span(location=np.mean(rectum_d2cc_list) + np.std(rectum_d2cc_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.renderers.extend([hline])
hline = Span(location=np.mean(rectum_d2cc_list) - np.std(rectum_d2cc_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.renderers.extend([hline])
mean_str = 'Mean = ' + str(round(np.mean(rectum_d2cc_list), 2))
citation = Label(x=260, y=np.mean(rectum_d2cc_list),
text=mean_str, render_mode='css', text_color='green')
p.add_layout(citation)
show(p)
driver = selenium.webdriver.PhantomJS(
executable_path=r'C:\Users\le165208\Apps\PhantomJS\phantomjs-2.1.1-windows\bin\phantomjs')
driver.get('file:///C:/Users/le165208/githubprojects/brachy_dose_audit/output.html')
save_str = 'screens/' + 'bowelvshrctv' + '.png'
driver.save_screenshot(save_str)
client = MongoClient()
db = client.patient_database
insertion_num_list = []
hr_ctv_d90_list = []
A = db.patients.find({'insertions.hr_ctv_d90_gy':{'$exists': True}},{'insertions.hr_ctv_d90_gy':1, 'insertions.insertion_number':1})
for patient in A:
for insertion in patient['insertions']:
try:
hr_ctv_d90_list.append(float(insertion['hr_ctv_d90_gy']))
insertion_num_list.append(int(insertion['insertion_number']))
        except (KeyError, TypeError, ValueError):
pass
output_file("output.html")
TOOLS = 'box_zoom,box_select,resize,reset'
p = figure(plot_width=1200, plot_height=800,
title='IGBT audit: HR-CTV D90',
x_axis_label='Insertion number',
y_axis_label='HR-CTV D90 (Gy)',
# x_axis_type="datetime",
title_text_font_size='40pt',
tools=TOOLS)
p.xaxis.axis_label_text_font_size = "32pt"
p.yaxis.axis_label_text_font_size = "32pt"
p.xaxis.major_label_text_font_size = "24pt"
p.yaxis.major_label_text_font_size = "24pt"
p.circle(insertion_num_list, hr_ctv_d90_list, fill_color="red", line_color="red", size=6)
hline = Span(location=np.mean(hr_ctv_d90_list), dimension='width', line_color='green', line_width=1)
p.renderers.extend([hline])
hline = Span(location=np.mean(hr_ctv_d90_list) + np.std(hr_ctv_d90_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.renderers.extend([hline])
hline = Span(location=np.mean(hr_ctv_d90_list) - np.std(hr_ctv_d90_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.xaxis[0].ticker=FixedTicker(ticks=[1, 2, 3])
p.renderers.extend([hline])
mean_str = 'Mean = ' + str(round(np.mean(hr_ctv_d90_list), 2))
citation = Label(x=260, y=np.mean(hr_ctv_d90_list),
text=mean_str, render_mode='css', text_color='green')
p.add_layout(citation)
show(p)
driver = selenium.webdriver.PhantomJS(
executable_path=r'C:\Users\le165208\Apps\PhantomJS\phantomjs-2.1.1-windows\bin\phantomjs')
driver.get('file:///C:/Users/le165208/githubprojects/brachy_dose_audit/output.html')
save_str = 'screens/' + 'hr_ctv_insertion' + '.png'
driver.save_screenshot(save_str)
data_to_plot = []
ins1 = [hr_ctv_d90_list[j] for j in [i for i in range(len(insertion_num_list)) if insertion_num_list[i]==1]]
ins2 = [hr_ctv_d90_list[j] for j in [i for i in range(len(insertion_num_list)) if insertion_num_list[i]==2]]
ins3 = [hr_ctv_d90_list[j] for j in [i for i in range(len(insertion_num_list)) if insertion_num_list[i]==3]]
data_to_plot = [ins1,ins2,ins3]
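The three comprehensions above group D90 values by insertion number; a single-pass equivalent with `defaultdict` (a readability alternative, not what the script uses):

```python
from collections import defaultdict

def group_by_insertion(insertion_numbers, values):
    """Group paired values by their insertion number in one pass."""
    groups = defaultdict(list)
    for num, val in zip(insertion_numbers, values):
        groups[num].append(val)
    # return groups ordered by insertion number, e.g. [ins1, ins2, ins3]
    return [groups[n] for n in sorted(groups)]

print(group_by_insertion([1, 2, 1, 3, 2], [85.0, 88.5, 84.0, 90.1, 87.2]))
# [[85.0, 84.0], [88.5, 87.2], [90.1]]
```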
fig, ax1 = plt.subplots(figsize=(15, 10))
ax1.set_title('HR-CTV D90 (Gy) vs. insertion number',
fontsize = 30)
fig.canvas.set_window_title('HR-CTV D90 (Gy)')
bp = plt.boxplot(data_to_plot, widths = 0.2,notch=0, sym='+', vert=1, whis=1.5,patch_artist=True)
xtickNames = plt.setp(ax1,xticklabels= ['Insertion 1', 'Insertion 2', 'Insertion 3'])
ax1.set_ylabel("HR-CTV D90 (Gy)", fontsize=26)
for tick in ax1.yaxis.get_major_ticks():
tick.label.set_fontsize(24)
plt.setp(xtickNames, fontsize=26)
plt.setp(bp['boxes'], # customise box appearance
color='grey', # outline colour
linewidth=1.5, # outline line width
facecolor='SkyBlue') # fill box with colour
plt.setp(bp['whiskers'], # customise whisker appearance
color='DarkMagenta', # whisker colour
linewidth=1.5) # whisker thickness
plt.setp(bp['caps'], # customize lines at the end of whiskers
color='DarkMagenta', # cap colour
linewidth=1.5) # cap thickness
plt.setp(bp['fliers'], # customize marks for extreme values
color='Tomato', # set mark colour
         marker='o', # marker shape
markersize=10) # marker size
plt.setp(bp['medians'], # customize median lines
color='Tomato', # line colour
linewidth=1.5) # line thickness
plt.show()
client = MongoClient()
db = client.patient_database
insertion_num_list = []
hr_ctv_volume_list = []
A = db.patients.find({'insertions.hr_ctv_volume_cm3':{'$exists': True}},{'insertions.hr_ctv_volume_cm3':1, 'insertions.insertion_number':1})
for patient in A:
for insertion in patient['insertions']:
try:
hr_ctv_volume_list.append(float(insertion['hr_ctv_volume_cm3']))
insertion_num_list.append(int(insertion['insertion_number']))
        except (KeyError, TypeError, ValueError):
pass
data_to_plot = []
ins1 = [hr_ctv_volume_list[j] for j in [i for i in range(len(insertion_num_list)) if insertion_num_list[i]==1]]
ins2 = [hr_ctv_volume_list[j] for j in [i for i in range(len(insertion_num_list)) if insertion_num_list[i]==2]]
ins3 = [hr_ctv_volume_list[j] for j in [i for i in range(len(insertion_num_list)) if insertion_num_list[i]==3]]
data_to_plot = [ins1,ins2,ins3]
fig, ax1 = plt.subplots(figsize=(15, 10))
ax1.set_title('HR-CTV volume (cm3) vs. insertion number',
fontsize = 30)
fig.canvas.set_window_title('HR-CTV volume (cm3))')
bp = plt.boxplot(data_to_plot, widths = 0.2,notch=0, sym='+', vert=1, whis=1.5,patch_artist=True)
xtickNames = plt.setp(ax1,xticklabels= ['Insertion 1', 'Insertion 2', 'Insertion 3'])
ax1.set_ylabel("HR-CTV volume (cm3)", fontsize=26)
for tick in ax1.yaxis.get_major_ticks():
tick.label.set_fontsize(24)
plt.setp(xtickNames, fontsize=26)
plt.setp(bp['boxes'], # customise box appearance
color='grey', # outline colour
linewidth=1.5, # outline line width
facecolor='SkyBlue') # fill box with colour
plt.setp(bp['whiskers'], # customise whisker appearance
color='DarkMagenta', # whisker colour
linewidth=1.5) # whisker thickness
plt.setp(bp['caps'], # customize lines at the end of whiskers
color='DarkMagenta', # cap colour
linewidth=1.5) # cap thickness
plt.setp(bp['fliers'], # customize marks for extreme values
color='Tomato', # set mark colour
         marker='o', # marker shape
markersize=10) # marker size
plt.setp(bp['medians'], # customize median lines
color='Tomato', # line colour
linewidth=1.5) # line thickness
plt.show()
output_file("output.html")
TOOLS = 'box_zoom,box_select,resize,reset'
p = figure(plot_width=1200, plot_height=800,
title='IGBT audit: HR-CTV D90',
x_axis_label='Insertion number',
y_axis_label='HR-CTV D90 (Gy)',
# x_axis_type="datetime",
title_text_font_size='40pt',
tools=TOOLS)
p.xaxis.axis_label_text_font_size = "32pt"
p.yaxis.axis_label_text_font_size = "32pt"
p.xaxis.major_label_text_font_size = "24pt"
p.yaxis.major_label_text_font_size = "24pt"
p.circle(insertion_num_list, hr_ctv_volume_list, fill_color="red", line_color="red", size=6)
hline = Span(location=np.mean(hr_ctv_volume_list), dimension='width', line_color='green', line_width=1)
p.renderers.extend([hline])
hline = Span(location=np.mean(hr_ctv_volume_list) + np.std(hr_ctv_volume_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.renderers.extend([hline])
hline = Span(location=np.mean(hr_ctv_volume_list) - np.std(hr_ctv_volume_list), dimension='width', line_color='blue', line_width=1,
line_dash='dashed')
p.xaxis[0].ticker=FixedTicker(ticks=[1, 2, 3])
p.renderers.extend([hline])
mean_str = 'Mean = ' + str(round(np.mean(hr_ctv_volume_list), 2))
citation = Label(x=260, y=np.mean(hr_ctv_volume_list),
text=mean_str, render_mode='css', text_color='green')
p.add_layout(citation)
show(p)
| 39.091797 | 140 | 0.678941 | 2,919 | 20,015 | 4.413155 | 0.093868 | 0.023288 | 0.022978 | 0.036951 | 0.901723 | 0.875252 | 0.847384 | 0.841562 | 0.835429 | 0.821456 | 0 | 0.033309 | 0.179515 | 20,015 | 511 | 141 | 39.168297 | 0.751127 | 0.057257 | 0 | 0.72549 | 0 | 0.014706 | 0.181287 | 0.081127 | 0.012255 | 0 | 0 | 0 | 0 | 1 | 0.004902 | false | 0.02451 | 0.02451 | 0 | 0.034314 | 0.004902 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
99d99c4102b00b6d922211b778c6b0d778eed24c | 26,585 | py | Python | tschartslib/statechange/statechange.py | DaleProctor/tscharts | 5447395e0aef0b949bef8426febdec2093cf37ef | [
"Apache-2.0"
] | 16 | 2016-08-17T21:39:10.000Z | 2021-11-24T12:14:28.000Z | tschartslib/statechange/statechange.py | DaleProctor/tscharts | 5447395e0aef0b949bef8426febdec2093cf37ef | [
"Apache-2.0"
] | 55 | 2017-04-23T18:12:04.000Z | 2021-08-08T08:25:18.000Z | tschartslib/statechange/statechange.py | DaleProctor/tscharts | 5447395e0aef0b949bef8426febdec2093cf37ef | [
"Apache-2.0"
] | 8 | 2017-08-11T02:11:46.000Z | 2021-07-06T22:58:42.000Z | #(C) Copyright Syd Logan 2017-2020
#(C) Copyright Thousand Smiles Foundation 2017-2020
#
#Licensed under the Apache License, Version 2.0 (the "License");
#you may not use this file except in compliance with the License.
#
#You may obtain a copy of the License at
#http://www.apache.org/licenses/LICENSE-2.0
#
#Unless required by applicable law or agreed to in writing, software
#distributed under the License is distributed on an "AS IS" BASIS,
#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#See the License for the specific language governing permissions and
#limitations under the License.
'''
unit tests for statechange application. Assumes django server is up
and running on the specified host and port
'''
import unittest
import getopt, sys
import json
from tschartslib.service.serviceapi import ServiceAPI
from tschartslib.tscharts.tscharts import Login, Logout
from tschartslib.patient.patient import CreatePatient, DeletePatient
from tschartslib.clinic.clinic import CreateClinic, DeleteClinic
from tschartslib.station.station import CreateStation, DeleteStation
from tschartslib.clinicstation.clinicstation import CreateClinicStation, DeleteClinicStation
class CreateStateChange(ServiceAPI):
def __init__(self, host, port, token):
super(CreateStateChange, self).__init__()
self.setHttpMethod("POST")
self.setHost(host)
self.setPort(port)
self.setToken(token)
self._payload = {}
self.setPayload(self._payload)
self.setURL("tscharts/v1/statechange/")
def setClinicStation(self, clinic_station):
self._payload["clinicstation"] = clinic_station
self.setPayload(self._payload)
def setPatient(self, patient):
self._payload["patient"] = patient
self.setPayload(self._payload)
def setState(self, state):
self._payload["state"] = state
self.setPayload(self._payload)
class GetStateChange(ServiceAPI):
def makeURL(self):
hasQArgs = False
        if self._id is not None:
            base = "tscharts/v1/statechange/{}/".format(self._id)
        else:
            base = "tscharts/v1/statechange/"

        if self._patient is not None:
            if not hasQArgs:
                base += "?"
            else:
                base += "&"
            base += "patient={}".format(self._patient)
            hasQArgs = True

        if self._clinic is not None:
            if not hasQArgs:
                base += "?"
            else:
                base += "&"
            base += "clinic={}".format(self._clinic)
            hasQArgs = True

        if self._clinicstation is not None:
            if not hasQArgs:
                base += "?"
            else:
                base += "&"
            base += "clinicstation={}".format(self._clinicstation)
            hasQArgs = True
self.setURL(base)
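The `hasQArgs` bookkeeping in `makeURL` can also be expressed with `urllib.parse.urlencode`; an illustrative stand-alone version (the class keeps its manual construction):

```python
from urllib.parse import urlencode

def make_statechange_url(record_id=None, patient=None, clinic=None, clinicstation=None):
    """Build the statechange URL, appending only the query args that are set."""
    base = "tscharts/v1/statechange/"
    if record_id is not None:
        base += "{}/".format(record_id)
    # keep only the arguments that were supplied, in a fixed order
    params = {k: v for k, v in (("patient", patient),
                                ("clinic", clinic),
                                ("clinicstation", clinicstation))
              if v is not None}
    return base + "?" + urlencode(params) if params else base

print(make_statechange_url(patient=12, clinic=3))
# tscharts/v1/statechange/?patient=12&clinic=3
```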
def __init__(self, host, port, token, id=None):
super(GetStateChange, self).__init__()
self.setHttpMethod("GET")
self.setHost(host)
self.setPort(port)
self.setToken(token)
self.clearArgs()
self.makeURL()
def clearArgs(self):
self._patient = None
self._clinic = None
self._clinicstation = None
self._id = None
def setClinicStation(self, clinic_station):
self._clinicstation = clinic_station
self.makeURL()
def setClinic(self, clinic):
self._clinic = clinic
self.makeURL()
def setPatient(self, patient):
self._patient = patient
self.makeURL()
def setId(self, id):
self._id = id
self.makeURL()
class DeleteStateChange(ServiceAPI):
def __init__(self, host, port, token, id):
super(DeleteStateChange, self).__init__()
self.setHttpMethod("DELETE")
self.setHost(host)
self.setPort(port)
self.setToken(token)
self.setURL("tscharts/v1/statechange/{}/".format(id))
class TestTSStateChange(unittest.TestCase):
def setUp(self):
login = Login(host, port, username, password)
ret = login.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("token" in ret[1])
global token
token = ret[1]["token"]
def testCreateStateChange(self):
x = CreateClinic(host, port, token, "Ensenada", "02/05/2016", "02/06/2016")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
clinicid = int(ret[1]["id"])
x = CreateStation(host, port, token, "ENT")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
stationid = int(ret[1]["id"])
x = CreateClinicStation(host, port, token, clinicid, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
clinicstationid = int(ret[1]["id"])
data = {}
data["paternal_last"] = "abcd1234"
data["maternal_last"] = "yyyyyy"
data["first"] = "zzzzzzz"
data["middle"] = ""
data["suffix"] = "Jr."
data["prefix"] = ""
data["dob"] = "04/01/1962"
data["gender"] = "Female"
data["street1"] = "1234 First Ave"
data["street2"] = ""
data["city"] = "Ensenada"
data["colonia"] = ""
data["state"] = u"Baja California"
data["phone1"] = "1-111-111-1111"
data["phone2"] = ""
data["email"] = "patient@example.com"
data["emergencyfullname"] = "Maria Sanchez"
data["emergencyphone"] = "1-222-222-2222"
data["emergencyemail"] = "maria.sanchez@example.com"
x = CreatePatient(host, port, token, data)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
patientid = int(ret[1]["id"])
x = CreateStateChange(host, port, token)
x.setClinicStation(clinicstationid)
x.setPatient(patientid)
x.setState("in")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
statechangeid = int(ret[1]["id"])
x = GetStateChange(host, port, token)
x.setId(statechangeid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
self.assertTrue(int(ret[1]["id"]) == statechangeid)
        self.assertTrue(int(ret[1]["clinicstation"]) == clinicstationid)
        self.assertTrue(int(ret[1]["patient"]) == patientid)
        self.assertTrue("time" in ret[1])
        self.assertTrue("state" in ret[1])
        self.assertTrue(ret[1]["state"] == "in")
x = DeleteStateChange(host, port, token, statechangeid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteClinicStation(host, port, token, clinicstationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteStation(host, port, token, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteClinic(host, port, token, clinicid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeletePatient(host, port, token, patientid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
# create with invalid clinicstation
def testCreateStateChangeBadClinicStation(self):
data = {}
data["paternal_last"] = "abcd1234"
data["maternal_last"] = "yyyyyy"
data["first"] = "zzzzzzz"
data["middle"] = ""
data["suffix"] = "Jr."
data["prefix"] = ""
data["dob"] = "04/01/1962"
data["gender"] = "Female"
data["street1"] = "1234 First Ave"
data["street2"] = ""
data["city"] = "Ensenada"
data["colonia"] = ""
data["state"] = u"Baja California"
data["phone1"] = "1-111-111-1111"
data["phone2"] = ""
data["email"] = "patient@example.com"
data["emergencyfullname"] = "Maria Sanchez"
data["emergencyphone"] = "1-222-222-2222"
data["emergencyemail"] = "maria.sanchez@example.com"
x = CreatePatient(host, port, token, data)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
patientid = int(ret[1]["id"])
x = CreateStateChange(host, port, token)
x.setClinicStation(9999)
x.setPatient(patientid)
x.setState("in")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 404)
x = DeletePatient(host, port, token, patientid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
# create with invalid patient
def testCreateStateChangeBadPatient(self):
x = CreateClinic(host, port, token, "Ensenada", "02/05/2016", "02/06/2016")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
clinicid = int(ret[1]["id"])
x = CreateStation(host, port, token, "ENT")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
stationid = int(ret[1]["id"])
x = CreateClinicStation(host, port, token, clinicid, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
clinicstationid = int(ret[1]["id"])
x = CreateStateChange(host, port, token)
x.setClinicStation(clinicstationid)
x.setPatient(9999)
x.setState("in")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 404)
x = DeleteClinicStation(host, port, token, clinicstationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteStation(host, port, token, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteClinic(host, port, token, clinicid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
# create with invalid state
def testCreateStateChangeBadState(self):
x = CreateClinic(host, port, token, "Ensenada", "02/05/2016", "02/06/2016")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
clinicid = int(ret[1]["id"])
x = CreateStation(host, port, token, "ENT")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
stationid = int(ret[1]["id"])
x = CreateClinicStation(host, port, token, clinicid, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
clinicstationid = int(ret[1]["id"])
data = {}
data["paternal_last"] = "abcd1234"
data["maternal_last"] = "yyyyyy"
data["first"] = "zzzzzzz"
data["middle"] = ""
data["suffix"] = "Jr."
data["prefix"] = ""
data["dob"] = "04/01/1962"
data["gender"] = "Female"
data["street1"] = "1234 First Ave"
data["street2"] = ""
data["city"] = "Ensenada"
data["colonia"] = ""
data["state"] = u"Baja California"
data["phone1"] = "1-111-111-1111"
data["phone2"] = ""
data["email"] = "patient@example.com"
data["emergencyfullname"] = "Maria Sanchez"
data["emergencyphone"] = "1-222-222-2222"
data["emergencyemail"] = "maria.sanchez@example.com"
x = CreatePatient(host, port, token, data)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
patientid = int(ret[1]["id"])
x = CreateStateChange(host, port, token)
x.setClinicStation(clinicstationid)
x.setPatient(patientid)
x.setState("new york")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 400)
x = DeleteClinicStation(host, port, token, clinicstationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteStation(host, port, token, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteClinic(host, port, token, clinicid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeletePatient(host, port, token, patientid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
# create multiple, verify they all exist and are correct
def testCreateMultipleStateChange(self):
x = CreateClinic(host, port, token, "Ensenada", "02/05/2016", "02/06/2016")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
clinicid = int(ret[1]["id"])
x = CreateStation(host, port, token, "ENT")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
stationid = int(ret[1]["id"])
x = CreateClinicStation(host, port, token, clinicid, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
clinicstationid = int(ret[1]["id"])
data = {}
data["paternal_last"] = "abcd1234"
data["maternal_last"] = "yyyyyy"
data["first"] = "zzzzzzz"
data["middle"] = ""
data["suffix"] = "Jr."
data["prefix"] = ""
data["dob"] = "04/01/1962"
data["gender"] = "Female"
data["street1"] = "1234 First Ave"
data["street2"] = ""
data["city"] = "Ensenada"
data["colonia"] = ""
data["state"] = u"Baja California"
data["phone1"] = "1-111-111-1111"
data["phone2"] = ""
data["email"] = "patient@example.com"
data["emergencyfullname"] = "Maria Sanchez"
data["emergencyphone"] = "1-222-222-2222"
data["emergencyemail"] = "maria.sanchez@example.com"
x = CreatePatient(host, port, token, data)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
patientid = int(ret[1]["id"])
x = CreateStateChange(host, port, token)
x.setClinicStation(clinicstationid)
x.setPatient(patientid)
x.setState("in")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
statechangeid = int(ret[1]["id"])
x = GetStateChange(host, port, token)
x.setId(statechangeid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
self.assertTrue(int(ret[1]["id"]) == statechangeid)
self.assertTrue(int(ret[1]["clinicstation"]) == clinicstationid)
self.assertTrue(int(ret[1]["patient"]) == patientid)
self.assertTrue("time" in ret[1])
self.assertTrue("state" in ret[1])
self.assertTrue(ret[1]["state"] == "in")
x = DeleteStateChange(host, port, token, statechangeid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteClinicStation(host, port, token, clinicstationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteStation(host, port, token, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteClinic(host, port, token, clinicid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeletePatient(host, port, token, patientid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
def testDeleteStateChange(self):
# create statechange, delete, verify it is gone
x = CreateClinic(host, port, token, "Ensenada", "02/05/2016", "02/06/2016")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
clinicid = int(ret[1]["id"])
x = CreateStation(host, port, token, "ENT")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
stationid = int(ret[1]["id"])
x = CreateClinicStation(host, port, token, clinicid, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
clinicstationid = int(ret[1]["id"])
data = {}
data["paternal_last"] = "abcd1234"
data["maternal_last"] = "yyyyyy"
data["first"] = "zzzzzzz"
data["middle"] = ""
data["suffix"] = "Jr."
data["prefix"] = ""
data["dob"] = "04/01/1962"
data["gender"] = "Female"
data["street1"] = "1234 First Ave"
data["street2"] = ""
data["city"] = "Ensenada"
data["colonia"] = ""
data["state"] = u"Baja California"
data["phone1"] = "1-111-111-1111"
data["phone2"] = ""
data["email"] = "patient@example.com"
data["emergencyfullname"] = "Maria Sanchez"
data["emergencyphone"] = "1-222-222-2222"
data["emergencyemail"] = "maria.sanchez@example.com"
x = CreatePatient(host, port, token, data)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
patientid = int(ret[1]["id"])
x = CreateStateChange(host, port, token)
x.setClinicStation(clinicstationid)
x.setPatient(patientid)
x.setState("in")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
statechangeid = int(ret[1]["id"])
x = GetStateChange(host, port, token)
x.setId(statechangeid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
self.assertTrue(int(ret[1]["id"]) == statechangeid)
self.assertTrue(int(ret[1]["clinicstation"]) == clinicstationid)
self.assertTrue(int(ret[1]["patient"]) == patientid)
self.assertTrue("time" in ret[1])
self.assertTrue("state" in ret[1])
self.assertTrue(ret[1]["state"] == "in")
x = DeleteStateChange(host, port, token, statechangeid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
# try deleting an invalid state change
x = DeleteStateChange(host, port, token, statechangeid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 404)
# create a few state change objects, delete them
# and verify there are none in the database
ids = []
x = CreateStateChange(host, port, token)
x.setClinicStation(clinicstationid)
x.setPatient(patientid)
x.setState("out")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
ids.append(int(ret[1]["id"]))
x = CreateStateChange(host, port, token)
x.setClinicStation(clinicstationid)
x.setPatient(patientid)
x.setState("in")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
ids.append(int(ret[1]["id"]))
x = CreateStateChange(host, port, token)
x.setClinicStation(clinicstationid)
x.setPatient(patientid)
x.setState("out")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
ids.append(int(ret[1]["id"]))
for x in ids:
y = GetStateChange(host, port, token)
y.setId(x)
ret = y.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
self.assertTrue(int(ret[1]["id"]) == x)
for x in ids:
y = DeleteStateChange(host, port, token, x)
ret = y.send(timeout=30)
self.assertEqual(ret[0], 200)
for x in ids:
y = GetStateChange(host, port, token)
y.setId(x)
ret = y.send(timeout=30)
self.assertEqual(ret[0], 404)
x = DeleteClinicStation(host, port, token, clinicstationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteStation(host, port, token, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteClinic(host, port, token, clinicid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeletePatient(host, port, token, patientid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
def testGetAllStateChange(self):
x = CreateClinic(host, port, token, "Ensenada", "02/05/2016", "02/06/2016")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
clinicid = int(ret[1]["id"])
x = CreateStation(host, port, token, "ENT")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
stationid = int(ret[1]["id"])
x = CreateClinicStation(host, port, token, clinicid, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
clinicstationid = int(ret[1]["id"])
data = {}
data["paternal_last"] = "abcd1234"
data["maternal_last"] = "yyyyyy"
data["first"] = "zzzzzzz"
data["middle"] = ""
data["suffix"] = "Jr."
data["prefix"] = ""
data["dob"] = "04/01/1962"
data["gender"] = "Female"
data["street1"] = "1234 First Ave"
data["street2"] = ""
data["city"] = "Ensenada"
data["colonia"] = ""
data["state"] = u"Baja California"
data["phone1"] = "1-111-111-1111"
data["phone2"] = ""
data["email"] = "patient@example.com"
data["emergencyfullname"] = "Maria Sanchez"
data["emergencyphone"] = "1-222-222-2222"
data["emergencyemail"] = "maria.sanchez@example.com"
x = CreatePatient(host, port, token, data)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
patientid = int(ret[1]["id"])
x = CreateStateChange(host, port, token)
x.setClinicStation(clinicstationid)
x.setPatient(patientid)
x.setState("in")
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
statechangeid = int(ret[1]["id"])
x = GetStateChange(host, port, token)
x.setId(statechangeid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
self.assertTrue("id" in ret[1])
self.assertTrue(int(ret[1]["id"]) == statechangeid)
self.assertTrue(int(ret[1]["clinicstation"]) == clinicstationid)
self.assertTrue(int(ret[1]["patient"]) == patientid)
self.assertTrue("time" in ret[1])
self.assertTrue("state" in ret[1])
self.assertTrue(ret[1]["state"] == "in")
# following tests assume that there is only one matching statechange
# in the DB. Note these forms of the GET return vectors, not a single
# object
x = GetStateChange(host, port, token)
x.setClinicStation(clinicstationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
ret = ret[1][0]
self.assertTrue("id" in ret)
self.assertTrue(int(ret["id"]) == statechangeid)
self.assertTrue(int(ret["clinicstation"]) == clinicstationid)
self.assertTrue(int(ret["patient"]) == patientid)
self.assertTrue("time" in ret)
self.assertTrue("state" in ret)
self.assertTrue(ret["state"] == "in")
x.clearArgs()
x.setClinicStation(clinicstationid)
x.setPatient(patientid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
ret = ret[1][0]
self.assertTrue("id" in ret)
self.assertTrue(int(ret["id"]) == statechangeid)
self.assertTrue(int(ret["clinicstation"]) == clinicstationid)
self.assertTrue(int(ret["patient"]) == patientid)
self.assertTrue("time" in ret)
self.assertTrue("state" in ret)
self.assertTrue(ret["state"] == "in")
x.clearArgs()
x.setClinic(clinicid)
x.setPatient(patientid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
ret = ret[1][0]
self.assertTrue("id" in ret)
self.assertTrue(int(ret["id"]) == statechangeid)
self.assertTrue(int(ret["clinicstation"]) == clinicstationid)
self.assertTrue(int(ret["patient"]) == patientid)
self.assertTrue("time" in ret)
self.assertTrue("state" in ret)
self.assertTrue(ret["state"] == "in")
x.clearArgs()
x.setClinic(clinicid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
ret = ret[1][0]
self.assertTrue("id" in ret)
self.assertTrue(int(ret["id"]) == statechangeid)
self.assertTrue(int(ret["clinicstation"]) == clinicstationid)
self.assertTrue(int(ret["patient"]) == patientid)
self.assertTrue("time" in ret)
self.assertTrue("state" in ret)
self.assertTrue(ret["state"] == "in")
x.clearArgs()
x.setClinicStation(clinicstationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
ret = ret[1][0]
self.assertTrue("id" in ret)
self.assertTrue(int(ret["id"]) == statechangeid)
self.assertTrue(int(ret["clinicstation"]) == clinicstationid)
self.assertTrue(int(ret["patient"]) == patientid)
self.assertTrue("time" in ret)
self.assertTrue("state" in ret)
self.assertTrue(ret["state"] == "in")
x = DeleteStateChange(host, port, token, statechangeid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteClinicStation(host, port, token, clinicstationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteStation(host, port, token, stationid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeleteClinic(host, port, token, clinicid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
x = DeletePatient(host, port, token, patientid)
ret = x.send(timeout=30)
self.assertEqual(ret[0], 200)
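The seven-line assertion block repeated throughout these tests (and its `ret[1][0]` variant for the vector-returning GET forms) could be consolidated into a small helper. A sketch with a hypothetical name, not part of the original suite:

```python
def check_statechange(obj, statechangeid, clinicstationid, patientid, state="in"):
    """Validate one decoded state-change record returned by the API.

    `obj` is a single JSON object; callers of the vector-returning GET
    forms would pass ``ret[1][0]``.  Raises AssertionError on mismatch.
    """
    assert "id" in obj
    assert int(obj["id"]) == statechangeid
    # ids may come back as strings or ints, so normalize before comparing
    assert int(obj["clinicstation"]) == clinicstationid
    assert int(obj["patient"]) == patientid
    assert "time" in obj
    assert "state" in obj
    assert obj["state"] == state
```

Used from a test, the block `self.assertTrue("id" in ret[1]) ... self.assertTrue(ret[1]["state"] == "in")` collapses to a single call: `check_statechange(ret[1], statechangeid, clinicstationid, patientid)`.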
def usage():
print("statechange [-h host] [-p port] [-u username] [-w password]")
def main():
try:
opts, args = getopt.getopt(sys.argv[1:], "h:p:u:w:")
except getopt.GetoptError as err:
print(str(err))
usage()
sys.exit(2)
global host
host = "127.0.0.1"
global port
port = 8000
global username
username = None
global password
password = None
for o, a in opts:
if o == "-h":
host = a
elif o == "-p":
port = int(a)
elif o == "-u":
username = a
elif o == "-w":
password = a
else:
assert False, "unhandled option"
unittest.main(argv=[sys.argv[0]])
if __name__ == "__main__":
main()
821cb9c6818933f0034f28c8836ece3311a975e1 | 127 | py | Python | kerasutils/utils/__init__.py | tchaye59/kerasutils | 2849a35a246282851f5cdc22625b2afefb81bf65 | [
"MIT"
] | null | null | null | kerasutils/utils/__init__.py | tchaye59/kerasutils | 2849a35a246282851f5cdc22625b2afefb81bf65 | [
"MIT"
] | null | null | null | kerasutils/utils/__init__.py | tchaye59/kerasutils | 2849a35a246282851f5cdc22625b2afefb81bf65 | [
"MIT"
] | null | null | null | from kerasutils.utils.bb_utils import *
from kerasutils.utils.yolo_utils import *
from kerasutils.utils.tf_lite_utils import *
# File: sae/processors.py (repo: nicolay-r/bert, license: Apache-2.0)
import os
import random
import tokenization
import tensorflow as tf
from core.data_processor import DataProcessor
from core.input_example import InputExample
flags = tf.flags
FLAGS = flags.FLAGS
filename_template = "sample-{data_type}-{cv_index}.tsv.gz"
# Shuffle the original data, as in the 'train' case it otherwise
# mostly represents a sequence of zero-labeled examples.
RANDOM_SEED = 1
class SAE_2SM_Processor(DataProcessor):
"""Processor for the SAE data set, two scale classification format.
SAE stands for "Sentiment Attitude Extraction".
2 -- Two scale
S -- Single sentence (text_a only)
M -- Multiple classification
Columns:
test: [id, news_id, text_a, s_obj, t_obj]
train: [id, news_id, label, text_a, s_obj, t_obj]
"""
def get_train_examples(self, data_dir):
"""See base class."""
filename = filename_template.format(data_type='train', cv_index=FLAGS.cv_index)
return self._create_examples(self._read_tsv_gzip(os.path.join(data_dir, filename)), "train")
def get_dev_examples(self, data_dir):
return self.get_train_examples(data_dir)
def get_test_examples(self, data_dir):
"""See base class."""
filename = filename_template.format(data_type='test', cv_index=FLAGS.cv_index)
return self._create_examples(
self._read_tsv_gzip(os.path.join(data_dir, filename)), "test")
def get_labels(self):
"""See base class."""
return ["0", "1"]
def _create_examples(self, lines, set_type):
"""Creates examples for the training and dev sets."""
examples = []
for (i, line) in enumerate(lines):
# Skip the header row
if i == 0:
continue
guid = "%s-%s" % (set_type, i)
if set_type == "test":
# Headers: id, news_id, text_a, s_obj, t_obj
label = "0"
text_a = tokenization.convert_to_unicode(line[2])
s_obj = tokenization.convert_to_unicode(line[3])
t_obj = tokenization.convert_to_unicode(line[4])
else:
# Headers: id, news_id, label, text_a, s_obj, t_obj
label = tokenization.convert_to_unicode(line[2])
text_a = tokenization.convert_to_unicode(line[3])
s_obj = tokenization.convert_to_unicode(line[4])
t_obj = tokenization.convert_to_unicode(line[5])
examples.append(
InputExample(guid=guid,
text_a=text_a,
text_b=None,
s_obj=int(s_obj),
t_obj=int(t_obj),
label=label))
if set_type == "train":
random.Random(RANDOM_SEED).shuffle(examples)
return examples
class SAE_3SM_Processor(SAE_2SM_Processor):
def get_labels(self):
"""See base class."""
return ["0", "1", "2"]
class SAE_PB_Processor(DataProcessor):
"""Processor for the SAE data set, binary classification format.
SAE stands for "Sentiment Attitude Extraction".
P -- Pair of sentences (text_a, text_b)
B -- Binary classification
Columns:
test: [id, news_id, text_a, text_b, s_obj, t_obj]
train: [id, news_id, label, text_a, text_b, s_obj, t_obj]
"""
def get_train_examples(self, data_dir):
"""See base class."""
filename = filename_template.format(data_type='train', cv_index=FLAGS.cv_index)
return self._create_examples(
self._read_tsv_gzip(os.path.join(data_dir, filename)), "train")
def get_dev_examples(self, data_dir):
return self.get_train_examples(data_dir)
def get_test_examples(self, data_dir):
"""See base class."""
filename = filename_template.format(data_type='test', cv_index=FLAGS.cv_index)
return self._create_examples(
self._read_tsv_gzip(os.path.join(data_dir, filename)), "test")
def get_labels(self):
"""See base class."""
return ["0", "1"]
def _create_examples(self, lines, set_type):
"""Creates examples for the training and dev sets."""
examples = []
for (i, line) in enumerate(lines):
# Skip the header row
if i == 0:
continue
guid = "%s-%s" % (set_type, i)
if set_type == "test":
# Headers: id, news_id, text_a, text_b, s_obj, t_obj
label = "0"
text_a = tokenization.convert_to_unicode(line[2])
text_b = tokenization.convert_to_unicode(line[3])
s_obj = tokenization.convert_to_unicode(line[4])
t_obj = tokenization.convert_to_unicode(line[5])
else:
# Headers: id, news_id, label, text_a, text_b, s_obj, t_obj
label = tokenization.convert_to_unicode(line[2])
text_a = tokenization.convert_to_unicode(line[3])
text_b = tokenization.convert_to_unicode(line[4])
s_obj = tokenization.convert_to_unicode(line[5])
t_obj = tokenization.convert_to_unicode(line[6])
examples.append(
InputExample(guid=guid,
text_a=text_a,
text_b=text_b,
s_obj=int(s_obj),
t_obj=int(t_obj),
label=label))
if set_type == "train":
random.Random(RANDOM_SEED).shuffle(examples)
return examples
class SAE_2PM_Processor(DataProcessor):
"""Processor for the SAE data set, two scale classification format.
SAE stands for "Sentiment Attitude Extraction".
2 -- Two scale
P -- Pair of sentences (text_a, text_b)
M -- Multiple classification
Columns:
test: [id, news_id, text_a, text_b, s_obj, t_obj]
train: [id, news_id, label, text_a, text_b, s_obj, t_obj]
"""
def get_train_examples(self, data_dir):
"""See base class."""
filename = filename_template.format(data_type='train', cv_index=FLAGS.cv_index)
return self._create_examples(
self._read_tsv_gzip(os.path.join(data_dir, filename)), "train")
def get_dev_examples(self, data_dir):
return self.get_train_examples(data_dir)
def get_test_examples(self, data_dir):
"""See base class."""
filename = filename_template.format(data_type='test', cv_index=FLAGS.cv_index)
return self._create_examples(
self._read_tsv_gzip(os.path.join(data_dir, filename)), "test")
def get_labels(self):
"""See base class."""
return ["0", "1"]
def _create_examples(self, lines, set_type):
"""Creates examples for the training and dev sets."""
examples = []
for (i, line) in enumerate(lines):
# Skip the header row
if i == 0:
continue
guid = "%s-%s" % (set_type, i)
if set_type == "test":
# Headers: id, news_id, text_a, text_b, s_obj, t_obj
label = "0"
text_a = tokenization.convert_to_unicode(line[2])
text_b = tokenization.convert_to_unicode(line[3])
s_obj = tokenization.convert_to_unicode(line[4])
t_obj = tokenization.convert_to_unicode(line[5])
else:
# Headers: id, news_id, label, text_a, text_b, s_obj, t_obj
label = tokenization.convert_to_unicode(line[2])
text_a = tokenization.convert_to_unicode(line[3])
text_b = tokenization.convert_to_unicode(line[4])
s_obj = tokenization.convert_to_unicode(line[5])
t_obj = tokenization.convert_to_unicode(line[6])
examples.append(
InputExample(guid=guid,
text_a=text_a,
text_b=text_b,
s_obj=int(s_obj),
t_obj=int(t_obj),
label=label))
if set_type == "train":
random.Random(RANDOM_SEED).shuffle(examples)
return examples
class SAE_3PM_Processor(SAE_2PM_Processor):
def get_labels(self):
"""See base class."""
return ["0", "1", "2"]
# File: tests/envelopes/test_ahdsr.py (repo: Nikolay-Lysenko/sinethesizer, license: MIT)
"""
Test `sinethesizer.envelopes.ahdsr` module.
Author: Nikolay Lysenko
"""
import numpy as np
import pytest
from sinethesizer.envelopes.ahdsr import (
create_generic_ahdsr_envelope,
create_relative_ahdsr_envelope,
create_trapezoid_envelope
)
from sinethesizer.synth.core import Event
@pytest.mark.parametrize(
"duration, velocity, frame_rate, "
"attack_to_ahds_max_ratio, max_attack_duration, attack_degree, "
"hold_to_hds_max_ratio, max_hold_duration, "
"decay_to_ds_max_ratio, max_decay_duration, decay_degree, "
"sustain_level, max_sustain_duration, "
"max_release_duration, release_duration_on_velocity_order, "
"release_degree, "
"peak_value, ratio_at_zero_velocity, envelope_values_on_velocity_order, "
"expected",
[
(
1.0, # `duration`
1.0, # `velocity`
20, # `frame_rate`
0.2, # `attack_to_ahds_max_ratio`
0.25, # `max_attack_duration`
1.0, # `attack_degree`
0.1, # `hold_to_hds_max_ratio`
0.05, # `max_hold_duration`
0.3, # `decay_to_ds_max_ratio`
0.25, # `max_decay_duration`
1.0, # `decay_degree`
0.6, # `sustain_level`
1.0, # `max_sustain_duration`
0.4, # `max_release_duration`
0.5, # `release_duration_on_velocity_order`
1.0, # `release_degree`
1.0, # `peak_value`
0.3, # `ratio_at_zero_velocity`
1.0, # `envelope_values_on_velocity_order`
np.array([
# Attack
0, 1 / 3, 2 / 3, 1.0,
# Hold
1.0,
# Decay
1.0, 1 - 0.4 / 3, 1 - 0.8 / 3, 0.6,
# Sustain
0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6,
# Release
0.6, 6 / 7 * 0.6, 5 / 7 * 0.6, 4 / 7 * 0.6, 3 / 7 * 0.6,
2 / 7 * 0.6, 1 / 7 * 0.6, 0.0
])
),
(
1.0, # `duration`
2 / 7, # `velocity`
20, # `frame_rate`
0.2, # `attack_to_ahds_max_ratio`
0.25, # `max_attack_duration`
1.0, # `attack_degree`
0.1, # `hold_to_hds_max_ratio`
0.05, # `max_hold_duration`
0.3, # `decay_to_ds_max_ratio`
0.25, # `max_decay_duration`
1.0, # `decay_degree`
0.6, # `sustain_level`
1.0, # `max_sustain_duration`
0.4, # `max_release_duration`
0.5, # `release_duration_on_velocity_order`
1.0, # `release_degree`
1.0, # `peak_value`
0.3, # `ratio_at_zero_velocity`
1.0, # `envelope_values_on_velocity_order`
np.array([
# Attack
0, 1 / 6, 1 / 3, 0.5,
# Hold
0.5,
# Decay
0.5, 0.5 - 0.2 / 3, 0.5 - 0.4 / 3, 0.3,
# Sustain
0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3,
# Release
0.3, 0.2, 0.1, 0.0
])
),
(
1.0, # `duration`
2 / 7, # `velocity`
20, # `frame_rate`
0.2, # `attack_to_ahds_max_ratio`
0.25, # `max_attack_duration`
1.0, # `attack_degree`
0.1, # `hold_to_hds_max_ratio`
0.05, # `max_hold_duration`
0.3, # `decay_to_ds_max_ratio`
0.25, # `max_decay_duration`
1.0, # `decay_degree`
0.6, # `sustain_level`
1.0, # `max_sustain_duration`
0.4, # `max_release_duration`
0.0, # `release_duration_on_velocity_order`
1.0, # `release_degree`
1.0, # `peak_value`
0.3, # `ratio_at_zero_velocity`
1.0, # `envelope_values_on_velocity_order`
np.array([
# Attack
0, 1 / 6, 1 / 3, 0.5,
# Hold
0.5,
# Decay
0.5, 0.5 - 0.2 / 3, 0.5 - 0.4 / 3, 0.3,
# Sustain
0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3,
# Release
0.3, 6 / 7 * 0.3, 5 / 7 * 0.3, 4 / 7 * 0.3, 3 / 7 * 0.3,
2 / 7 * 0.3, 1 / 7 * 0.3, 0.0
])
),
(
1.0, # `duration`
1.0, # `velocity`
20, # `frame_rate`
0.2, # `attack_to_ahds_max_ratio`
0.25, # `max_attack_duration`
1.0, # `attack_degree`
0.1, # `hold_to_hds_max_ratio`
0.05, # `max_hold_duration`
0.3, # `decay_to_ds_max_ratio`
0.25, # `max_decay_duration`
1.0, # `decay_degree`
0.6, # `sustain_level`
1.0, # `max_sustain_duration`
0.4, # `max_release_duration`
0.5, # `release_duration_on_velocity_order`
1.0, # `release_degree`
2.0, # `peak_value`
0.3, # `ratio_at_zero_velocity`
1.0, # `envelope_values_on_velocity_order`
np.array([
# Attack
0, 2 / 3, 4 / 3, 2.0,
# Hold
2.0,
# Decay
2.0, 2 - 0.8 / 3, 2 - 1.6 / 3, 1.2,
# Sustain
1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2,
# Release
1.2, 6 / 7 * 1.2, 5 / 7 * 1.2, 4 / 7 * 1.2, 3 / 7 * 1.2,
2 / 7 * 1.2, 1 / 7 * 1.2, 0.0
])
),
(
3.0, # `duration`
1.0, # `velocity`
20, # `frame_rate`
0.2, # `attack_to_ahds_max_ratio`
0.25, # `max_attack_duration`
1.0, # `attack_degree`
0.0, # `hold_to_hds_max_ratio`
0.0, # `max_hold_duration`
0.25, # `decay_to_ds_max_ratio`
1.0, # `max_decay_duration`
1.0, # `decay_degree`
0.6, # `sustain_level`
1.5, # `max_sustain_duration`
1.05, # `max_release_duration`
0.6, # `release_duration_on_velocity_order`
1.0, # `release_degree`
1.0, # `peak_value`
0.3, # `ratio_at_zero_velocity`
1.0, # `envelope_values_on_velocity_order`
np.array([
# Attack
0, 0.25, 0.5, 0.75, 1.0,
# No hold
# Decay
1.0, 1.0 - 0.1 / 3, 1.0 - 2 * 0.1 / 3, 1.0 - 3 * 0.1 / 3,
1.0 - 4 * 0.1 / 3, 1.0 - 5 * 0.1 / 3, 1.0 - 6 * 0.1 / 3,
1.0 - 7 * 0.1 / 3, 1.0 - 8 * 0.1 / 3, 1.0 - 9 * 0.1 / 3,
1.0 - 10 * 0.1 / 3, 1.0 - 11 * 0.1 / 3, 1.0 - 12 * 0.1 / 3,
# Sustain
0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6,
0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6,
0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6,
# Release
0.6, 0.57, 0.54, 0.51, 0.48, 0.45, 0.42, 0.39, 0.36, 0.33,
0.3, 0.27, 0.24, 0.21, 0.18, 0.15, 0.12, 0.09, 0.06, 0.03, 0.0
])
),
(
1.0, # `duration`
0.5, # `velocity`
20, # `frame_rate`
0.2, # `attack_to_ahds_max_ratio`
0.0, # `max_attack_duration`
1.0, # `attack_degree`
0.0, # `hold_to_hds_max_ratio`
0.0, # `max_hold_duration`
0.25, # `decay_to_ds_max_ratio`
0.0, # `max_decay_duration`
1.0, # `decay_degree`
0.6, # `sustain_level`
1.0, # `max_sustain_duration`
0.3, # `max_release_duration`
1.0, # `release_duration_on_velocity_order`
1.0, # `release_degree`
1.0, # `peak_value`
0.0, # `ratio_at_zero_velocity`
1.0, # `envelope_values_on_velocity_order`
np.array([
# Sustain
0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3,
0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3,
# Release
0.3, 0.15, 0.0
])
),
(
1.0, # `duration`
0.1, # `velocity`
20, # `frame_rate`
0.2, # `attack_to_ahds_max_ratio`
0.0, # `max_attack_duration`
1.0, # `attack_degree`
0.0, # `hold_to_hds_max_ratio`
0.0, # `max_hold_duration`
0.25, # `decay_to_ds_max_ratio`
0.0, # `max_decay_duration`
1.0, # `decay_degree`
1.0, # `sustain_level`
1.0, # `max_sustain_duration`
0.3, # `max_release_duration`
1.0, # `release_duration_on_velocity_order`
1.0, # `release_degree`
1.0, # `peak_value`
0.0, # `ratio_at_zero_velocity`
1.0, # `envelope_values_on_velocity_order`
np.array([
# Sustain
0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1,
0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1
])
),
]
)
def test_create_generic_ahdsr_envelope(
duration: float, velocity: float, frame_rate: int,
attack_to_ahds_max_ratio: float,
max_attack_duration: float,
attack_degree: float,
hold_to_hds_max_ratio: float,
max_hold_duration: float,
decay_to_ds_max_ratio: float,
max_decay_duration: float,
decay_degree: float,
sustain_level: float,
max_sustain_duration: float,
max_release_duration: float,
release_duration_on_velocity_order: float,
release_degree: float,
peak_value: float,
ratio_at_zero_velocity: float,
envelope_values_on_velocity_order: float,
expected: np.ndarray
) -> None:
"""Test `create_generic_ahdsr_envelope` function."""
event = Event(
instrument='any_instrument',
start_time=0,
duration=duration,
frequency=440,
velocity=velocity,
effects='',
frame_rate=frame_rate
)
result = create_generic_ahdsr_envelope(
event,
attack_to_ahds_max_ratio, max_attack_duration, attack_degree,
hold_to_hds_max_ratio, max_hold_duration,
decay_to_ds_max_ratio, max_decay_duration, decay_degree,
sustain_level, max_sustain_duration,
max_release_duration, release_duration_on_velocity_order,
release_degree,
peak_value, ratio_at_zero_velocity, envelope_values_on_velocity_order
)
np.testing.assert_almost_equal(result, expected)
@pytest.mark.parametrize(
"duration, velocity, frame_rate, "
"attack_to_ahds_ratio, attack_degree, hold_to_ahds_ratio, "
"decay_to_ahds_ratio, decay_degree, sustain_level, "
"max_release_duration, release_duration_on_velocity_order, "
"release_degree, "
"peak_value, ratio_at_zero_velocity, envelope_values_on_velocity_order, "
"expected",
[
(
1.0, # `duration`
1.0, # `velocity`
10, # `frame_rate`
0.2, # `attack_to_ahds_ratio`
1.0, # `attack_degree`
0.2, # `hold_to_ahds_ratio`
0.2, # `decay_to_ahds_ratio`
1.0, # `decay_degree`
0.6, # `sustain_level`
0.4, # `max_release_duration`
1.0, # `release_duration_on_velocity_order`
1.0, # `release_degree`
1.0, # `peak_value`
0.0, # `ratio_at_zero_velocity`
1.0, # `envelope_values_on_velocity_order`
np.array([
# Attack
0, 1.0,
# Hold
1.0, 1.0,
# Decay
1.0, 0.6,
# Sustain
0.6, 0.6, 0.6, 0.6,
# Release
0.6, 0.4, 0.2, 0.0
])
),
(
1.0, # `duration`
0.5, # `velocity`
10, # `frame_rate`
0.0, # `attack_to_ahds_ratio`
1.0, # `attack_degree`
0.0, # `hold_to_ahds_ratio`
0.0, # `decay_to_ahds_ratio`
1.0, # `decay_degree`
0.6, # `sustain_level`
1.0, # `max_release_duration`
1.0, # `release_duration_on_velocity_order`
1.0, # `release_degree`
1.0, # `peak_value`
0.0, # `ratio_at_zero_velocity`
0.0, # `envelope_values_on_velocity_order`
np.array([
# Sustain
0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6,
# Release
0.6, 0.45, 0.3, 0.15, 0.0
])
),
]
)
def test_create_relative_ahdsr_envelope(
duration: float, velocity: float, frame_rate: int,
attack_to_ahds_ratio: float, attack_degree: float,
hold_to_ahds_ratio: float,
decay_to_ahds_ratio: float, decay_degree: float,
sustain_level: float,
max_release_duration: float, release_duration_on_velocity_order: float,
release_degree: float,
peak_value: float, ratio_at_zero_velocity: float,
envelope_values_on_velocity_order: float,
expected: np.ndarray
) -> None:
"""Test `create_relative_ahdsr_envelope` function."""
event = Event(
instrument='any_instrument',
start_time=0,
duration=duration,
frequency=440,
velocity=velocity,
effects='',
frame_rate=frame_rate
)
result = create_relative_ahdsr_envelope(
event,
attack_to_ahds_ratio, attack_degree, hold_to_ahds_ratio,
decay_to_ahds_ratio, decay_degree, sustain_level,
max_release_duration, release_duration_on_velocity_order,
release_degree,
peak_value, ratio_at_zero_velocity, envelope_values_on_velocity_order
)
np.testing.assert_almost_equal(result, expected)
@pytest.mark.parametrize(
"duration, velocity, frame_rate, "
"attack_share, attack_degree, decay_share, decay_degree, "
"peak_value, ratio_at_zero_velocity, envelope_values_on_velocity_order, "
"expected",
[
(
1.0, # `duration`
1.0, # `velocity`
10, # `frame_rate`
0.2, # `attack_share`
1.0, # `attack_degree`
0.5, # `decay_share`
1.0, # `decay_degree`
1.0, # `peak_value`
0.0, # `ratio_at_zero_velocity`
0.0, # `envelope_values_on_velocity_order`
np.array([
# Attack
0, 1.0,
# Hold
1.0, 1.0, 1.0,
# Decay
1.0, 0.75, 0.5, 0.25, 0.0
])
),
]
)
def test_create_trapezoid_envelope(
duration: float, velocity: float, frame_rate: int,
attack_share: float, attack_degree: float,
decay_share: float, decay_degree: float,
peak_value: float, ratio_at_zero_velocity: float,
envelope_values_on_velocity_order: float,
expected: np.ndarray
) -> None:
"""Test `create_trapezoid_envelope` function."""
event = Event(
instrument='any_instrument',
start_time=0,
duration=duration,
frequency=440,
velocity=velocity,
effects='',
frame_rate=frame_rate
)
result = create_trapezoid_envelope(
event,
attack_share, attack_degree, decay_share, decay_degree,
peak_value, ratio_at_zero_velocity, envelope_values_on_velocity_order
)
np.testing.assert_almost_equal(result, expected)
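For orientation, the trapezoid case above (duration 1.0 s at 10 fps, `attack_share` 0.2, `decay_share` 0.5, linear degrees) can be reproduced with a few lines of NumPy. The helper below is an illustrative sketch only, not the `create_trapezoid_envelope` implementation under test: it covers just the degree-1 case and ignores velocity mapping and the other parameters.

```python
import numpy as np

def simple_trapezoid(duration, frame_rate, attack_share, decay_share, peak=1.0):
    """Linear attack / hold / linear decay envelope (degree-1 sketch)."""
    n = int(round(duration * frame_rate))
    n_attack = int(round(attack_share * n))
    n_decay = int(round(decay_share * n))
    n_hold = n - n_attack - n_decay
    attack = np.linspace(0.0, peak, n_attack)   # rising edge ends exactly at `peak`
    hold = np.full(n_hold, peak)                # flat top
    decay = np.linspace(peak, 0.0, n_decay)     # falling edge ends exactly at 0
    return np.concatenate([attack, hold, decay])

envelope = simple_trapezoid(1.0, 10, 0.2, 0.5)
# envelope == [0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.75, 0.5, 0.25, 0.0]
```

The rounding of the share counts is one possible convention; the tested library may apportion frames differently for non-divisible shares.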
417dfb4b602f2abc0ce5734a78165a2a6460652e | 20,365 | py | Python | lib/kinematics/DifferentialDQ.py | zdynamics/uRobot | cdcb175ac94f62f3ccec6913bb53d9fba736850e | [
"MIT"
] | 1 | 2021-12-24T22:01:32.000Z | 2021-12-24T22:01:32.000Z | lib/kinematics/DifferentialDQ.py | AssemblingTheFuture/zRobotics | cdcb175ac94f62f3ccec6913bb53d9fba736850e | [
"MIT"
] | null | null | null | lib/kinematics/DifferentialDQ.py | AssemblingTheFuture/zRobotics | cdcb175ac94f62f3ccec6913bb53d9fba736850e | [
"MIT"
] | null | null | null |
# Access to parent folder to get its files
import sys, os
sys.path.append(sys.path[0].replace(r'/lib/kinematics', r''))
# Libraries
import numpy as np
from lib.kinematics.DQ import *
from lib.movements.DQ import *
from sympy import *
def dqVelocityPropagation(robot : object, w0 : np.array, qd : np.array, symbolic = False):
"""Using Dual Quaternions, this function computes velocity (both linear and angular) to the i-th reference frame of a serial robot given initial velocity. Serial robot's kinematic parameters have to be set before using this function
Args:
robot (object): serial robot (this won't work with other types of robots)
w0 (np.array): initial velocity of the system (equal to zero if the robot's base is not mobile)
qd (np.array): velocities of each joint
symbolic (bool, optional): used to calculate symbolic equations. Defaults to False.
Returns:
W (np.array): velocity to each reference frame (numerical)
W (SymPy Matrix): velocity to each reference frame (symbolic)
"""
# Initial conditions
W = [w0]
# Calculate forward kinematics to know the position and axis of actuation of each joint
fkDQ = forwardDQ(robot, symbolic)
# Get number of reference frames
m = robot.dhParameters.shape[0]
# Iterate through all reference frames (except the inertial one)
for k in range(1, m):
# Get Denavit - Hartenberg Parameters Matrix of current frame
frame = robot.symbolicDHParameters[k, :]
# Check if this frame contains any of the "n" joints
containedJoints = np.in1d(robot.qSymbolic, frame)
# If any joint is in the current reference frame
if any(element == True for element in containedJoints):
# Get the number of the joint
joint = np.where(containedJoints == True)[0][-1]
# Get pose of reference frame where joint is attached
Q = fkDQ[k - 1]
# Get axis of actuation of the joint (screw vector)
xi = robot.xi[:, joint].reshape((8, 1))
# Relative velocity calculation equals to left(Q) * right(conjugate(Q)) * xi * qdi
wJoint = dqMultiplication(dqMultiplication(Q, xi, symbolic), conjugateDQ(Q, symbolic), symbolic) * qd[joint] if symbolic else dqMultiplication(dqMultiplication(Q, xi), conjugateDQ(Q)) * qd[joint]
else:
# Relative angular velocity calculation equals to zero (no effects caused by joints)
wJoint = zeros(8, 1) if symbolic else np.zeros((8, 1))
# Create relative position cross operator between i-th and i + 1 frames
ri = crossOperatorExtension(dqToR3(fkDQ[k], symbolic) - dqToR3(fkDQ[k - 1], symbolic), symbolic)
# Relative position matrix between i-th and i + 1 frames
Mi = Matrix([[eye(4), zeros(4)], [-ri, eye(4)]]) if symbolic else np.append(np.append(np.eye(4), np.zeros((4, 4)), axis = 1), np.append(-ri, np.eye(4), axis = 1), axis = 0)
# Create position cross operator for i + 1 reference frame
r = crossOperatorExtension(dqToR3(fkDQ[k], symbolic), symbolic)
# Create inverse position matrix for i + 1 reference frame
M = Matrix([[eye(4), zeros(4)], [-r, eye(4)]]) if symbolic else np.append(np.append(np.eye(4), np.zeros((4, 4)), axis = 1), np.append(-r, np.eye(4), axis = 1), axis = 0)
# Calculate velocity up to this point
w = trigsimp((Mi * W[-1]) + (M * wJoint)) if symbolic else Mi.dot(W[-1]) + M.dot(wJoint)
# Append each calculated velocity
W.append(nsimplify(w.evalf(), tolerance = 1e-10) if symbolic else w)
return W
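The 8x8 matrices `Mi` and `M` built in the loop above have the block form [[I, 0], [-r^, I]], where r^ is a 4x4 zero-padded embedding of the 3x3 skew-symmetric cross operator. The sketch below rebuilds that matrix with plain NumPy (the helper names are illustrative stand-ins for the library's `crossOperatorExtension`, assumed here to use a (w, x, y, z) quaternion layout) and checks that a pure rotation about z at position x picks up a linear velocity along y.

```python
import numpy as np

def cross_operator_ext(r):
    """4x4 zero-padded skew operator: vector part maps v -> r x v (assumed layout)."""
    rx, ry, rz = r
    return np.array([[0.0, 0.0, 0.0, 0.0],
                     [0.0, 0.0, -rz, ry],
                     [0.0, rz, 0.0, -rx],
                     [0.0, -ry, rx, 0.0]])

def position_matrix(r):
    """8x8 block matrix [[I, 0], [-r^, I]], as used in the propagation loop."""
    R = cross_operator_ext(r)
    return np.block([[np.eye(4), np.zeros((4, 4))],
                     [-R, np.eye(4)]])

w = np.array([0.0, 0.0, 0.0, 1.0])        # angular velocity: 1 rad/s about z
W = np.concatenate([w, np.zeros(4)])      # dual velocity with zero linear part
v = position_matrix(np.array([1.0, 0.0, 0.0])) @ W
# linear (dual) part is w x r = (0, 1, 0): motion along y, as expected
```

The minus sign on the lower-left block converts `r x w` into `w x r`, which is the familiar rigid-body relation v = w x r.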
def dqAccelerationPropagation(robot : object, dw0 : np.array, Wdq : list, qd : np.array, qdd : np.array, symbolic = False):
"""Using Dual Quaternions, this function computes acceleration (both linear and angular) to the i-th reference frame of a serial robot given initial acceleration. Serial robot's kinematic parameters have to be set before using this function
Args:
robot (object): serial robot (this won't work with other types of robots)
dw0 (np.array): initial acceleration of the system (equal to zero if the robot's base is not mobile)
Wdq (list): inertial velocity of the system using dual quaternions (equal to zero if the robot's base is not mobile)
qd (np.array): velocities of each joint
qdd (np.array): accelerations of each joint
symbolic (bool, optional): used to calculate symbolic equations. Defaults to False.
Returns:
dW (np.array): acceleration to each reference frame (numerical)
dW (SymPy Matrix): acceleration to each reference frame (symbolic)
"""
# Initial conditions
dW = [dw0]
# Calculate forward kinematics to know the position and axis of actuation of each joint
fkDQ = forwardDQ(robot, symbolic)
# Get number of reference frames
m = robot.dhParameters.shape[0]
# Iterate through all reference frames (except the inertial one)
for k in range(1, m):
# Get Denavit - Hartenberg Parameters Matrix of current frame
frame = robot.symbolicDHParameters[k, :]
# Check if this frame contains any of the "n" joints
containedJoints = np.in1d(robot.qSymbolic, frame)
# If any joint is in the current reference frame
if any(element == True for element in containedJoints):
# Get the number of the joint
joint = np.where(containedJoints == True)[0][-1]
# Get pose of reference frame where joint is attached
Q = fkDQ[k - 1]
# Get axis of actuation of the joint (screw vector)
xi = robot.xi[:, joint].reshape((8, 1))
# Get derivative of axis of actuation of the joint (screw vector)
xid = robot.xid[:, joint].reshape((8, 1))
# Relative acceleration calculation equals to left(Q) * right(conjugate(Q)) * (xid * qdi + xi * qddi)
dwJoint = dqMultiplication(dqMultiplication(Q, (xid * qd[joint]) + (xi * qdd[joint]), symbolic), conjugateDQ(Q, symbolic), symbolic) if symbolic else dqMultiplication(dqMultiplication(Q, (xid * qd[joint]) + (xi * qdd[joint])), conjugateDQ(Q))
# Create cross operator for the position of the i-th reference frame
ri = crossOperatorExtension(dqToR3(fkDQ[k - 1], symbolic), symbolic)
# Create position matrix for i-th reference frame
Mi = Matrix([[eye(4), zeros(4)], [ri, eye(4)]]) if symbolic else np.append(np.append(np.eye(4), np.zeros((4, 4)), axis = 1), np.append(ri, np.eye(4), axis = 1), axis = 0)
# Dual velocity of the i-th frame seen from itself
dualW = dqMultiplication(dqMultiplication(conjugateDQ(Q, symbolic), Mi * Wdq[k - 1], symbolic), Q, symbolic) if symbolic else dqMultiplication(dqMultiplication(conjugateDQ(Q), Mi.dot(Wdq[k - 1])), Q)
# Centripetal effect of the joint with respect to the inertial frame
dwCentripetalJoint = dqMultiplication(dqMultiplication(Q, dualCrossOperator(dualW, symbolic) * xi * qd[joint], symbolic), conjugateDQ(Q, symbolic), symbolic) if symbolic else dqMultiplication(dqMultiplication(Q, dualCrossOperator(dualW).dot(xi * qd[joint])), conjugateDQ(Q))
else:
# Relative acceleration calculation equals to zero (no effects caused by joints)
dwJoint = zeros(8, 1) if symbolic else np.zeros((8, 1))
# Centripetal acceleration calculation to i-th frame (with respect to inertial one) equals to zero (no effects caused by joints)
dwCentripetalJoint = zeros(8, 1) if symbolic else np.zeros((8, 1))
# Create relative position cross operator between i-th and i + 1 frames
ri = crossOperatorExtension(dqToR3(fkDQ[k], symbolic) - dqToR3(fkDQ[k - 1], symbolic), symbolic)
# Relative position matrix between i-th and i + 1 frames
Mi = Matrix([[eye(4), zeros(4)], [-ri, eye(4)]]) if symbolic else np.append(np.append(np.eye(4), np.zeros((4, 4)), axis = 1), np.append(-ri, np.eye(4), axis = 1), axis = 0)
# Create position cross operator for i + 1 reference frame
r = crossOperatorExtension(dqToR3(fkDQ[k], symbolic), symbolic)
# Create inverse position matrix for i + 1 reference frame
M = Matrix([[eye(4), zeros(4, 4)], [-r, eye(4)]]) if symbolic else np.append(np.append(np.eye(4), np.zeros((4, 4)), axis = 1), np.append(-r, np.eye(4), axis = 1), axis = 0)
# Centripetal effects
dwCentripetal = Matrix([[zeros(4, 1)], [(crossOperatorExtension(Wdq[k][0 : 4, :], symbolic) * Wdq[k][4 : 8, :]) - (crossOperatorExtension(Wdq[k - 1][0 : 4, :], symbolic) * Wdq[k - 1][4 : 8, :])]]) if symbolic else np.append(np.zeros((4, 1)), crossOperatorExtension(Wdq[k][0 : 4, :]).dot(Wdq[k][4 : 8, :]) - crossOperatorExtension(Wdq[k - 1][0 : 4, :]).dot(Wdq[k - 1][4 : 8, :]), axis = 0)
# Calculate angular velocity up to this point
dw = trigsimp((Mi * dW[-1]) + dwCentripetal + M * (dwCentripetalJoint + dwJoint)) if symbolic else Mi.dot(dW[-1]) + dwCentripetal + M.dot(dwCentripetalJoint + dwJoint)
# Append each calculated angular velocity
dW.append(nsimplify(dw.evalf(), tolerance = 1e-10) if symbolic else dw)
return dW
def dqVelocityPropagationCOM(robot : object, WdqCOM0 : np.array, Wdq : list, qd : np.array, symbolic = False):
"""Using Dual Quaternions, this function computes angular and linear velocity to the j-th center of mass of a serial robot given reference frames' ones. Serial robot's kinematic parameters have to be set before using this function
Args:
robot (object): serial robot (this won't work with other types of robots)
WdqCOM0 (np.array): initial velocity of the system (equal to zero if the robot's base is not mobile)
Wdq (list): angular and linear velocities of the system (these have to be calculated with the "dqVelocityPropagation" function)
qd (np.array): velocities of each joint
symbolic (bool, optional): used to calculate symbolic equations. Defaults to False.
Returns:
WdqCOM (np.array): angular and linear velocity of each center of mass (numerical)
WdqCOM (SymPy Matrix): angular and linear velocity of each center of mass (symbolic)
"""
# Initial conditions
WdqCOM = [WdqCOM0]
# Calculate forward kinematics to know the position and axis of actuation of each joint
fkDQ = forwardDQ(robot, symbolic)
# Calculate forward kinematics to know the position and axis of actuation of each center of mass
fkCOMDQ = forwardCOMDQ(robot, symbolic)
# Get number of reference frames
m = robot.dhParameters.shape[0]
# Iterate through all reference frames (except the inertial one)
for k in range(1, m):
# Get Denavit - Hartenberg Parameters Matrix of current frame
frame = robot.symbolicDHParametersCOM[k, :]
# Check if current frame contains any of the joints
containedJoints = np.in1d(robot.qSymbolic, frame)
# Check if current frame contains any of the centers of mass
containedCOMs = np.in1d(robot.symbolicCOMs, frame)
# If any joint is in the current reference frame
if any(element == True for element in containedJoints):
# Get the number of the associated joint
joint = np.where(containedJoints == True)[0][-1]
# Get pose of reference frame where joint is attached
Q = fkDQ[k - 1]
# Get axis of actuation of the joint (screw vector)
xi = robot.xi[:, joint].reshape((8, 1))
# Relative velocity calculation equals to left(Q) * right(conjugate(Q)) * xi * qdi
wJoint = dqMultiplication(dqMultiplication(Q, xi, symbolic), conjugateDQ(Q, symbolic), symbolic) * qd[joint] if symbolic else dqMultiplication(dqMultiplication(Q, xi), conjugateDQ(Q)) * qd[joint]
# If there's no joint in current frame
else:
# Relative velocity calculation equals to zero (no effects caused by joints)
wJoint = zeros(8, 1) if symbolic else np.zeros((8, 1))
# If any center of mass is in the current reference frame
if any(element == True for element in containedCOMs):
# Get the number of the center of mass (the sum is because of the way Python indexes arrays)
COM = np.where(containedCOMs == True)[0][-1] + 1
# Get relative position of center of mass
ri = crossOperatorExtension(dqToR3(fkCOMDQ[COM], symbolic) - dqToR3(fkDQ[k - 1], symbolic), symbolic)
# Relative position matrix between i-th and center of mass frames
Mi = Matrix([[eye(4), zeros(4)], [-ri, eye(4)]]) if symbolic else np.append(np.append(np.eye(4), np.zeros((4, 4)), axis = 1), np.append(-ri, np.eye(4), axis = 1), axis = 0)
# Create position cross operator for the reference frame of the center of mass
rCOM = crossOperatorExtension(dqToR3(fkCOMDQ[COM], symbolic), symbolic)
# Create inverse position matrix for the reference frame of the center of mass
Mcom = Matrix([[eye(4), zeros(4)], [-rCOM, eye(4)]]) if symbolic else np.append(np.append(np.eye(4), np.zeros((4, 4)), axis = 1), np.append(-rCOM, np.eye(4), axis = 1), axis = 0)
# Calculate velocity up to this point
wCOM = trigsimp((Mi * Wdq[k - 1]) + (Mcom * wJoint)) if symbolic else Mi.dot(Wdq[k - 1]) + Mcom.dot(wJoint)
# Append each calculated velocity
WdqCOM.append(nsimplify(wCOM.evalf(), tolerance = 1e-10) if symbolic else wCOM)
return WdqCOM
def dqAccelerationPropagationCOM(robot : object, dWdqCOM0 : np.array, Wdq : list, WdqCOM : list, dWdq : list, qd : np.array, qdd : np.array, symbolic = False):
"""Using Dual Quaternions, this function computes angular and linear acceleration to the j-th center of mass of a serial robot given reference frames' ones. Serial robot's kinematic parameters have to be set before using this function
Args:
robot (object): serial robot (this won't work with other types of robots)
dWdqCOM0 (np.array): initial acceleration of the system (equal to zero if the robot's base is not mobile)
Wdq (list): angular and linear velocities of the system (these have to be calculated with the "dqVelocityPropagation" function)
WdqCOM (list): angular and linear velocities of the centers of mass (these have to be calculated with the "dqVelocityPropagationCOM" function)
dWdq (list): angular and linear accelerations of the system (these have to be calculated with the "dqAccelerationPropagation" function)
qd (np.array): velocities of each joint
qdd (np.array): accelerations of each joint
symbolic (bool, optional): used to calculate symbolic equations. Defaults to False.
Returns:
dWdqCOM (np.array): angular and linear acceleration of each center of mass (numerical)
dWdqCOM (SymPy Matrix): angular and linear acceleration of each center of mass (symbolic)
"""
# Initial conditions
dWdqCOM = [dWdqCOM0]
# Calculate forward kinematics to know the position and axis of actuation of each joint
fkDQ = forwardDQ(robot, symbolic)
# Calculate forward kinematics to know the position and axis of actuation of each center of mass
fkCOMDQ = forwardCOMDQ(robot, symbolic)
# Get number of reference frames
m = robot.dhParameters.shape[0]
# Iterate through all reference frames (except the inertial one)
for k in range(1, m):
# Get Denavit - Hartenberg Parameters Matrix of current frame
frame = robot.symbolicDHParametersCOM[k, :]
# Check if current frame contains any of the joints
containedJoints = np.in1d(robot.qSymbolic, frame)
# Check if current frame contains any of the centers of mass
containedCOMs = np.in1d(robot.symbolicCOMs, frame)
# If any joint is in the current reference frame
if any(element == True for element in containedJoints):
# Get the number of the associated joint
joint = np.where(containedJoints == True)[0][-1]
# Get pose of reference frame where joint is attached
Q = fkDQ[k - 1]
# Get axis of actuation of the joint (screw vector)
xi = robot.xi[:, joint].reshape((8, 1))
# Get derivative of axis of actuation of the joint (screw vector)
xid = robot.xid[:, joint].reshape((8, 1))
# Relative acceleration calculation equals to left(Q) * right(conjugate(Q)) * (xid * qdi + xi * qddi)
dwJoint = dqMultiplication(dqMultiplication(Q, (xid * qd[joint]) + (xi * qdd[joint]), symbolic), conjugateDQ(Q, symbolic), symbolic) if symbolic else dqMultiplication(dqMultiplication(Q, (xid * qd[joint]) + (xi * qdd[joint])), conjugateDQ(Q))
# Create cross operator for the position of the i-th reference frame
ri = crossOperatorExtension(dqToR3(fkDQ[k - 1], symbolic), symbolic)
# Create position matrix for i-th reference frame
Mi = Matrix([[eye(4), zeros(4)], [ri, eye(4)]]) if symbolic else np.append(np.append(np.eye(4), np.zeros((4, 4)), axis = 1), np.append(ri, np.eye(4), axis = 1), axis = 0)
# Dual velocity of the i-th frame seen from itself
dualW = dqMultiplication(dqMultiplication(conjugateDQ(Q, symbolic), Mi * Wdq[k - 1], symbolic), Q, symbolic) if symbolic else dqMultiplication(dqMultiplication(conjugateDQ(Q), Mi.dot(Wdq[k - 1])), Q)
# Centripetal effect of the joint with respect to the inertial frame
dwCentripetalJoint = dqMultiplication(dqMultiplication(Q, dualCrossOperator(dualW, symbolic) * xi * qd[joint], symbolic), conjugateDQ(Q, symbolic), symbolic) if symbolic else dqMultiplication(dqMultiplication(Q, dualCrossOperator(dualW).dot(xi * qd[joint])), conjugateDQ(Q))
# If there's no joint in current frame
else:
# Relative acceleration calculation equals to zero (no effects caused by joints)
dwJoint = zeros(8, 1) if symbolic else np.zeros((8, 1))
# Centripetal acceleration calculation to i-th frame (with respect to inertial one) equals to zero (no effects caused by joints)
dwCentripetalJoint = zeros(8, 1) if symbolic else np.zeros((8, 1))
# If any center of mass is in the current reference frame
if any(element == True for element in containedCOMs):
# Get the number of the center of mass (the sum is because of the way Python indexes arrays)
COM = np.where(containedCOMs == True)[0][-1] + 1
# Get relative position of center of mass
ri = crossOperatorExtension(dqToR3(fkCOMDQ[COM], symbolic) - dqToR3(fkDQ[k - 1], symbolic), symbolic)
# Relative position matrix between i-th and center of mass frames
Mi = Matrix([[eye(4), zeros(4)], [-ri, eye(4)]]) if symbolic else np.append(np.append(np.eye(4), np.zeros((4, 4)), axis = 1), np.append(-ri, np.eye(4), axis = 1), axis = 0)
# Create position cross operator for the reference frame of the center of mass
rCOM = crossOperatorExtension(dqToR3(fkCOMDQ[COM], symbolic), symbolic)
# Create inverse position matrix for the reference frame of the center of mass
Mcom = Matrix([[eye(4), zeros(4)], [-rCOM, eye(4)]]) if symbolic else np.append(np.append(np.eye(4), np.zeros((4, 4)), axis = 1), np.append(-rCOM, np.eye(4), axis = 1), axis = 0)
# Centripetal effects
dwCentripetalCOM = Matrix([[zeros(4, 1)], [(crossOperatorExtension(WdqCOM[COM][0 : 4, :], symbolic) * WdqCOM[COM][4 : 8, :]) - (crossOperatorExtension(Wdq[k - 1][0 : 4, :], symbolic) * Wdq[k - 1][4 : 8, :])]]) if symbolic else np.append(np.zeros((4, 1)), crossOperatorExtension(WdqCOM[COM][0 : 4, :]).dot(WdqCOM[COM][4 : 8, :]) - crossOperatorExtension(Wdq[k - 1][0 : 4, :]).dot(Wdq[k - 1][4 : 8, :]), axis = 0)
# Calculate angular velocity up to this point
dwCOM = trigsimp((Mi * dWdq[k - 1]) + dwCentripetalCOM + Mcom * (dwCentripetalJoint + dwJoint)) if symbolic else Mi.dot(dWdq[k - 1]) + dwCentripetalCOM + Mcom.dot(dwCentripetalJoint + dwJoint)
# Append each calculated velocity
dWdqCOM.append(nsimplify(dwCOM.evalf(), tolerance = 1e-10) if symbolic else dwCOM)
return dWdqCOM
if __name__ == '__main__':
"""
THIS SECTION IS FOR TESTING PURPOSES ONLY
"""
print("Z")
419c1794c5d1f69674d63651ed3ec61b4c7b5fab | 130 | py | Python | pypadre/pod/app/__init__.py | padre-lab-eu/pypadre | c244a5f1d4eb7bf168cc06dd9b43416883534268 | [
"MIT"
] | 3 | 2019-12-19T13:29:52.000Z | 2019-12-20T07:32:05.000Z | pypadre/pod/app/__init__.py | padre-lab-eu/pypadre | c244a5f1d4eb7bf168cc06dd9b43416883534268 | [
"MIT"
] | 1 | 2019-12-16T13:39:24.000Z | 2019-12-16T13:39:24.000Z | pypadre/pod/app/__init__.py | padre-lab-eu/pypadre | c244a5f1d4eb7bf168cc06dd9b43416883534268 | [
"MIT"
] | null | null | null |
"""
The package contains
1. The command line interface
2. The application and configuration classes (PadreApp, PadreConfig)
"""
from .padre_app import PadreApp
from .padre_app import PadreConfig
41be161a112ec63bf93a9068b3334a705d87729a | 181 | py | Python | example/tests/orders/test_orders_status.py | icvntechstudio/django-salesman | 017dd31713e37a445500c18e0c7034608f4f62a7 | [
"BSD-3-Clause"
] | 222 | 2020-02-03T16:58:56.000Z | 2022-03-30T16:35:35.000Z | example/tests/orders/test_orders_status.py | icvntechstudio/django-salesman | 017dd31713e37a445500c18e0c7034608f4f62a7 | [
"BSD-3-Clause"
] | 16 | 2020-03-17T12:38:27.000Z | 2022-03-16T13:14:55.000Z | example/tests/orders/test_orders_status.py | icvntechstudio/django-salesman | 017dd31713e37a445500c18e0c7034608f4f62a7 | [
"BSD-3-Clause"
] | 23 | 2020-08-28T04:46:33.000Z | 2022-01-12T21:57:39.000Z |
from salesman.orders.status import BaseOrderStatus
def test_base_order_status():
assert BaseOrderStatus.get_payable() == []
assert BaseOrderStatus.get_transitions() == {}
| 25.857143 | 50 | 0.762431 | 19 | 181 | 7 | 0.736842 | 0.315789 | 0.360902 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132597 | 181 | 6 | 51 | 30.166667 | 0.847134 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.25 | true | 0 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
68bd9ffd4a566767be62b0d6bf3d0efd50ecd499 | 74 | py | Python | tests/transformers/sql_translator/__init__.py | pyrapt/rapt | 0193a07aafff83a887fdc9e5e0f25eafa5b1b205 | [
"MIT"
] | 1 | 2019-08-22T09:39:00.000Z | 2019-08-22T09:39:00.000Z | tests/transformers/sql_translator/__init__.py | pyrapt/rapt | 0193a07aafff83a887fdc9e5e0f25eafa5b1b205 | [
"MIT"
] | null | null | null | tests/transformers/sql_translator/__init__.py | pyrapt/rapt | 0193a07aafff83a887fdc9e5e0f25eafa5b1b205 | [
"MIT"
] | 1 | 2022-03-24T00:51:03.000Z | 2022-03-24T00:51:03.000Z |
from . import test_translation_sequence
from . import test_translation_sql
ec16adcc8901ee11ef8bedf4e825901a718d2a75 | 3,635 | py | Python | tests/test_websockets.py | tasn/mangum | 6da7e51ca8e7979f41291ab3f0e698882f219814 | [
"MIT"
] | 661 | 2020-06-02T01:06:35.000Z | 2022-03-30T22:40:47.000Z | tests/test_websockets.py | tasn/mangum | 6da7e51ca8e7979f41291ab3f0e698882f219814 | [
"MIT"
] | 116 | 2020-06-02T02:14:14.000Z | 2022-03-25T11:54:38.000Z | tests/test_websockets.py | tasn/mangum | 6da7e51ca8e7979f41291ab3f0e698882f219814 | [
"MIT"
] | 55 | 2020-06-02T02:01:26.000Z | 2022-03-16T16:13:09.000Z |
import respx
from mangum import Mangum
@respx.mock(assert_all_mocked=False)
def test_websocket_close(
sqlite3_dsn, mock_ws_connect_event, mock_ws_send_event
) -> None:
async def app(scope, receive, send):
if scope["type"] == "websocket":
while True:
message = await receive()
if message["type"] == "websocket.connect":
await send({"type": "websocket.close"})
handler = Mangum(app, lifespan="off", dsn=sqlite3_dsn)
response = handler(mock_ws_connect_event, {})
assert response == {"statusCode": 200}
response = handler(mock_ws_send_event, {})
assert response == {"statusCode": 403}
@respx.mock(assert_all_mocked=False)
def test_websocket_disconnect(
sqlite3_dsn, mock_ws_connect_event, mock_ws_send_event, mock_websocket_app
) -> None:
handler = Mangum(mock_websocket_app, lifespan="off", dsn=sqlite3_dsn)
response = handler(mock_ws_connect_event, {})
assert response == {"statusCode": 200}
response = handler(mock_ws_send_event, {})
assert response == {"statusCode": 200}
def test_websocket_exception(
sqlite3_dsn, mock_ws_connect_event, mock_ws_send_event
) -> None:
async def app(scope, receive, send):
raise Exception()
handler = Mangum(app, dsn=sqlite3_dsn)
handler(mock_ws_connect_event, {})
handler = Mangum(app, dsn=sqlite3_dsn)
response = handler(mock_ws_send_event, {})
assert response == {"statusCode": 500}
def test_websocket_unexpected_message_error(
sqlite3_dsn, mock_ws_connect_event, mock_ws_send_event
) -> None:
async def app(scope, receive, send):
await send({"type": "websocket.oops", "subprotocol": None})
handler = Mangum(app, dsn=sqlite3_dsn)
handler(mock_ws_connect_event, {})
handler = Mangum(app, dsn=sqlite3_dsn)
response = handler(mock_ws_send_event, {})
assert response == {"statusCode": 500}
@respx.mock(assert_all_mocked=False)
def test_websocket_without_body(
sqlite3_dsn, mock_ws_connect_event, mock_ws_send_event, mock_websocket_app
) -> None:
handler = Mangum(mock_websocket_app, lifespan="off", dsn=sqlite3_dsn)
response = handler(mock_ws_connect_event, {})
assert response == {"statusCode": 200}
del mock_ws_send_event["body"]
response = handler(mock_ws_send_event, {})
assert response == {"statusCode": 200}
@respx.mock(assert_all_mocked=False)
def test_base64_encoded_body_on_request(
sqlite3_dsn, mock_ws_connect_event, mock_ws_send_event, mock_websocket_app
):
handler = Mangum(mock_websocket_app, dsn=sqlite3_dsn)
response = handler(mock_ws_connect_event, {})
assert response == {"statusCode": 200}
mock_ws_send_event["body"] = b"bWFuZ3Vt="
mock_ws_send_event["isBase64Encoded"] = True
response = handler(mock_ws_send_event, {})
assert response == {"statusCode": 200}
def test_binary_response(
sqlite3_dsn, mock_ws_connect_event, mock_ws_send_event, mock_websocket_app
):
async def app(scope, receive, send):
if scope["type"] == "websocket":
while True:
message = await receive()
if message["type"] == "websocket.connect":
await send({"type": "websocket.accept"})
elif message["type"] == "websocket.receive":
await send({"type": "websocket.send", "body": b"bWFuZ3Vt="})
handler = Mangum(app, dsn=sqlite3_dsn)
response = handler(mock_ws_connect_event, {})
assert response == {"statusCode": 200}
response = handler(mock_ws_send_event, {})
assert response == {"statusCode": 500}
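The fixtures `mock_websocket_app`, `mock_ws_connect_event`, and friends are defined elsewhere in the test suite. As a hypothetical sketch of the kind of app those fixtures wrap (not the fixture itself), here is a minimal accept-and-echo ASGI websocket callable, driven directly with hand-rolled receive/send so it runs without Mangum or AWS events:

```python
import asyncio

async def echo_app(scope, receive, send):
    # Minimal ASGI websocket app: accept the handshake, echo text frames.
    if scope["type"] == "websocket":
        while True:
            message = await receive()
            if message["type"] == "websocket.connect":
                await send({"type": "websocket.accept"})
            elif message["type"] == "websocket.receive":
                await send({"type": "websocket.send", "text": message.get("text", "")})
            elif message["type"] == "websocket.disconnect":
                break

async def drive(app, incoming):
    """Feed a scripted message list to an ASGI app and collect what it sends."""
    queue, sent = list(incoming), []

    async def receive():
        return queue.pop(0)

    async def send(message):
        sent.append(message)

    await app({"type": "websocket"}, receive, send)
    return sent

messages = asyncio.run(drive(echo_app, [
    {"type": "websocket.connect"},
    {"type": "websocket.receive", "text": "ping"},
    {"type": "websocket.disconnect"},
]))
# messages == [{"type": "websocket.accept"}, {"type": "websocket.send", "text": "ping"}]
```

Mangum's role in the tests above is exactly this driving step: it translates API Gateway websocket events into the ASGI connect/receive messages and maps the app's replies back to Lambda responses.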
ec1e7d2eebb9831621d9b5694b8669934dd8a51f | 71,413 | py | Python | nidaqmx/_task_modules/triggering/reference_trigger.py | stafak/nidaqmx-python | f354d7971b21074c120c6f298dbbf4a5e0e4f4f4 | [
"MIT"
] | 252 | 2017-03-22T02:43:16.000Z | 2022-03-27T14:44:44.000Z | nidaqmx/_task_modules/triggering/reference_trigger.py | stafak/nidaqmx-python | f354d7971b21074c120c6f298dbbf4a5e0e4f4f4 | [
"MIT"
] | 133 | 2017-03-21T20:57:59.000Z | 2022-03-31T16:08:12.000Z | nidaqmx/_task_modules/triggering/reference_trigger.py | stafak/nidaqmx-python | f354d7971b21074c120c6f298dbbf4a5e0e4f4f4 | [
"MIT"
] | 124 | 2017-04-01T18:35:24.000Z | 2022-03-25T06:30:00.000Z | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import ctypes
import numpy
from nidaqmx._lib import (
lib_importer, wrapped_ndpointer, ctypes_byte_str, c_bool32)
from nidaqmx.system.physical_channel import PhysicalChannel
from nidaqmx.errors import (
check_for_error, is_string_buffer_too_small, is_array_buffer_too_small)
from nidaqmx.constants import (
Coupling, DigitalPatternCondition, Edge, Slope, TriggerType,
WindowTriggerCondition1)
class ReferenceTrigger(object):
"""
Represents the reference trigger configurations for a DAQmx task.
"""
def __init__(self, task_handle):
self._handle = task_handle
@property
def anlg_edge_coupling(self):
"""
:class:`nidaqmx.constants.Coupling`: Specifies the coupling for
the source signal of the trigger if the source is a terminal
rather than a virtual channel.
"""
val = ctypes.c_int()
cfunc = lib_importer.windll.DAQmxGetAnlgEdgeRefTrigCoupling
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(ctypes.c_int)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return Coupling(val.value)
@anlg_edge_coupling.setter
def anlg_edge_coupling(self, val):
val = val.value
cfunc = lib_importer.windll.DAQmxSetAnlgEdgeRefTrigCoupling
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_int]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_edge_coupling.deleter
def anlg_edge_coupling(self):
cfunc = lib_importer.windll.DAQmxResetAnlgEdgeRefTrigCoupling
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_edge_dig_fltr_enable(self):
"""
bool: Specifies whether to apply a digital filter to the digital
output of the analog triggering circuitry (the Analog
Comparison Event). When enabled, the analog signal must stay
above or below the trigger level for the minimum pulse width
before being recognized. Use filtering for noisy trigger
signals that transition in and out of the hysteresis window
rapidly.
"""
val = c_bool32()
cfunc = lib_importer.windll.DAQmxGetAnlgEdgeRefTrigDigFltrEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(c_bool32)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_edge_dig_fltr_enable.setter
def anlg_edge_dig_fltr_enable(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgEdgeRefTrigDigFltrEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, c_bool32]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_edge_dig_fltr_enable.deleter
def anlg_edge_dig_fltr_enable(self):
cfunc = lib_importer.windll.DAQmxResetAnlgEdgeRefTrigDigFltrEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_edge_dig_fltr_min_pulse_width(self):
"""
float: Specifies in seconds the minimum pulse width the filter
recognizes.
"""
val = ctypes.c_double()
cfunc = (lib_importer.windll.
DAQmxGetAnlgEdgeRefTrigDigFltrMinPulseWidth)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_edge_dig_fltr_min_pulse_width.setter
def anlg_edge_dig_fltr_min_pulse_width(self, val):
cfunc = (lib_importer.windll.
DAQmxSetAnlgEdgeRefTrigDigFltrMinPulseWidth)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_edge_dig_fltr_min_pulse_width.deleter
def anlg_edge_dig_fltr_min_pulse_width(self):
cfunc = (lib_importer.windll.
DAQmxResetAnlgEdgeRefTrigDigFltrMinPulseWidth)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_edge_dig_fltr_timebase_rate(self):
"""
float: Specifies in hertz the rate of the digital filter
timebase. NI-DAQmx uses this value to compute settings for
the filter.
"""
val = ctypes.c_double()
cfunc = (lib_importer.windll.
DAQmxGetAnlgEdgeRefTrigDigFltrTimebaseRate)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_edge_dig_fltr_timebase_rate.setter
def anlg_edge_dig_fltr_timebase_rate(self, val):
cfunc = (lib_importer.windll.
DAQmxSetAnlgEdgeRefTrigDigFltrTimebaseRate)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_edge_dig_fltr_timebase_rate.deleter
def anlg_edge_dig_fltr_timebase_rate(self):
cfunc = (lib_importer.windll.
DAQmxResetAnlgEdgeRefTrigDigFltrTimebaseRate)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_edge_dig_fltr_timebase_src(self):
"""
str: Specifies the terminal of the signal to use as the timebase
of the digital filter.
"""
cfunc = (lib_importer.windll.
DAQmxGetAnlgEdgeRefTrigDigFltrTimebaseSrc)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_char_p,
ctypes.c_uint]
temp_size = 0
while True:
val = ctypes.create_string_buffer(temp_size)
size_or_code = cfunc(
self._handle, val, temp_size)
if is_string_buffer_too_small(size_or_code):
# Buffer size must have changed between calls; check again.
temp_size = 0
elif size_or_code > 0 and temp_size == 0:
# Buffer size obtained, use to retrieve data.
temp_size = size_or_code
else:
break
check_for_error(size_or_code)
return val.value.decode('ascii')
@anlg_edge_dig_fltr_timebase_src.setter
def anlg_edge_dig_fltr_timebase_src(self, val):
cfunc = (lib_importer.windll.
DAQmxSetAnlgEdgeRefTrigDigFltrTimebaseSrc)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_edge_dig_fltr_timebase_src.deleter
def anlg_edge_dig_fltr_timebase_src(self):
cfunc = (lib_importer.windll.
DAQmxResetAnlgEdgeRefTrigDigFltrTimebaseSrc)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
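The string getters all use a two-call protocol: call once with a zero-length buffer so the driver reports the required size, then call again with a buffer of that size. A self-contained sketch with a fake C getter (`fake_get_string` is a hypothetical stand-in for the real DAQmx call):

```python
import ctypes

DATA = b"PFI0"

def fake_get_string(handle, buf, size):
    """Mimics a DAQmx string getter: with size 0 it returns the
    required buffer length; otherwise it fills the buffer and
    returns 0 (hypothetical stand-in for the real C call)."""
    needed = len(DATA) + 1          # room for the NUL terminator
    if size == 0:
        return needed               # positive value = required size
    ctypes.memmove(buf, DATA, len(DATA))
    return 0

def read_string_property(cfunc, handle):
    temp_size = 0
    while True:
        val = ctypes.create_string_buffer(temp_size)
        size_or_code = cfunc(handle, val, temp_size)
        if size_or_code > 0 and temp_size == 0:
            # First call reported the needed size; call again.
            temp_size = size_or_code
        else:
            break
    return val.value.decode('ascii')
```

The real accessors additionally re-check `is_string_buffer_too_small` and reset `temp_size` to 0, in case the stored string grows between the sizing call and the retrieval call.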
@property
def anlg_edge_dig_sync_enable(self):
"""
bool: Specifies whether to synchronize recognition of
transitions in the signal to the internal timebase of the
device.
"""
val = c_bool32()
cfunc = lib_importer.windll.DAQmxGetAnlgEdgeRefTrigDigSyncEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(c_bool32)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_edge_dig_sync_enable.setter
def anlg_edge_dig_sync_enable(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgEdgeRefTrigDigSyncEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, c_bool32]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_edge_dig_sync_enable.deleter
def anlg_edge_dig_sync_enable(self):
cfunc = lib_importer.windll.DAQmxResetAnlgEdgeRefTrigDigSyncEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_edge_hyst(self):
"""
float: Specifies a hysteresis level in the units of the
measurement. If **anlg_edge_slope** is **Slope1.RISING**,
the trigger does not deassert until the source signal passes
below **anlg_edge_lvl** minus the hysteresis. If
**anlg_edge_slope** is **Slope1.FALLING**, the trigger does
not deassert until the source signal passes above
**anlg_edge_lvl** plus the hysteresis. Hysteresis is always
enabled. Set this property to a non-zero value to use
hysteresis.
"""
val = ctypes.c_double()
cfunc = lib_importer.windll.DAQmxGetAnlgEdgeRefTrigHyst
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_edge_hyst.setter
def anlg_edge_hyst(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgEdgeRefTrigHyst
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_edge_hyst.deleter
def anlg_edge_hyst(self):
cfunc = lib_importer.windll.DAQmxResetAnlgEdgeRefTrigHyst
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
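The docstring's deassert rule can be made concrete with a small helper (illustrative only; nidaqmx itself exposes no such function):

```python
def deassert_threshold(level, hysteresis, slope):
    """Level the source signal must cross for the trigger to
    deassert, per the hysteresis rules described above.
    slope is 'rising' or 'falling'."""
    if slope == 'rising':
        # Rising-slope trigger: deasserts below level - hysteresis.
        return level - hysteresis
    elif slope == 'falling':
        # Falling-slope trigger: deasserts above level + hysteresis.
        return level + hysteresis
    raise ValueError(slope)
```

So with a 2.0 V rising-edge level and 0.5 V of hysteresis, the signal must drop below 1.5 V before the trigger can rearm.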
@property
def anlg_edge_lvl(self):
"""
float: Specifies in the units of the measurement the threshold
at which the Reference Trigger occurs. Use
**anlg_edge_slope** to specify on which slope to trigger at
this threshold.
"""
val = ctypes.c_double()
cfunc = lib_importer.windll.DAQmxGetAnlgEdgeRefTrigLvl
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_edge_lvl.setter
def anlg_edge_lvl(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgEdgeRefTrigLvl
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_edge_lvl.deleter
def anlg_edge_lvl(self):
cfunc = lib_importer.windll.DAQmxResetAnlgEdgeRefTrigLvl
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_edge_slope(self):
"""
:class:`nidaqmx.constants.Slope`: Specifies on which slope of
the source signal the Reference Trigger occurs.
"""
val = ctypes.c_int()
cfunc = lib_importer.windll.DAQmxGetAnlgEdgeRefTrigSlope
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(ctypes.c_int)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return Slope(val.value)
@anlg_edge_slope.setter
def anlg_edge_slope(self, val):
val = val.value
cfunc = lib_importer.windll.DAQmxSetAnlgEdgeRefTrigSlope
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_int]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_edge_slope.deleter
def anlg_edge_slope(self):
cfunc = lib_importer.windll.DAQmxResetAnlgEdgeRefTrigSlope
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
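The enum-typed accessors follow a consistent convention: setters unwrap the enum with `.value` before calling into the driver, and getters rewrap the raw int with the enum constructor. Sketched below with a stand-in `Slope` enum (the numeric values are assumed to match the DAQmx C constants; illustrative only):

```python
from enum import Enum

class Slope(Enum):
    """Stand-in for nidaqmx.constants.Slope; numeric values are
    assumed to mirror the DAQmx C constants."""
    RISING = 10280
    FALLING = 10171

def to_driver(enum_val):
    # What a setter passes down to the C layer.
    return enum_val.value

def from_driver(raw):
    # What a getter does with the int read back from the driver.
    return Slope(raw)
```

The round trip `from_driver(to_driver(x)) is x` is what lets Python callers work purely with enum members while the C API only ever sees plain integers.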
@property
def anlg_edge_src(self):
"""
str: Specifies the name of a virtual channel or terminal where
there is an analog signal to use as the source of the
Reference Trigger.
"""
cfunc = lib_importer.windll.DAQmxGetAnlgEdgeRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_char_p,
ctypes.c_uint]
temp_size = 0
while True:
val = ctypes.create_string_buffer(temp_size)
size_or_code = cfunc(
self._handle, val, temp_size)
if is_string_buffer_too_small(size_or_code):
# Buffer size must have changed between calls; check again.
temp_size = 0
elif size_or_code > 0 and temp_size == 0:
# Buffer size obtained, use to retrieve data.
temp_size = size_or_code
else:
break
check_for_error(size_or_code)
return val.value.decode('ascii')
@anlg_edge_src.setter
def anlg_edge_src(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgEdgeRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_edge_src.deleter
def anlg_edge_src(self):
cfunc = lib_importer.windll.DAQmxResetAnlgEdgeRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_win_btm(self):
"""
float: Specifies the lower limit of the window. Specify this
value in the units of the measurement.
"""
val = ctypes.c_double()
cfunc = lib_importer.windll.DAQmxGetAnlgWinRefTrigBtm
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_win_btm.setter
def anlg_win_btm(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgWinRefTrigBtm
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_win_btm.deleter
def anlg_win_btm(self):
cfunc = lib_importer.windll.DAQmxResetAnlgWinRefTrigBtm
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_win_coupling(self):
"""
:class:`nidaqmx.constants.Coupling`: Specifies the coupling for
the source signal of the trigger if the source is a terminal
rather than a virtual channel.
"""
val = ctypes.c_int()
cfunc = lib_importer.windll.DAQmxGetAnlgWinRefTrigCoupling
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(ctypes.c_int)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return Coupling(val.value)
@anlg_win_coupling.setter
def anlg_win_coupling(self, val):
val = val.value
cfunc = lib_importer.windll.DAQmxSetAnlgWinRefTrigCoupling
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_int]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_win_coupling.deleter
def anlg_win_coupling(self):
cfunc = lib_importer.windll.DAQmxResetAnlgWinRefTrigCoupling
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_win_dig_fltr_enable(self):
"""
bool: Specifies whether to apply a digital filter to the digital
output of the analog triggering circuitry (the Analog
Comparison Event). When enabled, the analog signal must stay
within the trigger window for the minimum pulse width before
being recognized. Use filtering for noisy trigger signals
that transition in and out of the window rapidly.
"""
val = c_bool32()
cfunc = lib_importer.windll.DAQmxGetAnlgWinRefTrigDigFltrEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(c_bool32)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_win_dig_fltr_enable.setter
def anlg_win_dig_fltr_enable(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgWinRefTrigDigFltrEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, c_bool32]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_win_dig_fltr_enable.deleter
def anlg_win_dig_fltr_enable(self):
cfunc = lib_importer.windll.DAQmxResetAnlgWinRefTrigDigFltrEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_win_dig_fltr_min_pulse_width(self):
"""
float: Specifies in seconds the minimum pulse width the filter
recognizes.
"""
val = ctypes.c_double()
cfunc = (lib_importer.windll.
DAQmxGetAnlgWinRefTrigDigFltrMinPulseWidth)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_win_dig_fltr_min_pulse_width.setter
def anlg_win_dig_fltr_min_pulse_width(self, val):
cfunc = (lib_importer.windll.
DAQmxSetAnlgWinRefTrigDigFltrMinPulseWidth)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_win_dig_fltr_min_pulse_width.deleter
def anlg_win_dig_fltr_min_pulse_width(self):
cfunc = (lib_importer.windll.
DAQmxResetAnlgWinRefTrigDigFltrMinPulseWidth)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_win_dig_fltr_timebase_rate(self):
"""
float: Specifies in hertz the rate of the digital filter
timebase. NI-DAQmx uses this value to compute settings for
the filter.
"""
val = ctypes.c_double()
cfunc = (lib_importer.windll.
DAQmxGetAnlgWinRefTrigDigFltrTimebaseRate)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_win_dig_fltr_timebase_rate.setter
def anlg_win_dig_fltr_timebase_rate(self, val):
cfunc = (lib_importer.windll.
DAQmxSetAnlgWinRefTrigDigFltrTimebaseRate)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_win_dig_fltr_timebase_rate.deleter
def anlg_win_dig_fltr_timebase_rate(self):
cfunc = (lib_importer.windll.
DAQmxResetAnlgWinRefTrigDigFltrTimebaseRate)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_win_dig_fltr_timebase_src(self):
"""
str: Specifies the terminal of the signal to use as the timebase
of the digital filter.
"""
cfunc = lib_importer.windll.DAQmxGetAnlgWinRefTrigDigFltrTimebaseSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_char_p,
ctypes.c_uint]
temp_size = 0
while True:
val = ctypes.create_string_buffer(temp_size)
size_or_code = cfunc(
self._handle, val, temp_size)
if is_string_buffer_too_small(size_or_code):
# Buffer size must have changed between calls; check again.
temp_size = 0
elif size_or_code > 0 and temp_size == 0:
# Buffer size obtained, use to retrieve data.
temp_size = size_or_code
else:
break
check_for_error(size_or_code)
return val.value.decode('ascii')
@anlg_win_dig_fltr_timebase_src.setter
def anlg_win_dig_fltr_timebase_src(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgWinRefTrigDigFltrTimebaseSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_win_dig_fltr_timebase_src.deleter
def anlg_win_dig_fltr_timebase_src(self):
cfunc = lib_importer.windll.DAQmxResetAnlgWinRefTrigDigFltrTimebaseSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_win_dig_sync_enable(self):
"""
bool: Specifies whether to synchronize recognition of
transitions in the signal to the internal timebase of the
device.
"""
val = c_bool32()
cfunc = lib_importer.windll.DAQmxGetAnlgWinRefTrigDigSyncEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(c_bool32)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_win_dig_sync_enable.setter
def anlg_win_dig_sync_enable(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgWinRefTrigDigSyncEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, c_bool32]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_win_dig_sync_enable.deleter
def anlg_win_dig_sync_enable(self):
cfunc = lib_importer.windll.DAQmxResetAnlgWinRefTrigDigSyncEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_win_src(self):
"""
str: Specifies the name of a virtual channel or terminal where
there is an analog signal to use as the source of the
Reference Trigger.
"""
cfunc = lib_importer.windll.DAQmxGetAnlgWinRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_char_p,
ctypes.c_uint]
temp_size = 0
while True:
val = ctypes.create_string_buffer(temp_size)
size_or_code = cfunc(
self._handle, val, temp_size)
if is_string_buffer_too_small(size_or_code):
# Buffer size must have changed between calls; check again.
temp_size = 0
elif size_or_code > 0 and temp_size == 0:
# Buffer size obtained, use to retrieve data.
temp_size = size_or_code
else:
break
check_for_error(size_or_code)
return val.value.decode('ascii')
@anlg_win_src.setter
def anlg_win_src(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgWinRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_win_src.deleter
def anlg_win_src(self):
cfunc = lib_importer.windll.DAQmxResetAnlgWinRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_win_top(self):
"""
float: Specifies the upper limit of the window. Specify this
value in the units of the measurement.
"""
val = ctypes.c_double()
cfunc = lib_importer.windll.DAQmxGetAnlgWinRefTrigTop
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@anlg_win_top.setter
def anlg_win_top(self, val):
cfunc = lib_importer.windll.DAQmxSetAnlgWinRefTrigTop
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_win_top.deleter
def anlg_win_top(self):
cfunc = lib_importer.windll.DAQmxResetAnlgWinRefTrigTop
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def anlg_win_trig_when(self):
"""
:class:`nidaqmx.constants.WindowTriggerCondition1`: Specifies
whether the Reference Trigger occurs when the source signal
enters the window or when it leaves the window. Use
**anlg_win_btm** and **anlg_win_top** to specify the window.
"""
val = ctypes.c_int()
cfunc = lib_importer.windll.DAQmxGetAnlgWinRefTrigTrigWhen
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(ctypes.c_int)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return WindowTriggerCondition1(val.value)
@anlg_win_trig_when.setter
def anlg_win_trig_when(self, val):
val = val.value
cfunc = lib_importer.windll.DAQmxSetAnlgWinRefTrigTrigWhen
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_int]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@anlg_win_trig_when.deleter
def anlg_win_trig_when(self):
cfunc = lib_importer.windll.DAQmxResetAnlgWinRefTrigTrigWhen
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def auto_trig_enable(self):
"""
bool: Specifies whether to send a software trigger to the device
when a hardware trigger is no longer active in order to
prevent a timeout.
"""
val = c_bool32()
cfunc = lib_importer.windll.DAQmxGetRefTrigAutoTrigEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(c_bool32)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@auto_trig_enable.setter
def auto_trig_enable(self, val):
cfunc = lib_importer.windll.DAQmxSetRefTrigAutoTrigEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, c_bool32]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@auto_trig_enable.deleter
def auto_trig_enable(self):
cfunc = lib_importer.windll.DAQmxResetRefTrigAutoTrigEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def auto_triggered(self):
"""
bool: Indicates whether a completed acquisition was triggered by
the auto trigger. If an acquisition has not completed after
the task starts, this property returns False. This property
is only applicable when **auto_trig_enable** is True.
"""
val = c_bool32()
cfunc = lib_importer.windll.DAQmxGetRefTrigAutoTriggered
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(c_bool32)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@property
def delay(self):
"""
float: Specifies in seconds the time to wait after the device
receives the Reference Trigger before switching from
pretrigger to posttrigger samples.
"""
val = ctypes.c_double()
cfunc = lib_importer.windll.DAQmxGetRefTrigDelay
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@delay.setter
def delay(self, val):
cfunc = lib_importer.windll.DAQmxSetRefTrigDelay
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@delay.deleter
def delay(self):
cfunc = lib_importer.windll.DAQmxResetRefTrigDelay
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def dig_edge_dig_fltr_enable(self):
"""
bool: Specifies whether to apply a digital filter to the trigger
signal.
"""
val = c_bool32()
cfunc = lib_importer.windll.DAQmxGetDigEdgeRefTrigDigFltrEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(c_bool32)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@dig_edge_dig_fltr_enable.setter
def dig_edge_dig_fltr_enable(self, val):
cfunc = lib_importer.windll.DAQmxSetDigEdgeRefTrigDigFltrEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, c_bool32]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@dig_edge_dig_fltr_enable.deleter
def dig_edge_dig_fltr_enable(self):
cfunc = lib_importer.windll.DAQmxResetDigEdgeRefTrigDigFltrEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def dig_edge_dig_fltr_min_pulse_width(self):
"""
float: Specifies in seconds the minimum pulse width the filter
recognizes.
"""
val = ctypes.c_double()
cfunc = (lib_importer.windll.
DAQmxGetDigEdgeRefTrigDigFltrMinPulseWidth)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@dig_edge_dig_fltr_min_pulse_width.setter
def dig_edge_dig_fltr_min_pulse_width(self, val):
cfunc = (lib_importer.windll.
DAQmxSetDigEdgeRefTrigDigFltrMinPulseWidth)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@dig_edge_dig_fltr_min_pulse_width.deleter
def dig_edge_dig_fltr_min_pulse_width(self):
cfunc = (lib_importer.windll.
DAQmxResetDigEdgeRefTrigDigFltrMinPulseWidth)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def dig_edge_dig_fltr_timebase_rate(self):
"""
float: Specifies in hertz the rate of the digital filter
timebase. NI-DAQmx uses this value to compute settings for
the filter.
"""
val = ctypes.c_double()
cfunc = (lib_importer.windll.
DAQmxGetDigEdgeRefTrigDigFltrTimebaseRate)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_double)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@dig_edge_dig_fltr_timebase_rate.setter
def dig_edge_dig_fltr_timebase_rate(self, val):
cfunc = (lib_importer.windll.
DAQmxSetDigEdgeRefTrigDigFltrTimebaseRate)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_double]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@dig_edge_dig_fltr_timebase_rate.deleter
def dig_edge_dig_fltr_timebase_rate(self):
cfunc = (lib_importer.windll.
DAQmxResetDigEdgeRefTrigDigFltrTimebaseRate)
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def dig_edge_dig_fltr_timebase_src(self):
"""
str: Specifies the terminal of the signal to use as the timebase
of the digital filter.
"""
cfunc = lib_importer.windll.DAQmxGetDigEdgeRefTrigDigFltrTimebaseSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_char_p,
ctypes.c_uint]
temp_size = 0
while True:
val = ctypes.create_string_buffer(temp_size)
size_or_code = cfunc(
self._handle, val, temp_size)
if is_string_buffer_too_small(size_or_code):
# Buffer size must have changed between calls; check again.
temp_size = 0
elif size_or_code > 0 and temp_size == 0:
# Buffer size obtained, use to retrieve data.
temp_size = size_or_code
else:
break
check_for_error(size_or_code)
return val.value.decode('ascii')
@dig_edge_dig_fltr_timebase_src.setter
def dig_edge_dig_fltr_timebase_src(self, val):
cfunc = lib_importer.windll.DAQmxSetDigEdgeRefTrigDigFltrTimebaseSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@dig_edge_dig_fltr_timebase_src.deleter
def dig_edge_dig_fltr_timebase_src(self):
cfunc = lib_importer.windll.DAQmxResetDigEdgeRefTrigDigFltrTimebaseSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def dig_edge_dig_sync_enable(self):
"""
bool: Specifies whether to synchronize recognition of
transitions in the signal to the internal timebase of the
device.
"""
val = c_bool32()
cfunc = lib_importer.windll.DAQmxGetDigEdgeRefTrigDigSyncEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(c_bool32)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@dig_edge_dig_sync_enable.setter
def dig_edge_dig_sync_enable(self, val):
cfunc = lib_importer.windll.DAQmxSetDigEdgeRefTrigDigSyncEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, c_bool32]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@dig_edge_dig_sync_enable.deleter
def dig_edge_dig_sync_enable(self):
cfunc = lib_importer.windll.DAQmxResetDigEdgeRefTrigDigSyncEnable
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def dig_edge_edge(self):
"""
:class:`nidaqmx.constants.Edge`: Specifies on what edge of a
digital pulse the Reference Trigger occurs.
"""
val = ctypes.c_int()
cfunc = lib_importer.windll.DAQmxGetDigEdgeRefTrigEdge
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(ctypes.c_int)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return Edge(val.value)
@dig_edge_edge.setter
def dig_edge_edge(self, val):
val = val.value
cfunc = lib_importer.windll.DAQmxSetDigEdgeRefTrigEdge
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_int]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@dig_edge_edge.deleter
def dig_edge_edge(self):
cfunc = lib_importer.windll.DAQmxResetDigEdgeRefTrigEdge
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
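The `Edge` setter above unwraps the enum (`val = val.value`) before handing it to the C call, and the getter re-wraps the raw integer (`return Edge(val.value)`). A minimal standalone sketch of that round-trip; the numeric values mirror the usual DAQmx constants but are included here only for illustration:

```python
from enum import Enum

class Edge(Enum):
    RISING = 10280   # mirrors DAQmx_Val_Rising
    FALLING = 10171  # mirrors DAQmx_Val_Falling

# Setter path: unwrap the enum to the raw int the C API expects.
raw = Edge.FALLING.value
# Getter path: re-wrap the raw int read back from the driver.
assert Edge(raw) is Edge.FALLING
```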
@property
def dig_edge_src(self):
"""
str: Specifies the name of a terminal where there is a digital
signal to use as the source of the Reference Trigger.
"""
cfunc = lib_importer.windll.DAQmxGetDigEdgeRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_char_p,
ctypes.c_uint]
temp_size = 0
while True:
val = ctypes.create_string_buffer(temp_size)
size_or_code = cfunc(
self._handle, val, temp_size)
if is_string_buffer_too_small(size_or_code):
# Buffer size must have changed between calls; check again.
temp_size = 0
elif size_or_code > 0 and temp_size == 0:
# Buffer size obtained, use to retrieve data.
temp_size = size_or_code
else:
break
check_for_error(size_or_code)
return val.value.decode('ascii')
@dig_edge_src.setter
def dig_edge_src(self, val):
cfunc = lib_importer.windll.DAQmxSetDigEdgeRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@dig_edge_src.deleter
def dig_edge_src(self):
cfunc = lib_importer.windll.DAQmxResetDigEdgeRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def dig_pattern_pattern(self):
"""
str: Specifies the digital pattern that must be met for the
Reference Trigger to occur.
"""
cfunc = lib_importer.windll.DAQmxGetDigPatternRefTrigPattern
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_char_p,
ctypes.c_uint]
temp_size = 0
while True:
val = ctypes.create_string_buffer(temp_size)
size_or_code = cfunc(
self._handle, val, temp_size)
if is_string_buffer_too_small(size_or_code):
# Buffer size must have changed between calls; check again.
temp_size = 0
elif size_or_code > 0 and temp_size == 0:
# Buffer size obtained, use to retrieve data.
temp_size = size_or_code
else:
break
check_for_error(size_or_code)
return val.value.decode('ascii')
@dig_pattern_pattern.setter
def dig_pattern_pattern(self, val):
cfunc = lib_importer.windll.DAQmxSetDigPatternRefTrigPattern
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@dig_pattern_pattern.deleter
def dig_pattern_pattern(self):
cfunc = lib_importer.windll.DAQmxResetDigPatternRefTrigPattern
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def dig_pattern_src(self):
"""
:class:`nidaqmx.system.physical_channel.PhysicalChannel`:
Specifies the physical channels to use for pattern matching.
The order of the physical channels determines the order of
the pattern. If a port is included, the order of the
physical channels within the port is in ascending order.
"""
cfunc = lib_importer.windll.DAQmxGetDigPatternRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_char_p,
ctypes.c_uint]
temp_size = 0
while True:
val = ctypes.create_string_buffer(temp_size)
size_or_code = cfunc(
self._handle, val, temp_size)
if is_string_buffer_too_small(size_or_code):
# Buffer size must have changed between calls; check again.
temp_size = 0
elif size_or_code > 0 and temp_size == 0:
# Buffer size obtained, use to retrieve data.
temp_size = size_or_code
else:
break
check_for_error(size_or_code)
return PhysicalChannel(val.value.decode('ascii'))
@dig_pattern_src.setter
def dig_pattern_src(self, val):
val = val.name
cfunc = lib_importer.windll.DAQmxSetDigPatternRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@dig_pattern_src.deleter
def dig_pattern_src(self):
cfunc = lib_importer.windll.DAQmxResetDigPatternRefTrigSrc
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def dig_pattern_trig_when(self):
"""
:class:`nidaqmx.constants.DigitalPatternCondition`: Specifies
whether the Reference Trigger occurs when the physical
channels specified with **dig_pattern_src** match or differ
from the digital pattern specified with
**dig_pattern_pattern**.
"""
val = ctypes.c_int()
cfunc = lib_importer.windll.DAQmxGetDigPatternRefTrigTrigWhen
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(ctypes.c_int)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return DigitalPatternCondition(val.value)
@dig_pattern_trig_when.setter
def dig_pattern_trig_when(self, val):
val = val.value
cfunc = lib_importer.windll.DAQmxSetDigPatternRefTrigTrigWhen
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_int]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@dig_pattern_trig_when.deleter
def dig_pattern_trig_when(self):
cfunc = lib_importer.windll.DAQmxResetDigPatternRefTrigTrigWhen
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def pretrig_samples(self):
"""
int: Specifies the minimum number of pretrigger samples to
acquire from each channel before recognizing the reference
trigger. Post-trigger samples per channel are equal to
**samp_quant_samp_per_chan** minus the number of pretrigger
samples per channel.
"""
val = ctypes.c_uint()
cfunc = lib_importer.windll.DAQmxGetRefTrigPreTrigSamples
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle,
ctypes.POINTER(ctypes.c_uint)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return val.value
@pretrig_samples.setter
def pretrig_samples(self, val):
cfunc = lib_importer.windll.DAQmxSetRefTrigPreTrigSamples
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_uint]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@pretrig_samples.deleter
def pretrig_samples(self):
cfunc = lib_importer.windll.DAQmxResetRefTrigPreTrigSamples
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
@property
def term(self):
"""
str: Indicates the name of the internal Reference Trigger
terminal for the task. This property does not return the
name of the trigger source terminal.
"""
cfunc = lib_importer.windll.DAQmxGetRefTrigTerm
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_char_p,
ctypes.c_uint]
temp_size = 0
while True:
val = ctypes.create_string_buffer(temp_size)
size_or_code = cfunc(
self._handle, val, temp_size)
if is_string_buffer_too_small(size_or_code):
# Buffer size must have changed between calls; check again.
temp_size = 0
elif size_or_code > 0 and temp_size == 0:
# Buffer size obtained, use to retrieve data.
temp_size = size_or_code
else:
break
check_for_error(size_or_code)
return val.value.decode('ascii')
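Every string getter above uses the same two-call sizing idiom: call once with a zero-length buffer so the driver reports the required size, then call again with a buffer of that size. A self-contained sketch of that loop, with `fake_get_string` as a hypothetical stand-in for the C getter (positive return = required size, zero = success):

```python
import ctypes

def fake_get_string(handle, buf, size):
    # Hypothetical stand-in for a DAQmx string getter.
    payload = b'/Dev1/ai/ReferenceTrigger'
    needed = len(payload) + 1  # room for the NUL terminator
    if size < needed:
        return needed  # buffer too small: report the size required
    ctypes.memmove(buf, payload, len(payload))
    return 0  # success

def read_string(handle):
    temp_size = 0
    while True:
        val = ctypes.create_string_buffer(temp_size)
        size_or_code = fake_get_string(handle, val, temp_size)
        if size_or_code > 0 and temp_size == 0:
            temp_size = size_or_code  # size obtained; retry with a real buffer
        else:
            break
    return val.value.decode('ascii')

assert read_string(None) == '/Dev1/ai/ReferenceTrigger'
```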
@property
def trig_type(self):
"""
:class:`nidaqmx.constants.TriggerType`: Specifies the type of
trigger to use to mark a reference point for the
measurement.
"""
val = ctypes.c_int()
cfunc = lib_importer.windll.DAQmxGetRefTrigType
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.POINTER(ctypes.c_int)]
error_code = cfunc(
self._handle, ctypes.byref(val))
check_for_error(error_code)
return TriggerType(val.value)
@trig_type.setter
def trig_type(self, val):
val = val.value
cfunc = lib_importer.windll.DAQmxSetRefTrigType
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes.c_int]
error_code = cfunc(
self._handle, val)
check_for_error(error_code)
@trig_type.deleter
def trig_type(self):
cfunc = lib_importer.windll.DAQmxResetRefTrigType
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
def cfg_anlg_edge_ref_trig(
self, trigger_source, pretrigger_samples,
trigger_slope=Slope.RISING, trigger_level=0.0):
"""
Configures the task to stop the acquisition when the device
acquires all pretrigger samples; an analog signal reaches the
level you specify; and the device acquires all post-trigger
samples. When you use a Reference Trigger, the default for the
read RelativeTo property is **first_pretrigger_sample** with a
read Offset of 0.
Args:
trigger_source (str): Is the name of a virtual channel or
terminal where there is an analog signal to use as the
source of the trigger.
pretrigger_samples (int): Specifies the minimum number of
samples to acquire per channel before recognizing the
Reference Trigger. The number of post-trigger samples
per channel is equal to **number of samples per
channel** in the DAQmx Timing function minus
**pretrigger_samples**.
trigger_slope (Optional[nidaqmx.constants.Slope]): Specifies
on which slope of the signal the Reference Trigger
occurs.
trigger_level (Optional[float]): Specifies at what threshold
to trigger. Specify this value in the units of the
measurement or generation. Use **trigger_slope** to
specify on which slope to trigger at this threshold.
"""
cfunc = lib_importer.windll.DAQmxCfgAnlgEdgeRefTrig
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str,
ctypes.c_int, ctypes.c_double, ctypes.c_uint]
error_code = cfunc(
self._handle, trigger_source, trigger_slope.value, trigger_level,
pretrigger_samples)
check_for_error(error_code)
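The docstring's sample-accounting rule — post-trigger samples per channel equal the timing function's samples per channel minus `pretrigger_samples` — reduces to a one-line computation; a small sketch with an added sanity check (the bounds check is an illustrative addition, not part of the wrapped API):

```python
def post_trigger_samples(samps_per_chan, pretrigger_samples):
    # Post-trigger count = total samples per channel - pretrigger samples.
    if pretrigger_samples > samps_per_chan:
        raise ValueError("pretrigger samples cannot exceed total samples")
    return samps_per_chan - pretrigger_samples

assert post_trigger_samples(1000, 100) == 900
```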
def cfg_anlg_window_ref_trig(
self, trigger_source, window_top, window_bottom,
pretrigger_samples,
trigger_when=WindowTriggerCondition1.ENTERING_WINDOW):
"""
Configures the task to stop the acquisition when the device
acquires all pretrigger samples; an analog signal enters or
leaves a range you specify; and the device acquires all post-
trigger samples. When you use a Reference Trigger, the default
for the read RelativeTo property is **first_pretrigger_sample**
with a read Offset of 0.
Args:
trigger_source (str): Is the name of a virtual channel or
terminal where there is an analog signal to use as the
source of the trigger.
window_top (float): Is the upper limit of the window.
Specify this value in the units of the measurement or
generation.
window_bottom (float): Is the lower limit of the window.
Specify this value in the units of the measurement or
generation.
pretrigger_samples (int): Specifies the minimum number of
samples to acquire per channel before recognizing the
Reference Trigger. The number of post-trigger samples
per channel is equal to **number of samples per
channel** in the DAQmx Timing function minus
**pretrigger_samples**.
trigger_when (Optional[nidaqmx.constants.WindowTriggerCondition1]):
Specifies whether the Reference Trigger occurs when the
signal enters the window or when it leaves the window.
Use **window_bottom** and **window_top** to specify the
limits of the window.
"""
cfunc = lib_importer.windll.DAQmxCfgAnlgWindowRefTrig
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str,
ctypes.c_int, ctypes.c_double, ctypes.c_double,
ctypes.c_uint]
error_code = cfunc(
self._handle, trigger_source, trigger_when.value, window_top,
window_bottom, pretrigger_samples)
check_for_error(error_code)
def cfg_dig_edge_ref_trig(
self, trigger_source, pretrigger_samples,
trigger_edge=Edge.RISING):
"""
Configures the task to stop the acquisition when the device
acquires all pretrigger samples, detects a rising or falling
edge of a digital signal, and acquires all posttrigger samples.
When you use a Reference Trigger, the default for the read
RelativeTo property is **first_pretrigger_sample** with a read
Offset of 0.
Args:
trigger_source (str): Specifies the name of a terminal where
there is a digital signal to use as the source of the
trigger.
pretrigger_samples (int): Specifies the minimum number of
samples to acquire per channel before recognizing the
Reference Trigger. The number of post-trigger samples
per channel is equal to **number of samples per
channel** in the DAQmx Timing function minus
**pretrigger_samples**.
trigger_edge (Optional[nidaqmx.constants.Edge]): Specifies
on which edge of the digital signal the Reference
Trigger occurs.
"""
cfunc = lib_importer.windll.DAQmxCfgDigEdgeRefTrig
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str,
ctypes.c_int, ctypes.c_uint]
error_code = cfunc(
self._handle, trigger_source, trigger_edge.value,
pretrigger_samples)
check_for_error(error_code)
def cfg_dig_pattern_ref_trig(
self, trigger_source, trigger_pattern, pretrigger_samples,
trigger_when=DigitalPatternCondition.PATTERN_MATCHES):
"""
Configures the task to stop the acquisition when the device
acquires all pretrigger samples, matches a digital pattern, and
acquires all posttrigger samples. When you use a Reference
Trigger, the default for the read RelativeTo property is First
PretriggerSample with a read Offset of zero.
Args:
trigger_source (str): Specifies the physical channels to use
for pattern matching. The order of the physical channels
determines the order of the pattern. If a port is
included, the order of the physical channels within the
port is in ascending order.
trigger_pattern (str): Specifies the digital pattern that
must be met for the trigger to occur.
pretrigger_samples (int): Specifies the minimum number of
samples to acquire per channel before recognizing the
Reference Trigger. The number of post-trigger samples
per channel is equal to **number of samples per
channel** in the DAQmx Timing function minus
**pretrigger_samples**.
trigger_when (Optional[nidaqmx.constants.DigitalPatternCondition]):
Specifies the condition under which the trigger occurs.
"""
cfunc = lib_importer.windll.DAQmxCfgDigPatternRefTrig
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle, ctypes_byte_str,
ctypes_byte_str, ctypes.c_int, ctypes.c_uint]
error_code = cfunc(
self._handle, trigger_source, trigger_pattern, trigger_when.value,
pretrigger_samples)
check_for_error(error_code)
def disable_ref_trig(self):
"""
Disables reference triggering for the measurement.
"""
cfunc = lib_importer.windll.DAQmxDisableRefTrig
if cfunc.argtypes is None:
with cfunc.arglock:
if cfunc.argtypes is None:
cfunc.argtypes = [
lib_importer.task_handle]
error_code = cfunc(
self._handle)
check_for_error(error_code)
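Every wrapper above guards the one-time `argtypes` setup with the same double-checked locking idiom: an unlocked check on the hot path, then a second check under the lock. A pure-Python toy sketch of why both checks matter (names here are illustrative, not part of nidaqmx):

```python
import threading

class LazyCFunc:
    """Toy stand-in for a ctypes function pointer with lazy argtypes."""
    def __init__(self):
        self.argtypes = None
        self.arglock = threading.Lock()
        self.configure_count = 0

    def ensure_argtypes(self, argtypes):
        # Fast path: once configured, skip the lock entirely.
        if self.argtypes is None:
            with self.arglock:
                # Re-check under the lock: another thread may have
                # configured argtypes while we waited to acquire it.
                if self.argtypes is None:
                    self.configure_count += 1
                    self.argtypes = argtypes

cfunc = LazyCFunc()
for _ in range(4):
    cfunc.ensure_argtypes(['task_handle', 'c_double'])
assert cfunc.configure_count == 1
```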
from .basic_callbacks import *
from .performance import *
from .transforms import *
from .interpreters import *
try:
from .interpreters_dgl import *
from .interpreters_pyg import *
except Exception:
    # The DGL / PyG interpreter backends are optional extras; skip them
    # when their dependencies are not installed.
    pass
[
[float("NaN"), float("NaN"), 0.0, 0.0, 0.0],
[float("NaN"), float("NaN"), 0.0, 0.0, 33.33333333],
[float("NaN"), float("NaN"), 0.0, 33.33333333, 33.33333333],
[float("NaN"), float("NaN"), 33.33333333, 33.33333333, 0.0],
[float("NaN"), float("NaN"), 33.33333333, 0.0, 0.0],
[float("NaN"), float("NaN"), 33.33333333, 33.33333333, 33.33333333],
]
from c3nav.mapdata.render.geometry.hybrid import hybrid_union, HybridGeometry  # noqa
from c3nav.mapdata.render.geometry.level import LevelGeometries # noqa
from c3nav.mapdata.render.geometry.altitudearea import AltitudeAreaGeometries # noqa
#! /usr/bin/env python
# -*- coding: utf-8 -*-
"""
@version: ??
@author: li
@file: model.py
@time: 2019-08-28 16:17
"""
from sqlalchemy import Column, NUMERIC, INT
from sqlalchemy.types import DECIMAL, DATE, VARCHAR
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import BigInteger, DateTime, Float, Index, Integer, String, Text, Boolean, text, JSON
Base = declarative_base()  # generate the ORM base class
metadata = Base.metadata
class BalanceMRQ(Base):
__tablename__ = 'balance_mrq'
__table_args__ = {"useexisting": True}
ID = Column(VARCHAR(32), primary_key=True)
CHID = Column(INT)
COMPCODE = Column(VARCHAR(20))
PUBLISHDATE = Column(DATE)
ENDDATE = Column(VARCHAR(8))
REPORTTYPE = Column(VARCHAR(10))
ACCSTACODE = Column(VARCHAR(10))
REPORTYEAR = Column(VARCHAR(10))
REPORTDATETYPE = Column(VARCHAR(10))
ISAUDIT = Column(INT)
INTEGRITY = Column(VARCHAR(10))
ISREALACCSTA = Column(VARCHAR(10))
ISACORRECT = Column(INT)
DATASOURCE = Column(VARCHAR(10))
ISACTPUB = Column(INT)
CUR = Column(VARCHAR(10))
CURFDS = Column(NUMERIC(26, 2))
SETTRESEDEPO = Column(NUMERIC(26, 2))
PLAC = Column(NUMERIC(26, 2))
TRADFINASSET = Column(NUMERIC(26, 2))
DERIFINAASSET = Column(NUMERIC(26, 2))
NOTESRECE = Column(NUMERIC(26, 2))
ACCORECE = Column(NUMERIC(26, 2))
PREP = Column(NUMERIC(26, 2))
PREMRECE = Column(NUMERIC(26, 2))
REINRECE = Column(NUMERIC(26, 2))
REINCONTRESE = Column(NUMERIC(26, 2))
INTERECE = Column(NUMERIC(26, 2))
DIVIDRECE = Column(NUMERIC(26, 2))
OTHERRECE = Column(NUMERIC(26, 2))
EXPOTAXREBARECE = Column(NUMERIC(26, 2))
SUBSRECE = Column(NUMERIC(26, 2))
MARGRECE = Column(NUMERIC(26, 2))
INTELRECE = Column(NUMERIC(26, 2))
PURCRESAASSET = Column(NUMERIC(26, 2))
INVE = Column(NUMERIC(26, 2))
ACCHELDFORS = Column(NUMERIC(26, 2))
PREPEXPE = Column(NUMERIC(26, 2))
UNSEG = Column(NUMERIC(26, 2))
EXPINONCURRASSET = Column(NUMERIC(26, 2))
OTHERCURRASSE = Column(NUMERIC(26, 2))
TOTCURRASSET = Column(NUMERIC(26, 2))
LENDANDLOAN = Column(NUMERIC(26, 2))
AVAISELLASSE = Column(NUMERIC(26, 2))
HOLDINVEDUE = Column(NUMERIC(26, 2))
LONGRECE = Column(NUMERIC(26, 2))
EQUIINVE = Column(NUMERIC(26, 2))
OTHERLONGINVE = Column(NUMERIC(26, 2))
INVEPROP = Column(NUMERIC(26, 2))
FIXEDASSEIMMO = Column(NUMERIC(26, 2))
ACCUDEPR = Column(NUMERIC(26, 2))
FIXEDASSENETW = Column(NUMERIC(26, 2))
FIXEDASSEIMPA = Column(NUMERIC(26, 2))
FIXEDASSENET = Column(NUMERIC(26, 2))
CONSPROG = Column(NUMERIC(26, 2))
ENGIMATE = Column(NUMERIC(26, 2))
FIXEDASSECLEA = Column(NUMERIC(26, 2))
PRODASSE = Column(NUMERIC(26, 2))
COMASSE = Column(NUMERIC(26, 2))
HYDRASSET = Column(NUMERIC(26, 2))
INTAASSET = Column(NUMERIC(26, 2))
DEVEEXPE = Column(NUMERIC(26, 2))
GOODWILL = Column(NUMERIC(26, 2))
LOGPREPEXPE = Column(NUMERIC(26, 2))
TRADSHARTRAD = Column(NUMERIC(26, 2))
DEFETAXASSET = Column(NUMERIC(26, 2))
OTHERNONCASSE = Column(NUMERIC(26, 2))
TOTALNONCASSETS = Column(NUMERIC(26, 2))
TOTASSET = Column(NUMERIC(26, 2))
SHORTTERMBORR = Column(NUMERIC(26, 2))
CENBANKBORR = Column(NUMERIC(26, 2))
DEPOSIT = Column(NUMERIC(26, 2))
FDSBORR = Column(NUMERIC(26, 2))
TRADFINLIAB = Column(NUMERIC(26, 2))
DERILIAB = Column(NUMERIC(26, 2))
NOTESPAYA = Column(NUMERIC(26, 2))
ACCOPAYA = Column(NUMERIC(26, 2))
ADVAPAYM = Column(NUMERIC(26, 2))
SELLREPASSE = Column(NUMERIC(26, 2))
COPEPOUN = Column(NUMERIC(26, 2))
COPEWORKERSAL = Column(NUMERIC(26, 2))
TAXESPAYA = Column(NUMERIC(26, 2))
INTEPAYA = Column(NUMERIC(26, 2))
DIVIPAYA = Column(NUMERIC(26, 2))
OTHERFEEPAYA = Column(NUMERIC(26, 2))
MARGREQU = Column(NUMERIC(26, 2))
INTELPAY = Column(NUMERIC(26, 2))
OTHERPAY = Column(NUMERIC(26, 2))
ACCREXPE = Column(NUMERIC(26, 2))
EXPECURRLIAB = Column(NUMERIC(26, 2))
COPEWITHREINRECE = Column(NUMERIC(26, 2))
INSUCONTRESE = Column(NUMERIC(26, 2))
ACTITRADSECU = Column(NUMERIC(26, 2))
ACTIUNDESECU = Column(NUMERIC(26, 2))
INTETICKSETT = Column(NUMERIC(26, 2))
DOMETICKSETT = Column(NUMERIC(26, 2))
DEFEREVE = Column(NUMERIC(26, 2))
SHORTTERMBDSPAYA = Column(NUMERIC(26, 2))
LIABHELDFORS = Column(NUMERIC(26, 2))
DUENONCLIAB = Column(NUMERIC(26, 2))
OTHERCURRELIABI = Column(NUMERIC(26, 2))
TOTALCURRLIAB = Column(NUMERIC(26, 2))
LONGBORR = Column(NUMERIC(26, 2))
LCOPEWORKERSAL = Column(NUMERIC(26, 2))
BDSPAYA = Column(NUMERIC(26, 2))
BDSPAYAPREST = Column(NUMERIC(26, 2))
BDSPAYAPERBOND = Column(NUMERIC(26, 2))
LONGPAYA = Column(NUMERIC(26, 2))
SPECPAYA = Column(NUMERIC(26, 2))
EXPENONCLIAB = Column(NUMERIC(26, 2))
LONGDEFEINCO = Column(NUMERIC(26, 2))
DEFEINCOTAXLIAB = Column(NUMERIC(26, 2))
OTHERNONCLIABI = Column(NUMERIC(26, 2))
TOTALNONCLIAB = Column(NUMERIC(26, 2))
TOTLIAB = Column(NUMERIC(26, 2))
PAIDINCAPI = Column(NUMERIC(26, 2))
OTHEQUIN = Column(NUMERIC(26, 2))
PREST = Column(NUMERIC(26, 2))
PERBOND = Column(NUMERIC(26, 2))
CAPISURP = Column(NUMERIC(26, 2))
TREASTK = Column(NUMERIC(26, 2))
OCL = Column(NUMERIC(26, 2))
SPECRESE = Column(NUMERIC(26, 2))
RESE = Column(NUMERIC(26, 2))
GENERISKRESE = Column(NUMERIC(26, 2))
UNREINVELOSS = Column(NUMERIC(26, 2))
UNDIPROF = Column(NUMERIC(26, 2))
TOPAYCASHDIVI = Column(NUMERIC(26, 2))
CURTRANDIFF = Column(NUMERIC(26, 2))
PARESHARRIGH = Column(NUMERIC(26, 2))
    MINYSHARRIGH = Column(NUMERIC(26, 2))
    RIGHAGGR = Column(NUMERIC(26, 2))
    TOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    WARLIABRESE = Column(NUMERIC(26, 2))
    ISVALID = Column(INT)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    DECLAREDATE = Column(VARCHAR(8))
    SUNEVENCURRASSE = Column(NUMERIC(26, 2))
    SFORMATCURRASSE = Column(NUMERIC(26, 2))
    SMERGERCURRASSE = Column(NUMERIC(26, 2))
    SUNEVENNONCASSE = Column(NUMERIC(26, 2))
    SFORMATNONCASSE = Column(NUMERIC(26, 2))
    SMERGERNONCASSE = Column(NUMERIC(26, 2))
    SUNEVENTOTASSET = Column(NUMERIC(26, 2))
    SFORMATTOTASSET = Column(NUMERIC(26, 2))
    SMERGERTOTASSET = Column(NUMERIC(26, 2))
    SUNEVENCURRELIABI = Column(NUMERIC(26, 2))
    SFORMATCURRELIABI = Column(NUMERIC(26, 2))
    SMERGERCURRELIABI = Column(NUMERIC(26, 2))
    SUNEVENNONCLIAB = Column(NUMERIC(26, 2))
    SFORMATNONCLIAB = Column(NUMERIC(26, 2))
    SMERGERNONCLIAB = Column(NUMERIC(26, 2))
    SUNEVENTOTLIAB = Column(NUMERIC(26, 2))
    SFORMATTOTLIAB = Column(NUMERIC(26, 2))
    SMERGERTOTLIAB = Column(NUMERIC(26, 2))
    SUNEVENPARESHARRIGH = Column(NUMERIC(26, 2))
    SFORMATPARESHARRIGH = Column(NUMERIC(26, 2))
    SMERGERPARESHARRIGH = Column(NUMERIC(26, 2))
    SUNEVENRIGHAGGR = Column(NUMERIC(26, 2))
    SFORMATRIGHAGGR = Column(NUMERIC(26, 2))
    SMERGERRIGHAGGR = Column(NUMERIC(26, 2))
    SUNEVENTOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    SFORMATTOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    SMERGERTOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    SUNEVENASSETLIABEUQI = Column(NUMERIC(26, 2))
    NOTESACCORECE = Column(NUMERIC(26, 2))
    CONTRACTASSET = Column(NUMERIC(26, 2))
    OTHDEBTINVEST = Column(NUMERIC(26, 2))
    OTHEQUININVEST = Column(NUMERIC(26, 2))
    OTHERNONCFINASSE = Column(NUMERIC(26, 2))
    NOTESACCOPAYA = Column(NUMERIC(26, 2))
    CONTRACTLIAB = Column(NUMERIC(26, 2))
    FAIRVALUEASSETS = Column(NUMERIC(26, 2))
    AMORTIZCOSTASSETS = Column(NUMERIC(26, 2))
    OTHERRECETOT = Column(NUMERIC(26, 2))
    OTHERPAYTOT = Column(NUMERIC(26, 2))
    FIXEDASSECLEATOT = Column(NUMERIC(26, 2))
    CONSPROGTOT = Column(NUMERIC(26, 2))
    LONGPAYATOT = Column(NUMERIC(26, 2))
    RECFINANC = Column(NUMERIC(26, 2))
    RUSEASSETS = Column(NUMERIC(26, 2))
    LEASELIAB = Column(NUMERIC(26, 2))
    TMSTAMP = Column(Integer)
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }

class BalanceTTM(Base):
    __tablename__ = 'balance_ttm'
    # "useexisting" was removed from SQLAlchemy long ago; the supported
    # spelling of this option is "extend_existing".
    __table_args__ = {"extend_existing": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    ENDDATE = Column(VARCHAR(8))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    REPORTYEAR = Column(VARCHAR(10))
    REPORTDATETYPE = Column(VARCHAR(10))
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    ISREALACCSTA = Column(VARCHAR(10))
    ISACORRECT = Column(INT)
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    CUR = Column(VARCHAR(10))
    CURFDS = Column(NUMERIC(26, 2))
    SETTRESEDEPO = Column(NUMERIC(26, 2))
    PLAC = Column(NUMERIC(26, 2))
    TRADFINASSET = Column(NUMERIC(26, 2))
    DERIFINAASSET = Column(NUMERIC(26, 2))
    NOTESRECE = Column(NUMERIC(26, 2))
    ACCORECE = Column(NUMERIC(26, 2))
    PREP = Column(NUMERIC(26, 2))
    PREMRECE = Column(NUMERIC(26, 2))
    REINRECE = Column(NUMERIC(26, 2))
    REINCONTRESE = Column(NUMERIC(26, 2))
    INTERECE = Column(NUMERIC(26, 2))
    DIVIDRECE = Column(NUMERIC(26, 2))
    OTHERRECE = Column(NUMERIC(26, 2))
    EXPOTAXREBARECE = Column(NUMERIC(26, 2))
    SUBSRECE = Column(NUMERIC(26, 2))
    MARGRECE = Column(NUMERIC(26, 2))
    INTELRECE = Column(NUMERIC(26, 2))
    PURCRESAASSET = Column(NUMERIC(26, 2))
    INVE = Column(NUMERIC(26, 2))
    ACCHELDFORS = Column(NUMERIC(26, 2))
    PREPEXPE = Column(NUMERIC(26, 2))
    UNSEG = Column(NUMERIC(26, 2))
    EXPINONCURRASSET = Column(NUMERIC(26, 2))
    OTHERCURRASSE = Column(NUMERIC(26, 2))
    TOTCURRASSET = Column(NUMERIC(26, 2))
    LENDANDLOAN = Column(NUMERIC(26, 2))
    AVAISELLASSE = Column(NUMERIC(26, 2))
    HOLDINVEDUE = Column(NUMERIC(26, 2))
    LONGRECE = Column(NUMERIC(26, 2))
    EQUIINVE = Column(NUMERIC(26, 2))
    OTHERLONGINVE = Column(NUMERIC(26, 2))
    INVEPROP = Column(NUMERIC(26, 2))
    FIXEDASSEIMMO = Column(NUMERIC(26, 2))
    ACCUDEPR = Column(NUMERIC(26, 2))
    FIXEDASSENETW = Column(NUMERIC(26, 2))
    FIXEDASSEIMPA = Column(NUMERIC(26, 2))
    FIXEDASSENET = Column(NUMERIC(26, 2))
    CONSPROG = Column(NUMERIC(26, 2))
    ENGIMATE = Column(NUMERIC(26, 2))
    FIXEDASSECLEA = Column(NUMERIC(26, 2))
    PRODASSE = Column(NUMERIC(26, 2))
    COMASSE = Column(NUMERIC(26, 2))
    HYDRASSET = Column(NUMERIC(26, 2))
    INTAASSET = Column(NUMERIC(26, 2))
    DEVEEXPE = Column(NUMERIC(26, 2))
    GOODWILL = Column(NUMERIC(26, 2))
    LOGPREPEXPE = Column(NUMERIC(26, 2))
    TRADSHARTRAD = Column(NUMERIC(26, 2))
    DEFETAXASSET = Column(NUMERIC(26, 2))
    OTHERNONCASSE = Column(NUMERIC(26, 2))
    TOTALNONCASSETS = Column(NUMERIC(26, 2))
    TOTASSET = Column(NUMERIC(26, 2))
    SHORTTERMBORR = Column(NUMERIC(26, 2))
    CENBANKBORR = Column(NUMERIC(26, 2))
    DEPOSIT = Column(NUMERIC(26, 2))
    FDSBORR = Column(NUMERIC(26, 2))
    TRADFINLIAB = Column(NUMERIC(26, 2))
    DERILIAB = Column(NUMERIC(26, 2))
    NOTESPAYA = Column(NUMERIC(26, 2))
    ACCOPAYA = Column(NUMERIC(26, 2))
    ADVAPAYM = Column(NUMERIC(26, 2))
    SELLREPASSE = Column(NUMERIC(26, 2))
    COPEPOUN = Column(NUMERIC(26, 2))
    COPEWORKERSAL = Column(NUMERIC(26, 2))
    TAXESPAYA = Column(NUMERIC(26, 2))
    INTEPAYA = Column(NUMERIC(26, 2))
    DIVIPAYA = Column(NUMERIC(26, 2))
    OTHERFEEPAYA = Column(NUMERIC(26, 2))
    MARGREQU = Column(NUMERIC(26, 2))
    INTELPAY = Column(NUMERIC(26, 2))
    OTHERPAY = Column(NUMERIC(26, 2))
    ACCREXPE = Column(NUMERIC(26, 2))
    EXPECURRLIAB = Column(NUMERIC(26, 2))
    COPEWITHREINRECE = Column(NUMERIC(26, 2))
    INSUCONTRESE = Column(NUMERIC(26, 2))
    ACTITRADSECU = Column(NUMERIC(26, 2))
    ACTIUNDESECU = Column(NUMERIC(26, 2))
    INTETICKSETT = Column(NUMERIC(26, 2))
    DOMETICKSETT = Column(NUMERIC(26, 2))
    DEFEREVE = Column(NUMERIC(26, 2))
    SHORTTERMBDSPAYA = Column(NUMERIC(26, 2))
    LIABHELDFORS = Column(NUMERIC(26, 2))
    DUENONCLIAB = Column(NUMERIC(26, 2))
    OTHERCURRELIABI = Column(NUMERIC(26, 2))
    TOTALCURRLIAB = Column(NUMERIC(26, 2))
    LONGBORR = Column(NUMERIC(26, 2))
    LCOPEWORKERSAL = Column(NUMERIC(26, 2))
    BDSPAYA = Column(NUMERIC(26, 2))
    BDSPAYAPREST = Column(NUMERIC(26, 2))
    BDSPAYAPERBOND = Column(NUMERIC(26, 2))
    LONGPAYA = Column(NUMERIC(26, 2))
    SPECPAYA = Column(NUMERIC(26, 2))
    EXPENONCLIAB = Column(NUMERIC(26, 2))
    LONGDEFEINCO = Column(NUMERIC(26, 2))
    DEFEINCOTAXLIAB = Column(NUMERIC(26, 2))
    OTHERNONCLIABI = Column(NUMERIC(26, 2))
    TOTALNONCLIAB = Column(NUMERIC(26, 2))
    TOTLIAB = Column(NUMERIC(26, 2))
    PAIDINCAPI = Column(NUMERIC(26, 2))
    OTHEQUIN = Column(NUMERIC(26, 2))
    PREST = Column(NUMERIC(26, 2))
    PERBOND = Column(NUMERIC(26, 2))
    CAPISURP = Column(NUMERIC(26, 2))
    TREASTK = Column(NUMERIC(26, 2))
    OCL = Column(NUMERIC(26, 2))
    SPECRESE = Column(NUMERIC(26, 2))
    RESE = Column(NUMERIC(26, 2))
    GENERISKRESE = Column(NUMERIC(26, 2))
    UNREINVELOSS = Column(NUMERIC(26, 2))
    UNDIPROF = Column(NUMERIC(26, 2))
    TOPAYCASHDIVI = Column(NUMERIC(26, 2))
    CURTRANDIFF = Column(NUMERIC(26, 2))
    PARESHARRIGH = Column(NUMERIC(26, 2))
    MINYSHARRIGH = Column(NUMERIC(26, 2))
    RIGHAGGR = Column(NUMERIC(26, 2))
    TOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    WARLIABRESE = Column(NUMERIC(26, 2))
    ISVALID = Column(INT)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    DECLAREDATE = Column(VARCHAR(8))
    SUNEVENCURRASSE = Column(NUMERIC(26, 2))
    SFORMATCURRASSE = Column(NUMERIC(26, 2))
    SMERGERCURRASSE = Column(NUMERIC(26, 2))
    SUNEVENNONCASSE = Column(NUMERIC(26, 2))
    SFORMATNONCASSE = Column(NUMERIC(26, 2))
    SMERGERNONCASSE = Column(NUMERIC(26, 2))
    SUNEVENTOTASSET = Column(NUMERIC(26, 2))
    SFORMATTOTASSET = Column(NUMERIC(26, 2))
    SMERGERTOTASSET = Column(NUMERIC(26, 2))
    SUNEVENCURRELIABI = Column(NUMERIC(26, 2))
    SFORMATCURRELIABI = Column(NUMERIC(26, 2))
    SMERGERCURRELIABI = Column(NUMERIC(26, 2))
    SUNEVENNONCLIAB = Column(NUMERIC(26, 2))
    SFORMATNONCLIAB = Column(NUMERIC(26, 2))
    SMERGERNONCLIAB = Column(NUMERIC(26, 2))
    SUNEVENTOTLIAB = Column(NUMERIC(26, 2))
    SFORMATTOTLIAB = Column(NUMERIC(26, 2))
    SMERGERTOTLIAB = Column(NUMERIC(26, 2))
    SUNEVENPARESHARRIGH = Column(NUMERIC(26, 2))
    SFORMATPARESHARRIGH = Column(NUMERIC(26, 2))
    SMERGERPARESHARRIGH = Column(NUMERIC(26, 2))
    SUNEVENRIGHAGGR = Column(NUMERIC(26, 2))
    SFORMATRIGHAGGR = Column(NUMERIC(26, 2))
    SMERGERRIGHAGGR = Column(NUMERIC(26, 2))
    SUNEVENTOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    SFORMATTOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    SMERGERTOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    SUNEVENASSETLIABEUQI = Column(NUMERIC(26, 2))
    NOTESACCORECE = Column(NUMERIC(26, 2))
    CONTRACTASSET = Column(NUMERIC(26, 2))
    OTHDEBTINVEST = Column(NUMERIC(26, 2))
    OTHEQUININVEST = Column(NUMERIC(26, 2))
    OTHERNONCFINASSE = Column(NUMERIC(26, 2))
    NOTESACCOPAYA = Column(NUMERIC(26, 2))
    CONTRACTLIAB = Column(NUMERIC(26, 2))
    FAIRVALUEASSETS = Column(NUMERIC(26, 2))
    AMORTIZCOSTASSETS = Column(NUMERIC(26, 2))
    OTHERRECETOT = Column(NUMERIC(26, 2))
    OTHERPAYTOT = Column(NUMERIC(26, 2))
    FIXEDASSECLEATOT = Column(NUMERIC(26, 2))
    CONSPROGTOT = Column(NUMERIC(26, 2))
    LONGPAYATOT = Column(NUMERIC(26, 2))
    RECFINANC = Column(NUMERIC(26, 2))
    RUSEASSETS = Column(NUMERIC(26, 2))
    LEASELIAB = Column(NUMERIC(26, 2))
    TMSTAMP = Column(Integer)
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }

class BalanceReport(Base):
    __tablename__ = 'balance_report'
    # "useexisting" was removed from SQLAlchemy long ago; the supported
    # spelling of this option is "extend_existing".
    __table_args__ = {"extend_existing": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    ENDDATE = Column(VARCHAR(8))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    REPORTYEAR = Column(VARCHAR(10))
    REPORTDATETYPE = Column(VARCHAR(10))
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    ISREALACCSTA = Column(VARCHAR(10))
    ISACORRECT = Column(INT)
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    CUR = Column(VARCHAR(10))
    CURFDS = Column(NUMERIC(26, 2))
    SETTRESEDEPO = Column(NUMERIC(26, 2))
    PLAC = Column(NUMERIC(26, 2))
    TRADFINASSET = Column(NUMERIC(26, 2))
    DERIFINAASSET = Column(NUMERIC(26, 2))
    NOTESRECE = Column(NUMERIC(26, 2))
    ACCORECE = Column(NUMERIC(26, 2))
    PREP = Column(NUMERIC(26, 2))
    PREMRECE = Column(NUMERIC(26, 2))
    REINRECE = Column(NUMERIC(26, 2))
    REINCONTRESE = Column(NUMERIC(26, 2))
    INTERECE = Column(NUMERIC(26, 2))
    DIVIDRECE = Column(NUMERIC(26, 2))
    OTHERRECE = Column(NUMERIC(26, 2))
    EXPOTAXREBARECE = Column(NUMERIC(26, 2))
    SUBSRECE = Column(NUMERIC(26, 2))
    MARGRECE = Column(NUMERIC(26, 2))
    INTELRECE = Column(NUMERIC(26, 2))
    PURCRESAASSET = Column(NUMERIC(26, 2))
    INVE = Column(NUMERIC(26, 2))
    ACCHELDFORS = Column(NUMERIC(26, 2))
    PREPEXPE = Column(NUMERIC(26, 2))
    UNSEG = Column(NUMERIC(26, 2))
    EXPINONCURRASSET = Column(NUMERIC(26, 2))
    OTHERCURRASSE = Column(NUMERIC(26, 2))
    TOTCURRASSET = Column(NUMERIC(26, 2))
    LENDANDLOAN = Column(NUMERIC(26, 2))
    AVAISELLASSE = Column(NUMERIC(26, 2))
    HOLDINVEDUE = Column(NUMERIC(26, 2))
    LONGRECE = Column(NUMERIC(26, 2))
    EQUIINVE = Column(NUMERIC(26, 2))
    OTHERLONGINVE = Column(NUMERIC(26, 2))
    INVEPROP = Column(NUMERIC(26, 2))
    FIXEDASSEIMMO = Column(NUMERIC(26, 2))
    ACCUDEPR = Column(NUMERIC(26, 2))
    FIXEDASSENETW = Column(NUMERIC(26, 2))
    FIXEDASSEIMPA = Column(NUMERIC(26, 2))
    FIXEDASSENET = Column(NUMERIC(26, 2))
    CONSPROG = Column(NUMERIC(26, 2))
    ENGIMATE = Column(NUMERIC(26, 2))
    FIXEDASSECLEA = Column(NUMERIC(26, 2))
    PRODASSE = Column(NUMERIC(26, 2))
    COMASSE = Column(NUMERIC(26, 2))
    HYDRASSET = Column(NUMERIC(26, 2))
    INTAASSET = Column(NUMERIC(26, 2))
    DEVEEXPE = Column(NUMERIC(26, 2))
    GOODWILL = Column(NUMERIC(26, 2))
    LOGPREPEXPE = Column(NUMERIC(26, 2))
    TRADSHARTRAD = Column(NUMERIC(26, 2))
    DEFETAXASSET = Column(NUMERIC(26, 2))
    OTHERNONCASSE = Column(NUMERIC(26, 2))
    TOTALNONCASSETS = Column(NUMERIC(26, 2))
    TOTASSET = Column(NUMERIC(26, 2))
    SHORTTERMBORR = Column(NUMERIC(26, 2))
    CENBANKBORR = Column(NUMERIC(26, 2))
    DEPOSIT = Column(NUMERIC(26, 2))
    FDSBORR = Column(NUMERIC(26, 2))
    TRADFINLIAB = Column(NUMERIC(26, 2))
    DERILIAB = Column(NUMERIC(26, 2))
    NOTESPAYA = Column(NUMERIC(26, 2))
    ACCOPAYA = Column(NUMERIC(26, 2))
    ADVAPAYM = Column(NUMERIC(26, 2))
    SELLREPASSE = Column(NUMERIC(26, 2))
    COPEPOUN = Column(NUMERIC(26, 2))
    COPEWORKERSAL = Column(NUMERIC(26, 2))
    TAXESPAYA = Column(NUMERIC(26, 2))
    INTEPAYA = Column(NUMERIC(26, 2))
    DIVIPAYA = Column(NUMERIC(26, 2))
    OTHERFEEPAYA = Column(NUMERIC(26, 2))
    MARGREQU = Column(NUMERIC(26, 2))
    INTELPAY = Column(NUMERIC(26, 2))
    OTHERPAY = Column(NUMERIC(26, 2))
    ACCREXPE = Column(NUMERIC(26, 2))
    EXPECURRLIAB = Column(NUMERIC(26, 2))
    COPEWITHREINRECE = Column(NUMERIC(26, 2))
    INSUCONTRESE = Column(NUMERIC(26, 2))
    ACTITRADSECU = Column(NUMERIC(26, 2))
    ACTIUNDESECU = Column(NUMERIC(26, 2))
    INTETICKSETT = Column(NUMERIC(26, 2))
    DOMETICKSETT = Column(NUMERIC(26, 2))
    DEFEREVE = Column(NUMERIC(26, 2))
    SHORTTERMBDSPAYA = Column(NUMERIC(26, 2))
    LIABHELDFORS = Column(NUMERIC(26, 2))
    DUENONCLIAB = Column(NUMERIC(26, 2))
    OTHERCURRELIABI = Column(NUMERIC(26, 2))
    TOTALCURRLIAB = Column(NUMERIC(26, 2))
    LONGBORR = Column(NUMERIC(26, 2))
    LCOPEWORKERSAL = Column(NUMERIC(26, 2))
    BDSPAYA = Column(NUMERIC(26, 2))
    BDSPAYAPREST = Column(NUMERIC(26, 2))
    BDSPAYAPERBOND = Column(NUMERIC(26, 2))
    LONGPAYA = Column(NUMERIC(26, 2))
    SPECPAYA = Column(NUMERIC(26, 2))
    EXPENONCLIAB = Column(NUMERIC(26, 2))
    LONGDEFEINCO = Column(NUMERIC(26, 2))
    DEFEINCOTAXLIAB = Column(NUMERIC(26, 2))
    OTHERNONCLIABI = Column(NUMERIC(26, 2))
    TOTALNONCLIAB = Column(NUMERIC(26, 2))
    TOTLIAB = Column(NUMERIC(26, 2))
    PAIDINCAPI = Column(NUMERIC(26, 2))
    OTHEQUIN = Column(NUMERIC(26, 2))
    PREST = Column(NUMERIC(26, 2))
    PERBOND = Column(NUMERIC(26, 2))
    CAPISURP = Column(NUMERIC(26, 2))
    TREASTK = Column(NUMERIC(26, 2))
    OCL = Column(NUMERIC(26, 2))
    SPECRESE = Column(NUMERIC(26, 2))
    RESE = Column(NUMERIC(26, 2))
    GENERISKRESE = Column(NUMERIC(26, 2))
    UNREINVELOSS = Column(NUMERIC(26, 2))
    UNDIPROF = Column(NUMERIC(26, 2))
    TOPAYCASHDIVI = Column(NUMERIC(26, 2))
    CURTRANDIFF = Column(NUMERIC(26, 2))
    PARESHARRIGH = Column(NUMERIC(26, 2))
    MINYSHARRIGH = Column(NUMERIC(26, 2))
    RIGHAGGR = Column(NUMERIC(26, 2))
    TOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    WARLIABRESE = Column(NUMERIC(26, 2))
    ISVALID = Column(INT)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    DECLAREDATE = Column(VARCHAR(8))
    SUNEVENCURRASSE = Column(NUMERIC(26, 2))
    SFORMATCURRASSE = Column(NUMERIC(26, 2))
    SMERGERCURRASSE = Column(NUMERIC(26, 2))
    SUNEVENNONCASSE = Column(NUMERIC(26, 2))
    SFORMATNONCASSE = Column(NUMERIC(26, 2))
    SMERGERNONCASSE = Column(NUMERIC(26, 2))
    SUNEVENTOTASSET = Column(NUMERIC(26, 2))
    SFORMATTOTASSET = Column(NUMERIC(26, 2))
    SMERGERTOTASSET = Column(NUMERIC(26, 2))
    SUNEVENCURRELIABI = Column(NUMERIC(26, 2))
    SFORMATCURRELIABI = Column(NUMERIC(26, 2))
    SMERGERCURRELIABI = Column(NUMERIC(26, 2))
    SUNEVENNONCLIAB = Column(NUMERIC(26, 2))
    SFORMATNONCLIAB = Column(NUMERIC(26, 2))
    SMERGERNONCLIAB = Column(NUMERIC(26, 2))
    SUNEVENTOTLIAB = Column(NUMERIC(26, 2))
    SFORMATTOTLIAB = Column(NUMERIC(26, 2))
    SMERGERTOTLIAB = Column(NUMERIC(26, 2))
    SUNEVENPARESHARRIGH = Column(NUMERIC(26, 2))
    SFORMATPARESHARRIGH = Column(NUMERIC(26, 2))
    SMERGERPARESHARRIGH = Column(NUMERIC(26, 2))
    SUNEVENRIGHAGGR = Column(NUMERIC(26, 2))
    SFORMATRIGHAGGR = Column(NUMERIC(26, 2))
    SMERGERRIGHAGGR = Column(NUMERIC(26, 2))
    SUNEVENTOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    SFORMATTOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    SMERGERTOTLIABSHAREQUI = Column(NUMERIC(26, 2))
    SUNEVENASSETLIABEUQI = Column(NUMERIC(26, 2))
    NOTESACCORECE = Column(NUMERIC(26, 2))
    CONTRACTASSET = Column(NUMERIC(26, 2))
    OTHDEBTINVEST = Column(NUMERIC(26, 2))
    OTHEQUININVEST = Column(NUMERIC(26, 2))
    OTHERNONCFINASSE = Column(NUMERIC(26, 2))
    NOTESACCOPAYA = Column(NUMERIC(26, 2))
    CONTRACTLIAB = Column(NUMERIC(26, 2))
    FAIRVALUEASSETS = Column(NUMERIC(26, 2))
    AMORTIZCOSTASSETS = Column(NUMERIC(26, 2))
    OTHERRECETOT = Column(NUMERIC(26, 2))
    OTHERPAYTOT = Column(NUMERIC(26, 2))
    FIXEDASSECLEATOT = Column(NUMERIC(26, 2))
    CONSPROGTOT = Column(NUMERIC(26, 2))
    LONGPAYATOT = Column(NUMERIC(26, 2))
    RECFINANC = Column(NUMERIC(26, 2))
    RUSEASSETS = Column(NUMERIC(26, 2))
    LEASELIAB = Column(NUMERIC(26, 2))
    TMSTAMP = Column(Integer)
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }

class CashFlowMRQ(Base):
    __tablename__ = 'cash_flow_mrq'
    # "useexisting" was removed from SQLAlchemy long ago; the supported
    # spelling of this option is "extend_existing".
    __table_args__ = {"extend_existing": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    BEGINDATE = Column(VARCHAR(8))
    ENDDATE = Column(VARCHAR(8))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    LABORGETCASH = Column(NUMERIC(26, 2))
    DEPONETR = Column(NUMERIC(26, 2))
    BANKLOANNETINCR = Column(NUMERIC(26, 2))
    FININSTNETR = Column(NUMERIC(26, 2))
    INSPREMCASH = Column(NUMERIC(26, 2))
    INSNETC = Column(NUMERIC(26, 2))
    SAVINETR = Column(NUMERIC(26, 2))
    DISPTRADNETINCR = Column(NUMERIC(26, 2))
    CHARINTECASH = Column(NUMERIC(26, 2))
    FDSBORRNETR = Column(NUMERIC(26, 2))
    REPNETINCR = Column(NUMERIC(26, 2))
    TAXREFD = Column(NUMERIC(26, 2))
    RECEOTHERBIZCASH = Column(NUMERIC(26, 2))
    BIZCASHINFL = Column(NUMERIC(26, 2))
    LABOPAYC = Column(NUMERIC(26, 2))
    LOANSNETR = Column(NUMERIC(26, 2))
    TRADEPAYMNETR = Column(NUMERIC(26, 2))
    PAYCOMPGOLD = Column(NUMERIC(26, 2))
    PAYINTECASH = Column(NUMERIC(26, 2))
    PAYDIVICASH = Column(NUMERIC(26, 2))
    PAYWORKCASH = Column(NUMERIC(26, 2))
    PAYTAX = Column(NUMERIC(26, 2))
    PAYACTICASH = Column(NUMERIC(26, 2))
    BIZCASHOUTF = Column(NUMERIC(26, 2))
    MANANETR = Column(NUMERIC(26, 2))
    WITHINVGETCASH = Column(NUMERIC(26, 2))
    INVERETUGETCASH = Column(NUMERIC(26, 2))
    FIXEDASSETNETC = Column(NUMERIC(26, 2))
    SUBSNETC = Column(NUMERIC(26, 2))
    RECEINVCASH = Column(NUMERIC(26, 2))
    REDUCASHPLED = Column(NUMERIC(26, 2))
    INVCASHINFL = Column(NUMERIC(26, 2))
    ACQUASSETCASH = Column(NUMERIC(26, 2))
    INVPAYC = Column(NUMERIC(26, 2))
    LOANNETR = Column(NUMERIC(26, 2))
    SUBSPAYNETCASH = Column(NUMERIC(26, 2))
    PAYINVECASH = Column(NUMERIC(26, 2))
    INCRCASHPLED = Column(NUMERIC(26, 2))
    INVCASHOUTF = Column(NUMERIC(26, 2))
    INVNETCASHFLOW = Column(NUMERIC(26, 2))
    INVRECECASH = Column(NUMERIC(26, 2))
    SUBSRECECASH = Column(NUMERIC(26, 2))
    RECEFROMLOAN = Column(NUMERIC(26, 2))
    ISSBDRECECASH = Column(NUMERIC(26, 2))
    RECEFINCASH = Column(NUMERIC(26, 2))
    FINCASHINFL = Column(NUMERIC(26, 2))
    DEBTPAYCASH = Column(NUMERIC(26, 2))
    DIVIPROFPAYCASH = Column(NUMERIC(26, 2))
    SUBSPAYDIVID = Column(NUMERIC(26, 2))
    FINRELACASH = Column(NUMERIC(26, 2))
    FINCASHOUTF = Column(NUMERIC(26, 2))
    FINNETCFLOW = Column(NUMERIC(26, 2))
    CHGEXCHGCHGS = Column(NUMERIC(26, 2))
    CASHNETR = Column(NUMERIC(26, 2))
    INICASHBALA = Column(NUMERIC(26, 2))
    FINALCASHBALA = Column(NUMERIC(26, 2))
    NETPROFIT = Column(NUMERIC(26, 2))
    MINYSHARRIGH = Column(NUMERIC(26, 2))
    UNREINVELOSS = Column(NUMERIC(26, 2))
    ASSEIMPA = Column(NUMERIC(26, 2))
    ASSEDEPR = Column(NUMERIC(26, 2))
    REALESTADEP = Column(NUMERIC(26, 2))
    INTAASSEAMOR = Column(NUMERIC(26, 2))
    LONGDEFEEXPENAMOR = Column(NUMERIC(26, 2))
    PREPEXPEDECR = Column(NUMERIC(26, 2))
    ACCREXPEINCR = Column(NUMERIC(26, 2))
    DISPFIXEDASSETLOSS = Column(NUMERIC(26, 2))
    FIXEDASSESCRALOSS = Column(NUMERIC(26, 2))
    VALUECHGLOSS = Column(NUMERIC(26, 2))
    DEFEINCOINCR = Column(NUMERIC(26, 2))
    ESTIDEBTS = Column(NUMERIC(26, 2))
    FINEXPE = Column(NUMERIC(26, 2))
    INVELOSS = Column(NUMERIC(26, 2))
    DEFETAXASSETDECR = Column(NUMERIC(26, 2))
    DEFETAXLIABINCR = Column(NUMERIC(26, 2))
    INVEREDU = Column(NUMERIC(26, 2))
    RECEREDU = Column(NUMERIC(26, 2))
    PAYAINCR = Column(NUMERIC(26, 2))
    UNSEPARACHG = Column(NUMERIC(26, 2))
    UNFIPARACHG = Column(NUMERIC(26, 2))
    OTHER = Column(NUMERIC(26, 2))
    BIZNETCFLOW = Column(NUMERIC(26, 2))
    DEBTINTOCAPI = Column(NUMERIC(26, 2))
    EXPICONVBD = Column(NUMERIC(26, 2))
    FINFIXEDASSET = Column(NUMERIC(26, 2))
    CASHFINALBALA = Column(NUMERIC(26, 2))
    CASHOPENBALA = Column(NUMERIC(26, 2))
    EQUFINALBALA = Column(NUMERIC(26, 2))
    EQUOPENBALA = Column(NUMERIC(26, 2))
    CASHNETI = Column(NUMERIC(26, 2))
    ISVALID = Column(INT)
    TMSTAMP = Column(Integer)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    REPORTYEAR = Column(VARCHAR(10))
    REPORTDATETYPE = Column(VARCHAR(10))
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    ISREALACCSTA = Column(VARCHAR(10))
    ISACORRECT = Column(INT)
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    CUR = Column(VARCHAR(10))
    DECLAREDATE = Column(VARCHAR(8))
    SUNEVENBIZCASHINFL = Column(NUMERIC(26, 2))
    SFORMATBIZCASHINFL = Column(NUMERIC(26, 2))
    SMERGERBIZCASHINFL = Column(NUMERIC(26, 2))
    SUNEVENBIZCASHOUTF = Column(NUMERIC(26, 2))
    SFORMATBIZCASHOUTF = Column(NUMERIC(26, 2))
    SMERGERBIZCASHOUTF = Column(NUMERIC(26, 2))
    SUNEVENMANANETR = Column(NUMERIC(26, 2))
    SFORMATMANANETR = Column(NUMERIC(26, 2))
    SMERGERMANANETR = Column(NUMERIC(26, 2))
    SUNEVENINVCASHINFL = Column(NUMERIC(26, 2))
    SFORMATINVCASHINFL = Column(NUMERIC(26, 2))
    SMERGERINVCASHINFL = Column(NUMERIC(26, 2))
    SUNEVENINVCASHOUTF = Column(NUMERIC(26, 2))
    SFORMATINVCASHOUTF = Column(NUMERIC(26, 2))
    SMERGERINVCASHOUTF = Column(NUMERIC(26, 2))
    SUNEVENINVNETCASHFLOW = Column(NUMERIC(26, 2))
    SMERGERINVNETCASHFLOW = Column(NUMERIC(26, 2))
    SUNEVENFINCASHINFL = Column(NUMERIC(26, 2))
    SFORMATFINCASHINFL = Column(NUMERIC(26, 2))
    SMERGERFINCASHINFL = Column(NUMERIC(26, 2))
    SUNEVENFINCASHOUTF = Column(NUMERIC(26, 2))
    SFORMATFINCASHOUTF = Column(NUMERIC(26, 2))
    SMERGERFINCASHOUTF = Column(NUMERIC(26, 2))
    SUNEVENFINNETCFLOW = Column(NUMERIC(26, 2))
    SMERGERFINNETCFLOW = Column(NUMERIC(26, 2))
    SUNEVENCASHNETR = Column(NUMERIC(26, 2))
    SFORMATCASHNETR = Column(NUMERIC(26, 2))
    SMERGERCASHNETR = Column(NUMERIC(26, 2))
    SUNEVENFINALCASHBALA = Column(NUMERIC(26, 2))
    SFORMATFINALCASHBALA = Column(NUMERIC(26, 2))
    SMERGERFINALCASHBALA = Column(NUMERIC(26, 2))
    SUNEVENBIZNETCFLOW = Column(NUMERIC(26, 2))
    SFORMATBIZNETCFLOW = Column(NUMERIC(26, 2))
    SMERGERBIZNETCFLOW = Column(NUMERIC(26, 2))
    SUNEVENMANANETRMS = Column(NUMERIC(26, 2))
    SUNEVENCASHNETI = Column(NUMERIC(26, 2))
    SFORMATCASHNETI = Column(NUMERIC(26, 2))
    SMERGERCASHNETI = Column(NUMERIC(26, 2))
    SUNEVENCASHNETIMS = Column(NUMERIC(26, 2))
    DISPFINANETINCRINVE = Column(NUMERIC(26, 2))
    CREDITIMPLOSSE = Column(NUMERIC(26, 2))
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }

class CashFlowTTM(Base):
    __tablename__ = 'cash_flow_ttm'
    # "useexisting" was removed from SQLAlchemy long ago; the supported
    # spelling of this option is "extend_existing".
    __table_args__ = {"extend_existing": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    BEGINDATE = Column(VARCHAR(8))
    ENDDATE = Column(VARCHAR(8))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    LABORGETCASH = Column(NUMERIC(26, 2))
    DEPONETR = Column(NUMERIC(26, 2))
    BANKLOANNETINCR = Column(NUMERIC(26, 2))
    FININSTNETR = Column(NUMERIC(26, 2))
    INSPREMCASH = Column(NUMERIC(26, 2))
    INSNETC = Column(NUMERIC(26, 2))
    SAVINETR = Column(NUMERIC(26, 2))
    DISPTRADNETINCR = Column(NUMERIC(26, 2))
    CHARINTECASH = Column(NUMERIC(26, 2))
    FDSBORRNETR = Column(NUMERIC(26, 2))
    REPNETINCR = Column(NUMERIC(26, 2))
    TAXREFD = Column(NUMERIC(26, 2))
    RECEOTHERBIZCASH = Column(NUMERIC(26, 2))
    BIZCASHINFL = Column(NUMERIC(26, 2))
    LABOPAYC = Column(NUMERIC(26, 2))
    LOANSNETR = Column(NUMERIC(26, 2))
    TRADEPAYMNETR = Column(NUMERIC(26, 2))
    PAYCOMPGOLD = Column(NUMERIC(26, 2))
    PAYINTECASH = Column(NUMERIC(26, 2))
    PAYDIVICASH = Column(NUMERIC(26, 2))
    PAYWORKCASH = Column(NUMERIC(26, 2))
    PAYTAX = Column(NUMERIC(26, 2))
    PAYACTICASH = Column(NUMERIC(26, 2))
    BIZCASHOUTF = Column(NUMERIC(26, 2))
    MANANETR = Column(NUMERIC(26, 2))
    WITHINVGETCASH = Column(NUMERIC(26, 2))
    INVERETUGETCASH = Column(NUMERIC(26, 2))
    FIXEDASSETNETC = Column(NUMERIC(26, 2))
    SUBSNETC = Column(NUMERIC(26, 2))
    RECEINVCASH = Column(NUMERIC(26, 2))
    REDUCASHPLED = Column(NUMERIC(26, 2))
    INVCASHINFL = Column(NUMERIC(26, 2))
    ACQUASSETCASH = Column(NUMERIC(26, 2))
    INVPAYC = Column(NUMERIC(26, 2))
    LOANNETR = Column(NUMERIC(26, 2))
    SUBSPAYNETCASH = Column(NUMERIC(26, 2))
    PAYINVECASH = Column(NUMERIC(26, 2))
    INCRCASHPLED = Column(NUMERIC(26, 2))
    INVCASHOUTF = Column(NUMERIC(26, 2))
    INVNETCASHFLOW = Column(NUMERIC(26, 2))
    INVRECECASH = Column(NUMERIC(26, 2))
    SUBSRECECASH = Column(NUMERIC(26, 2))
    RECEFROMLOAN = Column(NUMERIC(26, 2))
    ISSBDRECECASH = Column(NUMERIC(26, 2))
    RECEFINCASH = Column(NUMERIC(26, 2))
    FINCASHINFL = Column(NUMERIC(26, 2))
    DEBTPAYCASH = Column(NUMERIC(26, 2))
    DIVIPROFPAYCASH = Column(NUMERIC(26, 2))
    SUBSPAYDIVID = Column(NUMERIC(26, 2))
    FINRELACASH = Column(NUMERIC(26, 2))
    FINCASHOUTF = Column(NUMERIC(26, 2))
    FINNETCFLOW = Column(NUMERIC(26, 2))
    CHGEXCHGCHGS = Column(NUMERIC(26, 2))
    CASHNETR = Column(NUMERIC(26, 2))
    INICASHBALA = Column(NUMERIC(26, 2))
    FINALCASHBALA = Column(NUMERIC(26, 2))
    NETPROFIT = Column(NUMERIC(26, 2))
    MINYSHARRIGH = Column(NUMERIC(26, 2))
    UNREINVELOSS = Column(NUMERIC(26, 2))
    ASSEIMPA = Column(NUMERIC(26, 2))
    ASSEDEPR = Column(NUMERIC(26, 2))
    REALESTADEP = Column(NUMERIC(26, 2))
    INTAASSEAMOR = Column(NUMERIC(26, 2))
    LONGDEFEEXPENAMOR = Column(NUMERIC(26, 2))
    PREPEXPEDECR = Column(NUMERIC(26, 2))
    ACCREXPEINCR = Column(NUMERIC(26, 2))
    DISPFIXEDASSETLOSS = Column(NUMERIC(26, 2))
    FIXEDASSESCRALOSS = Column(NUMERIC(26, 2))
    VALUECHGLOSS = Column(NUMERIC(26, 2))
    DEFEINCOINCR = Column(NUMERIC(26, 2))
    ESTIDEBTS = Column(NUMERIC(26, 2))
    FINEXPE = Column(NUMERIC(26, 2))
    INVELOSS = Column(NUMERIC(26, 2))
    DEFETAXASSETDECR = Column(NUMERIC(26, 2))
    DEFETAXLIABINCR = Column(NUMERIC(26, 2))
    INVEREDU = Column(NUMERIC(26, 2))
    RECEREDU = Column(NUMERIC(26, 2))
    PAYAINCR = Column(NUMERIC(26, 2))
    UNSEPARACHG = Column(NUMERIC(26, 2))
    UNFIPARACHG = Column(NUMERIC(26, 2))
    OTHER = Column(NUMERIC(26, 2))
    BIZNETCFLOW = Column(NUMERIC(26, 2))
    DEBTINTOCAPI = Column(NUMERIC(26, 2))
    EXPICONVBD = Column(NUMERIC(26, 2))
    FINFIXEDASSET = Column(NUMERIC(26, 2))
    CASHFINALBALA = Column(NUMERIC(26, 2))
    CASHOPENBALA = Column(NUMERIC(26, 2))
    EQUFINALBALA = Column(NUMERIC(26, 2))
    EQUOPENBALA = Column(NUMERIC(26, 2))
    CASHNETI = Column(NUMERIC(26, 2))
    ISVALID = Column(INT)
    TMSTAMP = Column(Integer)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    REPORTYEAR = Column(VARCHAR(10))
    REPORTDATETYPE = Column(VARCHAR(10))
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    ISREALACCSTA = Column(VARCHAR(10))
    ISACORRECT = Column(INT)
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    CUR = Column(VARCHAR(10))
    DECLAREDATE = Column(VARCHAR(8))
    SUNEVENBIZCASHINFL = Column(NUMERIC(26, 2))
    SFORMATBIZCASHINFL = Column(NUMERIC(26, 2))
    SMERGERBIZCASHINFL = Column(NUMERIC(26, 2))
    SUNEVENBIZCASHOUTF = Column(NUMERIC(26, 2))
    SFORMATBIZCASHOUTF = Column(NUMERIC(26, 2))
    SMERGERBIZCASHOUTF = Column(NUMERIC(26, 2))
    SUNEVENMANANETR = Column(NUMERIC(26, 2))
    SFORMATMANANETR = Column(NUMERIC(26, 2))
    SMERGERMANANETR = Column(NUMERIC(26, 2))
    SUNEVENINVCASHINFL = Column(NUMERIC(26, 2))
    SFORMATINVCASHINFL = Column(NUMERIC(26, 2))
    SMERGERINVCASHINFL = Column(NUMERIC(26, 2))
    SUNEVENINVCASHOUTF = Column(NUMERIC(26, 2))
    SFORMATINVCASHOUTF = Column(NUMERIC(26, 2))
    SMERGERINVCASHOUTF = Column(NUMERIC(26, 2))
    SUNEVENINVNETCASHFLOW = Column(NUMERIC(26, 2))
    SMERGERINVNETCASHFLOW = Column(NUMERIC(26, 2))
    SUNEVENFINCASHINFL = Column(NUMERIC(26, 2))
    SFORMATFINCASHINFL = Column(NUMERIC(26, 2))
    SMERGERFINCASHINFL = Column(NUMERIC(26, 2))
    SUNEVENFINCASHOUTF = Column(NUMERIC(26, 2))
    SFORMATFINCASHOUTF = Column(NUMERIC(26, 2))
    SMERGERFINCASHOUTF = Column(NUMERIC(26, 2))
    SUNEVENFINNETCFLOW = Column(NUMERIC(26, 2))
    SMERGERFINNETCFLOW = Column(NUMERIC(26, 2))
    SUNEVENCASHNETR = Column(NUMERIC(26, 2))
    SFORMATCASHNETR = Column(NUMERIC(26, 2))
    SMERGERCASHNETR = Column(NUMERIC(26, 2))
    SUNEVENFINALCASHBALA = Column(NUMERIC(26, 2))
    SFORMATFINALCASHBALA = Column(NUMERIC(26, 2))
    SMERGERFINALCASHBALA = Column(NUMERIC(26, 2))
    SUNEVENBIZNETCFLOW = Column(NUMERIC(26, 2))
    SFORMATBIZNETCFLOW = Column(NUMERIC(26, 2))
    SMERGERBIZNETCFLOW = Column(NUMERIC(26, 2))
    SUNEVENMANANETRMS = Column(NUMERIC(26, 2))
    SUNEVENCASHNETI = Column(NUMERIC(26, 2))
    SFORMATCASHNETI = Column(NUMERIC(26, 2))
    SMERGERCASHNETI = Column(NUMERIC(26, 2))
    SUNEVENCASHNETIMS = Column(NUMERIC(26, 2))
    DISPFINANETINCRINVE = Column(NUMERIC(26, 2))
    CREDITIMPLOSSE = Column(NUMERIC(26, 2))
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }

class CashFlowReport(Base):
    __tablename__ = 'cash_flow_report'
    # "useexisting" was removed from SQLAlchemy long ago; the supported
    # spelling of this option is "extend_existing".
    __table_args__ = {"extend_existing": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    BEGINDATE = Column(VARCHAR(8))
    ENDDATE = Column(VARCHAR(8))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    LABORGETCASH = Column(NUMERIC(26, 2))
    DEPONETR = Column(NUMERIC(26, 2))
    BANKLOANNETINCR = Column(NUMERIC(26, 2))
    FININSTNETR = Column(NUMERIC(26, 2))
    INSPREMCASH = Column(NUMERIC(26, 2))
    INSNETC = Column(NUMERIC(26, 2))
    SAVINETR = Column(NUMERIC(26, 2))
    DISPTRADNETINCR = Column(NUMERIC(26, 2))
    CHARINTECASH = Column(NUMERIC(26, 2))
    FDSBORRNETR = Column(NUMERIC(26, 2))
    REPNETINCR = Column(NUMERIC(26, 2))
    TAXREFD = Column(NUMERIC(26, 2))
    RECEOTHERBIZCASH = Column(NUMERIC(26, 2))
    BIZCASHINFL = Column(NUMERIC(26, 2))
    LABOPAYC = Column(NUMERIC(26, 2))
    LOANSNETR = Column(NUMERIC(26, 2))
    TRADEPAYMNETR = Column(NUMERIC(26, 2))
    PAYCOMPGOLD = Column(NUMERIC(26, 2))
    PAYINTECASH = Column(NUMERIC(26, 2))
    PAYDIVICASH = Column(NUMERIC(26, 2))
    PAYWORKCASH = Column(NUMERIC(26, 2))
    PAYTAX = Column(NUMERIC(26, 2))
    PAYACTICASH = Column(NUMERIC(26, 2))
    BIZCASHOUTF = Column(NUMERIC(26, 2))
    MANANETR = Column(NUMERIC(26, 2))
    WITHINVGETCASH = Column(NUMERIC(26, 2))
    INVERETUGETCASH = Column(NUMERIC(26, 2))
    FIXEDASSETNETC = Column(NUMERIC(26, 2))
    SUBSNETC = Column(NUMERIC(26, 2))
    RECEINVCASH = Column(NUMERIC(26, 2))
    REDUCASHPLED = Column(NUMERIC(26, 2))
    INVCASHINFL = Column(NUMERIC(26, 2))
    ACQUASSETCASH = Column(NUMERIC(26, 2))
    INVPAYC = Column(NUMERIC(26, 2))
    LOANNETR = Column(NUMERIC(26, 2))
    SUBSPAYNETCASH = Column(NUMERIC(26, 2))
    PAYINVECASH = Column(NUMERIC(26, 2))
    INCRCASHPLED = Column(NUMERIC(26, 2))
    INVCASHOUTF = Column(NUMERIC(26, 2))
    INVNETCASHFLOW = Column(NUMERIC(26, 2))
    INVRECECASH = Column(NUMERIC(26, 2))
    SUBSRECECASH = Column(NUMERIC(26, 2))
    RECEFROMLOAN = Column(NUMERIC(26, 2))
    ISSBDRECECASH = Column(NUMERIC(26, 2))
    RECEFINCASH = Column(NUMERIC(26, 2))
    FINCASHINFL = Column(NUMERIC(26, 2))
    DEBTPAYCASH = Column(NUMERIC(26, 2))
    DIVIPROFPAYCASH = Column(NUMERIC(26, 2))
    SUBSPAYDIVID = Column(NUMERIC(26, 2))
    FINRELACASH = Column(NUMERIC(26, 2))
    FINCASHOUTF = Column(NUMERIC(26, 2))
    FINNETCFLOW = Column(NUMERIC(26, 2))
    CHGEXCHGCHGS = Column(NUMERIC(26, 2))
    CASHNETR = Column(NUMERIC(26, 2))
    INICASHBALA = Column(NUMERIC(26, 2))
    FINALCASHBALA = Column(NUMERIC(26, 2))
    NETPROFIT = Column(NUMERIC(26, 2))
    MINYSHARRIGH = Column(NUMERIC(26, 2))
    UNREINVELOSS = Column(NUMERIC(26, 2))
    ASSEIMPA = Column(NUMERIC(26, 2))
    ASSEDEPR = Column(NUMERIC(26, 2))
    REALESTADEP = Column(NUMERIC(26, 2))
    INTAASSEAMOR = Column(NUMERIC(26, 2))
    LONGDEFEEXPENAMOR = Column(NUMERIC(26, 2))
    PREPEXPEDECR = Column(NUMERIC(26, 2))
    ACCREXPEINCR = Column(NUMERIC(26, 2))
    DISPFIXEDASSETLOSS = Column(NUMERIC(26, 2))
    FIXEDASSESCRALOSS = Column(NUMERIC(26, 2))
    VALUECHGLOSS = Column(NUMERIC(26, 2))
    DEFEINCOINCR = Column(NUMERIC(26, 2))
    ESTIDEBTS = Column(NUMERIC(26, 2))
    FINEXPE = Column(NUMERIC(26, 2))
    INVELOSS = Column(NUMERIC(26, 2))
    DEFETAXASSETDECR = Column(NUMERIC(26, 2))
    DEFETAXLIABINCR = Column(NUMERIC(26, 2))
    INVEREDU = Column(NUMERIC(26, 2))
    RECEREDU = Column(NUMERIC(26, 2))
    PAYAINCR = Column(NUMERIC(26, 2))
    UNSEPARACHG = Column(NUMERIC(26, 2))
    UNFIPARACHG = Column(NUMERIC(26, 2))
    OTHER = Column(NUMERIC(26, 2))
    BIZNETCFLOW = Column(NUMERIC(26, 2))
    DEBTINTOCAPI = Column(NUMERIC(26, 2))
    EXPICONVBD = Column(NUMERIC(26, 2))
    FINFIXEDASSET = Column(NUMERIC(26, 2))
    CASHFINALBALA = Column(NUMERIC(26, 2))
    CASHOPENBALA = Column(NUMERIC(26, 2))
    EQUFINALBALA = Column(NUMERIC(26, 2))
    EQUOPENBALA = Column(NUMERIC(26, 2))
    CASHNETI = Column(NUMERIC(26, 2))
    ISVALID = Column(INT)
    TMSTAMP = Column(Integer)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    REPORTYEAR = Column(VARCHAR(10))
    REPORTDATETYPE = Column(VARCHAR(10))
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    ISREALACCSTA = Column(VARCHAR(10))
    ISACORRECT = Column(INT)
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    CUR = Column(VARCHAR(10))
    DECLAREDATE = Column(VARCHAR(8))
    SUNEVENBIZCASHINFL = Column(NUMERIC(26, 2))
    SFORMATBIZCASHINFL = Column(NUMERIC(26, 2))
    SMERGERBIZCASHINFL = Column(NUMERIC(26, 2))
    SUNEVENBIZCASHOUTF = Column(NUMERIC(26, 2))
    SFORMATBIZCASHOUTF = Column(NUMERIC(26, 2))
    SMERGERBIZCASHOUTF = Column(NUMERIC(26, 2))
    SUNEVENMANANETR = Column(NUMERIC(26, 2))
    SFORMATMANANETR = Column(NUMERIC(26, 2))
    SMERGERMANANETR = Column(NUMERIC(26, 2))
    SUNEVENINVCASHINFL = Column(NUMERIC(26, 2))
    SFORMATINVCASHINFL = Column(NUMERIC(26, 2))
    SMERGERINVCASHINFL = Column(NUMERIC(26, 2))
    SUNEVENINVCASHOUTF = Column(NUMERIC(26, 2))
    SFORMATINVCASHOUTF = Column(NUMERIC(26, 2))
    SMERGERINVCASHOUTF = Column(NUMERIC(26, 2))
    SUNEVENINVNETCASHFLOW = Column(NUMERIC(26, 2))
    SMERGERINVNETCASHFLOW = Column(NUMERIC(26, 2))
    SUNEVENFINCASHINFL = Column(NUMERIC(26, 2))
    SFORMATFINCASHINFL = Column(NUMERIC(26, 2))
    SMERGERFINCASHINFL = Column(NUMERIC(26, 2))
    SUNEVENFINCASHOUTF = Column(NUMERIC(26, 2))
    SFORMATFINCASHOUTF = Column(NUMERIC(26, 2))
    SMERGERFINCASHOUTF = Column(NUMERIC(26, 2))
    SUNEVENFINNETCFLOW = Column(NUMERIC(26, 2))
    SMERGERFINNETCFLOW = Column(NUMERIC(26, 2))
    SUNEVENCASHNETR = Column(NUMERIC(26, 2))
    SFORMATCASHNETR = Column(NUMERIC(26, 2))
    SMERGERCASHNETR = Column(NUMERIC(26, 2))
    SUNEVENFINALCASHBALA = Column(NUMERIC(26, 2))
    SFORMATFINALCASHBALA = Column(NUMERIC(26, 2))
    SMERGERFINALCASHBALA = Column(NUMERIC(26, 2))
    SUNEVENBIZNETCFLOW = Column(NUMERIC(26, 2))
    SFORMATBIZNETCFLOW = Column(NUMERIC(26, 2))
    SMERGERBIZNETCFLOW = Column(NUMERIC(26, 2))
    SUNEVENMANANETRMS = Column(NUMERIC(26, 2))
    SUNEVENCASHNETI = Column(NUMERIC(26, 2))
    SFORMATCASHNETI = Column(NUMERIC(26, 2))
    SMERGERCASHNETI = Column(NUMERIC(26, 2))
    SUNEVENCASHNETIMS = Column(NUMERIC(26, 2))
    DISPFINANETINCRINVE = Column(NUMERIC(26, 2))
    CREDITIMPLOSSE = Column(NUMERIC(26, 2))
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }
class IncomeTTM(Base):
    __tablename__ = 'income_ttm'
    __table_args__ = {"useexisting": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    BEGINDATE = Column(VARCHAR(8))
    ENDDATE = Column(VARCHAR(8))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    BIZTOTINCO = Column(NUMERIC(26, 2))
    BIZINCO = Column(NUMERIC(26, 2))
    INTEINCO = Column(NUMERIC(26, 2))
    EARNPREM = Column(NUMERIC(26, 2))
    POUNINCO = Column(NUMERIC(26, 2))
    REALSALE = Column(NUMERIC(26, 2))
    OTHERBIZINCO = Column(NUMERIC(26, 2))
    BIZTOTCOST = Column(NUMERIC(26, 2))
    BIZCOST = Column(NUMERIC(26, 2))
    INTEEXPE = Column(NUMERIC(26, 2))
    POUNEXPE = Column(NUMERIC(26, 2))
    REALSALECOST = Column(NUMERIC(26, 2))
    DEVEEXPE = Column(NUMERIC(26, 2))
    SURRGOLD = Column(NUMERIC(26, 2))
    COMPNETEXPE = Column(NUMERIC(26, 2))
    CONTRESS = Column(NUMERIC(26, 2))
    POLIDIVIEXPE = Column(NUMERIC(26, 2))
    REINEXPE = Column(NUMERIC(26, 2))
    OTHERBIZCOST = Column(NUMERIC(26, 2))
    BIZTAX = Column(NUMERIC(26, 2))
    SALESEXPE = Column(NUMERIC(26, 2))
    MANAEXPE = Column(NUMERIC(26, 2))
    FINEXPE = Column(NUMERIC(26, 2))
    ASSEIMPALOSS = Column(NUMERIC(26, 2))
    VALUECHGLOSS = Column(NUMERIC(26, 2))
    INVEINCO = Column(NUMERIC(26, 2))
    ASSOINVEPROF = Column(NUMERIC(26, 2))
    EXCHGGAIN = Column(NUMERIC(26, 2))
    FUTULOSS = Column(NUMERIC(26, 2))
    CUSTINCO = Column(NUMERIC(26, 2))
    SUBSIDYINCOME = Column(NUMERIC(26, 2))
    OTHERBIZPROF = Column(NUMERIC(26, 2))
    PERPROFIT = Column(NUMERIC(26, 2))
    NONOREVE = Column(NUMERIC(26, 2))
    NONOEXPE = Column(NUMERIC(26, 2))
    NONCASSETSDISL = Column(NUMERIC(26, 2))
    TOTPROFIT = Column(NUMERIC(26, 2))
    INCOTAXEXPE = Column(NUMERIC(26, 2))
    UNREINVELOSS = Column(NUMERIC(26, 2))
    NETPROFIT = Column(NUMERIC(26, 2))
    PARENETP = Column(NUMERIC(26, 2))
    MERGEFORMNETPROF = Column(NUMERIC(26, 2))
    MINYSHARRIGH = Column(NUMERIC(26, 2))
    BASICEPS = Column(NUMERIC(30, 6))
    DILUTEDEPS = Column(NUMERIC(30, 6))
    OTHERCOMPINCO = Column(NUMERIC(26, 2))
    PARECOMPINCO = Column(NUMERIC(26, 2))
    MINYSHARINCO = Column(NUMERIC(26, 2))
    COMPINCOAMT = Column(NUMERIC(26, 2))
    PARECOMPINCOAMT = Column(NUMERIC(26, 2))
    MINYSHARINCOAMT = Column(NUMERIC(26, 2))
    ISVALID = Column(INT)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    REPORTYEAR = Column(VARCHAR(10))
    REPORTDATETYPE = Column(VARCHAR(10))
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    ISREALACCSTA = Column(VARCHAR(10))
    ISACORRECT = Column(INT)
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    CUR = Column(VARCHAR(10))
    DECLAREDATE = Column(VARCHAR(8))
    MAINBIZINCO = Column(NUMERIC(26, 2))
    SUNEVENBIZTOTINCO = Column(NUMERIC(26, 2))
    SFORMATBIZTOTINCO = Column(NUMERIC(26, 2))
    SMERGERBIZTOTINCO = Column(NUMERIC(26, 2))
    MAINBIZCOST = Column(NUMERIC(26, 2))
    SUNEVENBIZTOTCOST = Column(NUMERIC(26, 2))
    SFORMATBIZTOTCOST = Column(NUMERIC(26, 2))
    SMERGERBIZTOTCOST = Column(NUMERIC(26, 2))
    SUNEVENPERPROFIT = Column(NUMERIC(26, 2))
    SFORMATPERPROFIT = Column(NUMERIC(26, 2))
    SMERGERPERPROFIT = Column(NUMERIC(26, 2))
    SUNEVENTOTPROFIT = Column(NUMERIC(26, 2))
    SFORMATTOTPROFIT = Column(NUMERIC(26, 2))
    SMERGERTOTPROFIT = Column(NUMERIC(26, 2))
    SUNEVENNETPROFIT = Column(NUMERIC(26, 2))
    SFORMATNETPROFIT = Column(NUMERIC(26, 2))
    SMERGERNETPROFIT = Column(NUMERIC(26, 2))
    SUNEVENNETPROFITSUB = Column(NUMERIC(26, 2))
    SFORMATNETPROFITSUB = Column(NUMERIC(26, 2))
    SMERGERNETPROFITSUB = Column(NUMERIC(26, 2))
    EARLYUNDIPROF = Column(NUMERIC(26, 2))
    RUNDISPROBYRREGCAP = Column(NUMERIC(26, 2))
    OTHERREASADJU = Column(NUMERIC(26, 2))
    SUNEVENAVAIDISTPROF = Column(NUMERIC(26, 2))
    SFORMATAVAIDISTPROF = Column(NUMERIC(26, 2))
    SMERGERAVAIDISTPROF = Column(NUMERIC(26, 2))
    AVAIDISTPROF = Column(NUMERIC(26, 2))
    LEGALSURP = Column(NUMERIC(26, 2))
    STATEXTRUNDI = Column(NUMERIC(26, 2))
    PEXTCCAPIFD = Column(NUMERIC(26, 2))
    EXTSTAFFFUND = Column(NUMERIC(26, 2))
    TRUSTLOSS = Column(NUMERIC(26, 2))
    PEXTCDEVEFD = Column(NUMERIC(26, 2))
    PPROFRETUINVE = Column(NUMERIC(26, 2))
    PSUPPFLOWCAPI = Column(NUMERIC(26, 2))
    SUNEVENAVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    SFORMATAVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    SMERGERAVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    AVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    PREFSTOCKDIVI = Column(NUMERIC(26, 2))
    EXTRARBIRESE = Column(NUMERIC(26, 2))
    COMDIVPAYBABLE = Column(NUMERIC(26, 2))
    TURNCAPSDIVI = Column(NUMERIC(26, 2))
    SUNEVENUNDIPROF = Column(NUMERIC(26, 2))
    SFORMATUNDIPROF = Column(NUMERIC(26, 2))
    SMERGERUNDIPROF = Column(NUMERIC(26, 2))
    UNDIPROF = Column(NUMERIC(26, 2))
    SUNEVENOTHCOMPINCOAMT = Column(NUMERIC(26, 2))
    SUNEVENCOMPINCOAMT = Column(NUMERIC(26, 2))
    SUNEVENCOMPINCOAMTSUB = Column(NUMERIC(26, 2))
    SMERGERCOMPINCOAMTSUB = Column(NUMERIC(26, 2))
    NONCASSETSDISI = Column(NUMERIC(26, 2))
    NCPOTHINCO = Column(NUMERIC(26, 2))
    CINALIBOFRBP = Column(NUMERIC(26, 2))
    EQUMCPOTHINCO = Column(NUMERIC(26, 2))
    CPLTOHINCO = Column(NUMERIC(26, 2))
    EUQMICOLOTHINCO = Column(NUMERIC(26, 2))
    CINAFORSFV = Column(NUMERIC(26, 2))
    HTMCCINAFORSFV = Column(NUMERIC(26, 2))
    EPOCFHGL = Column(NUMERIC(26, 2))
    TDIFFFORCUR = Column(NUMERIC(26, 2))
    OTHERCPLTOHINCO = Column(NUMERIC(26, 2))
    ASSETSDISLINCO = Column(NUMERIC(26, 2))
    OTHERINCO = Column(NUMERIC(26, 2))
    CONOPERNPROFIT = Column(NUMERIC(26, 2))
    TEROPERNPROFIT = Column(NUMERIC(26, 2))
    INTERESTEXPENSE = Column(NUMERIC(26, 2))
    INTEINCOOPCOST = Column(NUMERIC(26, 2))
    CREDITIMPLOSSE = Column(NUMERIC(26, 2))
    NETEXPOHEDINC = Column(NUMERIC(26, 2))
    OTHEQUINFAVAL = Column(NUMERIC(26, 2))
    COMPCREDITFAVAL = Column(NUMERIC(26, 2))
    OTHDEBTINVFAVAL = Column(NUMERIC(26, 2))
    FINASSINTOOTHINCO = Column(NUMERIC(26, 2))
    OTHDEBTINVCREDIMPR = Column(NUMERIC(26, 2))
    OTHERSHAREDISTPROF = Column(NUMERIC(26, 2))
    HEDCASHFLOW = Column(NUMERIC(26, 2))
    EXTGENERISKRESE = Column(NUMERIC(26, 2))
    AMORTIZCOSTASSETSSAPI = Column(NUMERIC(26, 2))
    INTEPEDEPAYA = Column(NUMERIC(26, 2))
    ASSEIMPALOSSPROFIT = Column(NUMERIC(26, 2))
    CREDITIMPLOSSEPROFIT = Column(NUMERIC(26, 2))
    TMSTAMP = Column(Integer)
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }


class IncomeMRQ(Base):
    __tablename__ = 'income_mrq'
    __table_args__ = {"useexisting": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    BEGINDATE = Column(VARCHAR(8))
    ENDDATE = Column(VARCHAR(8))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    BIZTOTINCO = Column(NUMERIC(26, 2))
    BIZINCO = Column(NUMERIC(26, 2))
    INTEINCO = Column(NUMERIC(26, 2))
    EARNPREM = Column(NUMERIC(26, 2))
    POUNINCO = Column(NUMERIC(26, 2))
    REALSALE = Column(NUMERIC(26, 2))
    OTHERBIZINCO = Column(NUMERIC(26, 2))
    BIZTOTCOST = Column(NUMERIC(26, 2))
    BIZCOST = Column(NUMERIC(26, 2))
    INTEEXPE = Column(NUMERIC(26, 2))
    POUNEXPE = Column(NUMERIC(26, 2))
    REALSALECOST = Column(NUMERIC(26, 2))
    DEVEEXPE = Column(NUMERIC(26, 2))
    SURRGOLD = Column(NUMERIC(26, 2))
    COMPNETEXPE = Column(NUMERIC(26, 2))
    CONTRESS = Column(NUMERIC(26, 2))
    POLIDIVIEXPE = Column(NUMERIC(26, 2))
    REINEXPE = Column(NUMERIC(26, 2))
    OTHERBIZCOST = Column(NUMERIC(26, 2))
    BIZTAX = Column(NUMERIC(26, 2))
    SALESEXPE = Column(NUMERIC(26, 2))
    MANAEXPE = Column(NUMERIC(26, 2))
    FINEXPE = Column(NUMERIC(26, 2))
    ASSEIMPALOSS = Column(NUMERIC(26, 2))
    VALUECHGLOSS = Column(NUMERIC(26, 2))
    INVEINCO = Column(NUMERIC(26, 2))
    ASSOINVEPROF = Column(NUMERIC(26, 2))
    EXCHGGAIN = Column(NUMERIC(26, 2))
    FUTULOSS = Column(NUMERIC(26, 2))
    CUSTINCO = Column(NUMERIC(26, 2))
    SUBSIDYINCOME = Column(NUMERIC(26, 2))
    OTHERBIZPROF = Column(NUMERIC(26, 2))
    PERPROFIT = Column(NUMERIC(26, 2))
    NONOREVE = Column(NUMERIC(26, 2))
    NONOEXPE = Column(NUMERIC(26, 2))
    NONCASSETSDISL = Column(NUMERIC(26, 2))
    TOTPROFIT = Column(NUMERIC(26, 2))
    INCOTAXEXPE = Column(NUMERIC(26, 2))
    UNREINVELOSS = Column(NUMERIC(26, 2))
    NETPROFIT = Column(NUMERIC(26, 2))
    PARENETP = Column(NUMERIC(26, 2))
    MERGEFORMNETPROF = Column(NUMERIC(26, 2))
    MINYSHARRIGH = Column(NUMERIC(26, 2))
    BASICEPS = Column(NUMERIC(30, 6))
    DILUTEDEPS = Column(NUMERIC(30, 6))
    OTHERCOMPINCO = Column(NUMERIC(26, 2))
    PARECOMPINCO = Column(NUMERIC(26, 2))
    MINYSHARINCO = Column(NUMERIC(26, 2))
    COMPINCOAMT = Column(NUMERIC(26, 2))
    PARECOMPINCOAMT = Column(NUMERIC(26, 2))
    MINYSHARINCOAMT = Column(NUMERIC(26, 2))
    ISVALID = Column(INT)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    REPORTYEAR = Column(VARCHAR(10))
    REPORTDATETYPE = Column(VARCHAR(10))
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    ISREALACCSTA = Column(VARCHAR(10))
    ISACORRECT = Column(INT)
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    CUR = Column(VARCHAR(10))
    DECLAREDATE = Column(VARCHAR(8))
    MAINBIZINCO = Column(NUMERIC(26, 2))
    SUNEVENBIZTOTINCO = Column(NUMERIC(26, 2))
    SFORMATBIZTOTINCO = Column(NUMERIC(26, 2))
    SMERGERBIZTOTINCO = Column(NUMERIC(26, 2))
    MAINBIZCOST = Column(NUMERIC(26, 2))
    SUNEVENBIZTOTCOST = Column(NUMERIC(26, 2))
    SFORMATBIZTOTCOST = Column(NUMERIC(26, 2))
    SMERGERBIZTOTCOST = Column(NUMERIC(26, 2))
    SUNEVENPERPROFIT = Column(NUMERIC(26, 2))
    SFORMATPERPROFIT = Column(NUMERIC(26, 2))
    SMERGERPERPROFIT = Column(NUMERIC(26, 2))
    SUNEVENTOTPROFIT = Column(NUMERIC(26, 2))
    SFORMATTOTPROFIT = Column(NUMERIC(26, 2))
    SMERGERTOTPROFIT = Column(NUMERIC(26, 2))
    SUNEVENNETPROFIT = Column(NUMERIC(26, 2))
    SFORMATNETPROFIT = Column(NUMERIC(26, 2))
    SMERGERNETPROFIT = Column(NUMERIC(26, 2))
    SUNEVENNETPROFITSUB = Column(NUMERIC(26, 2))
    SFORMATNETPROFITSUB = Column(NUMERIC(26, 2))
    SMERGERNETPROFITSUB = Column(NUMERIC(26, 2))
    EARLYUNDIPROF = Column(NUMERIC(26, 2))
    RUNDISPROBYRREGCAP = Column(NUMERIC(26, 2))
    OTHERREASADJU = Column(NUMERIC(26, 2))
    SUNEVENAVAIDISTPROF = Column(NUMERIC(26, 2))
    SFORMATAVAIDISTPROF = Column(NUMERIC(26, 2))
    SMERGERAVAIDISTPROF = Column(NUMERIC(26, 2))
    AVAIDISTPROF = Column(NUMERIC(26, 2))
    LEGALSURP = Column(NUMERIC(26, 2))
    STATEXTRUNDI = Column(NUMERIC(26, 2))
    PEXTCCAPIFD = Column(NUMERIC(26, 2))
    EXTSTAFFFUND = Column(NUMERIC(26, 2))
    TRUSTLOSS = Column(NUMERIC(26, 2))
    PEXTCDEVEFD = Column(NUMERIC(26, 2))
    PPROFRETUINVE = Column(NUMERIC(26, 2))
    PSUPPFLOWCAPI = Column(NUMERIC(26, 2))
    SUNEVENAVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    SFORMATAVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    SMERGERAVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    AVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    PREFSTOCKDIVI = Column(NUMERIC(26, 2))
    EXTRARBIRESE = Column(NUMERIC(26, 2))
    COMDIVPAYBABLE = Column(NUMERIC(26, 2))
    TURNCAPSDIVI = Column(NUMERIC(26, 2))
    SUNEVENUNDIPROF = Column(NUMERIC(26, 2))
    SFORMATUNDIPROF = Column(NUMERIC(26, 2))
    SMERGERUNDIPROF = Column(NUMERIC(26, 2))
    UNDIPROF = Column(NUMERIC(26, 2))
    SUNEVENOTHCOMPINCOAMT = Column(NUMERIC(26, 2))
    SUNEVENCOMPINCOAMT = Column(NUMERIC(26, 2))
    SUNEVENCOMPINCOAMTSUB = Column(NUMERIC(26, 2))
    SMERGERCOMPINCOAMTSUB = Column(NUMERIC(26, 2))
    NONCASSETSDISI = Column(NUMERIC(26, 2))
    NCPOTHINCO = Column(NUMERIC(26, 2))
    CINALIBOFRBP = Column(NUMERIC(26, 2))
    EQUMCPOTHINCO = Column(NUMERIC(26, 2))
    CPLTOHINCO = Column(NUMERIC(26, 2))
    EUQMICOLOTHINCO = Column(NUMERIC(26, 2))
    CINAFORSFV = Column(NUMERIC(26, 2))
    HTMCCINAFORSFV = Column(NUMERIC(26, 2))
    EPOCFHGL = Column(NUMERIC(26, 2))
    TDIFFFORCUR = Column(NUMERIC(26, 2))
    OTHERCPLTOHINCO = Column(NUMERIC(26, 2))
    ASSETSDISLINCO = Column(NUMERIC(26, 2))
    OTHERINCO = Column(NUMERIC(26, 2))
    CONOPERNPROFIT = Column(NUMERIC(26, 2))
    TEROPERNPROFIT = Column(NUMERIC(26, 2))
    INTERESTEXPENSE = Column(NUMERIC(26, 2))
    INTEINCOOPCOST = Column(NUMERIC(26, 2))
    CREDITIMPLOSSE = Column(NUMERIC(26, 2))
    NETEXPOHEDINC = Column(NUMERIC(26, 2))
    OTHEQUINFAVAL = Column(NUMERIC(26, 2))
    COMPCREDITFAVAL = Column(NUMERIC(26, 2))
    OTHDEBTINVFAVAL = Column(NUMERIC(26, 2))
    FINASSINTOOTHINCO = Column(NUMERIC(26, 2))
    OTHDEBTINVCREDIMPR = Column(NUMERIC(26, 2))
    OTHERSHAREDISTPROF = Column(NUMERIC(26, 2))
    HEDCASHFLOW = Column(NUMERIC(26, 2))
    EXTGENERISKRESE = Column(NUMERIC(26, 2))
    AMORTIZCOSTASSETSSAPI = Column(NUMERIC(26, 2))
    INTEPEDEPAYA = Column(NUMERIC(26, 2))
    ASSEIMPALOSSPROFIT = Column(NUMERIC(26, 2))
    CREDITIMPLOSSEPROFIT = Column(NUMERIC(26, 2))
    TMSTAMP = Column(Integer)
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }


class IncomeReport(Base):
    __tablename__ = 'income_report'
    __table_args__ = {"useexisting": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    BEGINDATE = Column(VARCHAR(8))
    ENDDATE = Column(VARCHAR(8))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    BIZTOTINCO = Column(NUMERIC(26, 2))
    BIZINCO = Column(NUMERIC(26, 2))
    INTEINCO = Column(NUMERIC(26, 2))
    EARNPREM = Column(NUMERIC(26, 2))
    POUNINCO = Column(NUMERIC(26, 2))
    REALSALE = Column(NUMERIC(26, 2))
    OTHERBIZINCO = Column(NUMERIC(26, 2))
    BIZTOTCOST = Column(NUMERIC(26, 2))
    BIZCOST = Column(NUMERIC(26, 2))
    INTEEXPE = Column(NUMERIC(26, 2))
    POUNEXPE = Column(NUMERIC(26, 2))
    REALSALECOST = Column(NUMERIC(26, 2))
    DEVEEXPE = Column(NUMERIC(26, 2))
    SURRGOLD = Column(NUMERIC(26, 2))
    COMPNETEXPE = Column(NUMERIC(26, 2))
    CONTRESS = Column(NUMERIC(26, 2))
    POLIDIVIEXPE = Column(NUMERIC(26, 2))
    REINEXPE = Column(NUMERIC(26, 2))
    OTHERBIZCOST = Column(NUMERIC(26, 2))
    BIZTAX = Column(NUMERIC(26, 2))
    SALESEXPE = Column(NUMERIC(26, 2))
    MANAEXPE = Column(NUMERIC(26, 2))
    FINEXPE = Column(NUMERIC(26, 2))
    ASSEIMPALOSS = Column(NUMERIC(26, 2))
    VALUECHGLOSS = Column(NUMERIC(26, 2))
    INVEINCO = Column(NUMERIC(26, 2))
    ASSOINVEPROF = Column(NUMERIC(26, 2))
    EXCHGGAIN = Column(NUMERIC(26, 2))
    FUTULOSS = Column(NUMERIC(26, 2))
    CUSTINCO = Column(NUMERIC(26, 2))
    SUBSIDYINCOME = Column(NUMERIC(26, 2))
    OTHERBIZPROF = Column(NUMERIC(26, 2))
    PERPROFIT = Column(NUMERIC(26, 2))
    NONOREVE = Column(NUMERIC(26, 2))
    NONOEXPE = Column(NUMERIC(26, 2))
    NONCASSETSDISL = Column(NUMERIC(26, 2))
    TOTPROFIT = Column(NUMERIC(26, 2))
    INCOTAXEXPE = Column(NUMERIC(26, 2))
    UNREINVELOSS = Column(NUMERIC(26, 2))
    NETPROFIT = Column(NUMERIC(26, 2))
    PARENETP = Column(NUMERIC(26, 2))
    MERGEFORMNETPROF = Column(NUMERIC(26, 2))
    MINYSHARRIGH = Column(NUMERIC(26, 2))
    BASICEPS = Column(NUMERIC(30, 6))
    DILUTEDEPS = Column(NUMERIC(30, 6))
    OTHERCOMPINCO = Column(NUMERIC(26, 2))
    PARECOMPINCO = Column(NUMERIC(26, 2))
    MINYSHARINCO = Column(NUMERIC(26, 2))
    COMPINCOAMT = Column(NUMERIC(26, 2))
    PARECOMPINCOAMT = Column(NUMERIC(26, 2))
    MINYSHARINCOAMT = Column(NUMERIC(26, 2))
    ISVALID = Column(INT)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    REPORTYEAR = Column(VARCHAR(10))
    REPORTDATETYPE = Column(VARCHAR(10))
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    ISREALACCSTA = Column(VARCHAR(10))
    ISACORRECT = Column(INT)
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    CUR = Column(VARCHAR(10))
    DECLAREDATE = Column(VARCHAR(8))
    MAINBIZINCO = Column(NUMERIC(26, 2))
    SUNEVENBIZTOTINCO = Column(NUMERIC(26, 2))
    SFORMATBIZTOTINCO = Column(NUMERIC(26, 2))
    SMERGERBIZTOTINCO = Column(NUMERIC(26, 2))
    MAINBIZCOST = Column(NUMERIC(26, 2))
    SUNEVENBIZTOTCOST = Column(NUMERIC(26, 2))
    SFORMATBIZTOTCOST = Column(NUMERIC(26, 2))
    SMERGERBIZTOTCOST = Column(NUMERIC(26, 2))
    SUNEVENPERPROFIT = Column(NUMERIC(26, 2))
    SFORMATPERPROFIT = Column(NUMERIC(26, 2))
    SMERGERPERPROFIT = Column(NUMERIC(26, 2))
    SUNEVENTOTPROFIT = Column(NUMERIC(26, 2))
    SFORMATTOTPROFIT = Column(NUMERIC(26, 2))
    SMERGERTOTPROFIT = Column(NUMERIC(26, 2))
    SUNEVENNETPROFIT = Column(NUMERIC(26, 2))
    SFORMATNETPROFIT = Column(NUMERIC(26, 2))
    SMERGERNETPROFIT = Column(NUMERIC(26, 2))
    SUNEVENNETPROFITSUB = Column(NUMERIC(26, 2))
    SFORMATNETPROFITSUB = Column(NUMERIC(26, 2))
    SMERGERNETPROFITSUB = Column(NUMERIC(26, 2))
    EARLYUNDIPROF = Column(NUMERIC(26, 2))
    RUNDISPROBYRREGCAP = Column(NUMERIC(26, 2))
    OTHERREASADJU = Column(NUMERIC(26, 2))
    SUNEVENAVAIDISTPROF = Column(NUMERIC(26, 2))
    SFORMATAVAIDISTPROF = Column(NUMERIC(26, 2))
    SMERGERAVAIDISTPROF = Column(NUMERIC(26, 2))
    AVAIDISTPROF = Column(NUMERIC(26, 2))
    LEGALSURP = Column(NUMERIC(26, 2))
    STATEXTRUNDI = Column(NUMERIC(26, 2))
    PEXTCCAPIFD = Column(NUMERIC(26, 2))
    EXTSTAFFFUND = Column(NUMERIC(26, 2))
    TRUSTLOSS = Column(NUMERIC(26, 2))
    PEXTCDEVEFD = Column(NUMERIC(26, 2))
    PPROFRETUINVE = Column(NUMERIC(26, 2))
    PSUPPFLOWCAPI = Column(NUMERIC(26, 2))
    SUNEVENAVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    SFORMATAVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    SMERGERAVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    AVAIDISTSHAREPROF = Column(NUMERIC(26, 2))
    PREFSTOCKDIVI = Column(NUMERIC(26, 2))
    EXTRARBIRESE = Column(NUMERIC(26, 2))
    COMDIVPAYBABLE = Column(NUMERIC(26, 2))
    TURNCAPSDIVI = Column(NUMERIC(26, 2))
    SUNEVENUNDIPROF = Column(NUMERIC(26, 2))
    SFORMATUNDIPROF = Column(NUMERIC(26, 2))
    SMERGERUNDIPROF = Column(NUMERIC(26, 2))
    UNDIPROF = Column(NUMERIC(26, 2))
    SUNEVENOTHCOMPINCOAMT = Column(NUMERIC(26, 2))
    SUNEVENCOMPINCOAMT = Column(NUMERIC(26, 2))
    SUNEVENCOMPINCOAMTSUB = Column(NUMERIC(26, 2))
    SMERGERCOMPINCOAMTSUB = Column(NUMERIC(26, 2))
    NONCASSETSDISI = Column(NUMERIC(26, 2))
    NCPOTHINCO = Column(NUMERIC(26, 2))
    CINALIBOFRBP = Column(NUMERIC(26, 2))
    EQUMCPOTHINCO = Column(NUMERIC(26, 2))
    CPLTOHINCO = Column(NUMERIC(26, 2))
    EUQMICOLOTHINCO = Column(NUMERIC(26, 2))
    CINAFORSFV = Column(NUMERIC(26, 2))
    HTMCCINAFORSFV = Column(NUMERIC(26, 2))
    EPOCFHGL = Column(NUMERIC(26, 2))
    TDIFFFORCUR = Column(NUMERIC(26, 2))
    OTHERCPLTOHINCO = Column(NUMERIC(26, 2))
    ASSETSDISLINCO = Column(NUMERIC(26, 2))
    OTHERINCO = Column(NUMERIC(26, 2))
    CONOPERNPROFIT = Column(NUMERIC(26, 2))
    TEROPERNPROFIT = Column(NUMERIC(26, 2))
    INTERESTEXPENSE = Column(NUMERIC(26, 2))
    INTEINCOOPCOST = Column(NUMERIC(26, 2))
    CREDITIMPLOSSE = Column(NUMERIC(26, 2))
    NETEXPOHEDINC = Column(NUMERIC(26, 2))
    OTHEQUINFAVAL = Column(NUMERIC(26, 2))
    COMPCREDITFAVAL = Column(NUMERIC(26, 2))
    OTHDEBTINVFAVAL = Column(NUMERIC(26, 2))
    FINASSINTOOTHINCO = Column(NUMERIC(26, 2))
    OTHDEBTINVCREDIMPR = Column(NUMERIC(26, 2))
    OTHERSHAREDISTPROF = Column(NUMERIC(26, 2))
    HEDCASHFLOW = Column(NUMERIC(26, 2))
    EXTGENERISKRESE = Column(NUMERIC(26, 2))
    AMORTIZCOSTASSETSSAPI = Column(NUMERIC(26, 2))
    INTEPEDEPAYA = Column(NUMERIC(26, 2))
    ASSEIMPALOSSPROFIT = Column(NUMERIC(26, 2))
    CREDITIMPLOSSEPROFIT = Column(NUMERIC(26, 2))
    TMSTAMP = Column(Integer)
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }


class IndicatorReport(Base):
    __tablename__ = 'indicator_report'
    __table_args__ = {"useexisting": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    ENDPUBLISHDATE = Column(DATE)
    ENDDATE = Column(VARCHAR(8))
    REPORTDATETYPE = Column(VARCHAR(10))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    NPCUT = Column(NUMERIC(26, 6))
    EPSDILUTED = Column(NUMERIC(26, 6))
    EPSWEIGHTED = Column(NUMERIC(26, 6))
    EPSDILUTEDCUT = Column(NUMERIC(26, 6))
    EPSWEIGHTEDCUT = Column(NUMERIC(26, 6))
    EPSFULLDILUTED = Column(NUMERIC(26, 6))
    EPSBASIC = Column(NUMERIC(26, 6))
    EPSBASICEPSCUT = Column(NUMERIC(26, 6))
    ROEDILUTED = Column(NUMERIC(26, 6))
    ROEWEIGHTED = Column(NUMERIC(26, 6))
    ROEDILUTEDCUT = Column(NUMERIC(26, 6))
    ROEWEIGHTEDCUT = Column(NUMERIC(26, 6))
    NAPS = Column(NUMERIC(26, 6))
    NAPSADJ = Column(NUMERIC(26, 6))
    OPNCFPS = Column(NUMERIC(26, 6))
    EBIT = Column(NUMERIC(26, 6))
    EBITSCOVER = Column(NUMERIC(26, 6))
    EBITDA = Column(NUMERIC(26, 6))
    EBITDASCOVER = Column(NUMERIC(26, 6))
    EPSFULLDILUTEDCUT = Column(NUMERIC(26, 6))
    ROEDILUTEDMOP = Column(NUMERIC(26, 6))
    ROEWEIGHTEDMOP = Column(NUMERIC(26, 6))
    EPSDILUTEDMOP = Column(NUMERIC(26, 6))
    EPSWEIGHTEDMOP = Column(NUMERIC(26, 6))
    ROEDILUTEDOP = Column(NUMERIC(26, 6))
    ROEWEIGHTEDOP = Column(NUMERIC(26, 6))
    EPSDILUTEDOP = Column(NUMERIC(26, 6))
    EPSWEIGHTEDOP = Column(NUMERIC(26, 6))
    ISVALID = Column(INT)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    REPORTYEAR = Column(VARCHAR(10))
    BEGINDATE = Column(VARCHAR(8))
    CUR = Column(VARCHAR(10))
    ISREALACCSTA = Column(INT)
    ISACORRECT = Column(INT)
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    FINSTATMENTCODE = Column(INT)
    TMSTAMP = Column(Integer)
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }


class IndicatorMRQ(Base):
    __tablename__ = 'indicator_mrq'
    __table_args__ = {"useexisting": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    ENDPUBLISHDATE = Column(DATE)
    ENDDATE = Column(VARCHAR(8))
    REPORTDATETYPE = Column(VARCHAR(10))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    NPCUT = Column(NUMERIC(26, 6))
    EPSDILUTED = Column(NUMERIC(26, 6))
    EPSWEIGHTED = Column(NUMERIC(26, 6))
    EPSDILUTEDCUT = Column(NUMERIC(26, 6))
    EPSWEIGHTEDCUT = Column(NUMERIC(26, 6))
    EPSFULLDILUTED = Column(NUMERIC(26, 6))
    EPSBASIC = Column(NUMERIC(26, 6))
    EPSBASICEPSCUT = Column(NUMERIC(26, 6))
    ROEDILUTED = Column(NUMERIC(26, 6))
    ROEWEIGHTED = Column(NUMERIC(26, 6))
    ROEDILUTEDCUT = Column(NUMERIC(26, 6))
    ROEWEIGHTEDCUT = Column(NUMERIC(26, 6))
    NAPS = Column(NUMERIC(26, 6))
    NAPSADJ = Column(NUMERIC(26, 6))
    OPNCFPS = Column(NUMERIC(26, 6))
    EBIT = Column(NUMERIC(26, 6))
    EBITSCOVER = Column(NUMERIC(26, 6))
    EBITDA = Column(NUMERIC(26, 6))
    EBITDASCOVER = Column(NUMERIC(26, 6))
    EPSFULLDILUTEDCUT = Column(NUMERIC(26, 6))
    ROEDILUTEDMOP = Column(NUMERIC(26, 6))
    ROEWEIGHTEDMOP = Column(NUMERIC(26, 6))
    EPSDILUTEDMOP = Column(NUMERIC(26, 6))
    EPSWEIGHTEDMOP = Column(NUMERIC(26, 6))
    ROEDILUTEDOP = Column(NUMERIC(26, 6))
    ROEWEIGHTEDOP = Column(NUMERIC(26, 6))
    EPSDILUTEDOP = Column(NUMERIC(26, 6))
    EPSWEIGHTEDOP = Column(NUMERIC(26, 6))
    ISVALID = Column(INT)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    REPORTYEAR = Column(VARCHAR(10))
    BEGINDATE = Column(VARCHAR(8))
    CUR = Column(VARCHAR(10))
    ISREALACCSTA = Column(INT)
    ISACORRECT = Column(INT)
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    FINSTATMENTCODE = Column(INT)
    TMSTAMP = Column(Integer)
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }


class IndicatorTTM(Base):
    __tablename__ = 'indicator_ttm'
    __table_args__ = {"useexisting": True}
    ID = Column(VARCHAR(32), primary_key=True)
    CHID = Column(INT)
    COMPCODE = Column(VARCHAR(20))
    PUBLISHDATE = Column(DATE)
    ENDPUBLISHDATE = Column(DATE)
    ENDDATE = Column(VARCHAR(8))
    REPORTDATETYPE = Column(VARCHAR(10))
    REPORTTYPE = Column(VARCHAR(10))
    ACCSTACODE = Column(VARCHAR(10))
    NPCUT = Column(NUMERIC(26, 6))
    EPSDILUTED = Column(NUMERIC(26, 6))
    EPSWEIGHTED = Column(NUMERIC(26, 6))
    EPSDILUTEDCUT = Column(NUMERIC(26, 6))
    EPSWEIGHTEDCUT = Column(NUMERIC(26, 6))
    EPSFULLDILUTED = Column(NUMERIC(26, 6))
    EPSBASIC = Column(NUMERIC(26, 6))
    EPSBASICEPSCUT = Column(NUMERIC(26, 6))
    ROEDILUTED = Column(NUMERIC(26, 6))
    ROEWEIGHTED = Column(NUMERIC(26, 6))
    ROEDILUTEDCUT = Column(NUMERIC(26, 6))
    ROEWEIGHTEDCUT = Column(NUMERIC(26, 6))
    NAPS = Column(NUMERIC(26, 6))
    NAPSADJ = Column(NUMERIC(26, 6))
    OPNCFPS = Column(NUMERIC(26, 6))
    EBIT = Column(NUMERIC(26, 6))
    EBITSCOVER = Column(NUMERIC(26, 6))
    EBITDA = Column(NUMERIC(26, 6))
    EBITDASCOVER = Column(NUMERIC(26, 6))
    EPSFULLDILUTEDCUT = Column(NUMERIC(26, 6))
    ROEDILUTEDMOP = Column(NUMERIC(26, 6))
    ROEWEIGHTEDMOP = Column(NUMERIC(26, 6))
    EPSDILUTEDMOP = Column(NUMERIC(26, 6))
    EPSWEIGHTEDMOP = Column(NUMERIC(26, 6))
    ROEDILUTEDOP = Column(NUMERIC(26, 6))
    ROEWEIGHTEDOP = Column(NUMERIC(26, 6))
    EPSDILUTEDOP = Column(NUMERIC(26, 6))
    EPSWEIGHTEDOP = Column(NUMERIC(26, 6))
    ISVALID = Column(INT)
    ENTRYDATE = Column(DATE)
    ENTRYTIME = Column(VARCHAR(8))
    REPORTYEAR = Column(VARCHAR(10))
    BEGINDATE = Column(VARCHAR(8))
    CUR = Column(VARCHAR(10))
    ISREALACCSTA = Column(INT)
    ISACORRECT = Column(INT)
    ISAUDIT = Column(INT)
    INTEGRITY = Column(VARCHAR(10))
    DATASOURCE = Column(VARCHAR(10))
    ISACTPUB = Column(INT)
    FINSTATMENTCODE = Column(INT)
    TMSTAMP = Column(Integer)
    creat_time = Column(DATE)
    update_time = Column(DATE)
    __pit_column__ = {
        'pub_date': PUBLISHDATE,
        'filter_date': ENDDATE,
        'index': COMPCODE
    }


# foreman/data_refinery_foreman/foreman/test_main.py (cgreene/refinebio, BSD-3-Clause)
import datetime
import math
import time
from django.utils import timezone
from django.test import TransactionTestCase, TestCase
from data_refinery_foreman.foreman import main
from data_refinery_common.models import (
ComputedFile,
ComputationalResult,
Dataset,
DownloaderJob,
DownloaderJobOriginalFileAssociation,
Experiment,
ExperimentSampleAssociation,
Organism,
OriginalFile,
OriginalFileSampleAssociation,
ProcessorJob,
ProcessorJobDatasetAssociation,
ProcessorJobOriginalFileAssociation,
Sample,
SampleComputedFileAssociation,
SurveyJob,
SurveyJobKeyValue,
)
from test.support import EnvironmentVarGuard # Python >=3
# For use in tests that test the JOB_CREATED_AT_CUTOFF functionality.
DAY_BEFORE_JOB_CUTOFF = main.JOB_CREATED_AT_CUTOFF - datetime.timedelta(days=1)
class ForemanTestCase(TestCase):
    def create_downloader_job(self, suffix="e8eaf540"):
        job = DownloaderJob(downloader_task="SRA",
                            nomad_job_id="DOWNLOADER/dispatch-1528945054-" + suffix,
                            num_retries=0,
                            accession_code="NUNYA",
                            success=None)
        job.save()

        og_file = OriginalFile()
        og_file.source_filename = "doesn't matter"
        og_file.filename = "this either"
        og_file.absolute_file_path = "nor this"
        og_file.save()

        assoc1 = DownloaderJobOriginalFileAssociation()
        assoc1.original_file = og_file
        assoc1.downloader_job = job
        assoc1.save()

        og_file = OriginalFile()
        og_file.source_filename = "doesn't matter"
        og_file.filename = "this either"
        og_file.absolute_file_path = "nor this"
        og_file.save()

        assoc = DownloaderJobOriginalFileAssociation()
        assoc.original_file = og_file
        assoc.downloader_job = job
        assoc.save()

        return job

    @patch('data_refinery_foreman.foreman.main.send_job')
    def test_requeuing_downloader_job(self, mock_send_job):
        mock_send_job.return_value = True

        job = self.create_downloader_job()

        main.requeue_downloader_job(job)
        self.assertEqual(len(mock_send_job.mock_calls), 1)

        jobs = DownloaderJob.objects.order_by('id')
        original_job = jobs[0]
        self.assertTrue(original_job.retried)
        self.assertEqual(original_job.num_retries, 0)
        self.assertFalse(original_job.success)

        retried_job = jobs[1]
        self.assertEqual(retried_job.num_retries, 1)
        self.assertEqual(retried_job.original_files.count(), 2)

    @patch('data_refinery_foreman.foreman.main.send_job')
    def test_repeated_download_failures(self, mock_send_job):
        """Jobs will be repeatedly retried."""
        mock_send_job.return_value = True

        job = self.create_downloader_job()

        for i in range(main.MAX_NUM_RETRIES):
            main.handle_downloader_jobs([job])
            self.assertEqual(i + 1, len(mock_send_job.mock_calls))

            jobs = DownloaderJob.objects.all().order_by("-id")
            previous_job = jobs[1]
            self.assertTrue(previous_job.retried)
            self.assertEqual(previous_job.num_retries, i)
            self.assertFalse(previous_job.success)

            job = jobs[0]
            self.assertFalse(job.retried)
            self.assertEqual(job.num_retries, i + 1)

        # Once MAX_NUM_RETRIES has been hit handle_repeated_failure
        # should be called.
        main.handle_downloader_jobs([job])

        last_job = DownloaderJob.objects.all().order_by("-id")[0]
        self.assertTrue(last_job.retried)
        self.assertEqual(last_job.num_retries, main.MAX_NUM_RETRIES)
        self.assertFalse(last_job.success)

    @patch('data_refinery_foreman.foreman.main.send_job')
    def test_retrying_failed_downloader_jobs(self, mock_send_job):
        mock_send_job.return_value = True

        job = self.create_downloader_job()
        job.success = False
        job.save()

        main.retry_failed_downloader_jobs()
        self.assertEqual(len(mock_send_job.mock_calls), 1)

        jobs = DownloaderJob.objects.order_by('id')
        original_job = jobs[0]
        self.assertTrue(original_job.retried)
        self.assertEqual(original_job.num_retries, 0)
        self.assertFalse(original_job.success)

        retried_job = jobs[1]
        self.assertEqual(retried_job.num_retries, 1)

    @patch('data_refinery_foreman.foreman.main.get_active_volumes')
    @patch('data_refinery_foreman.foreman.main.get_nomad_jobs_breakdown')
    @patch('data_refinery_foreman.foreman.main.send_job')
    def test_retrying_many_failed_downloader_jobs(self, mock_send_job, mock_breakdown, mock_active_volumes):
        mock_send_job.return_value = True
        mock_breakdown.return_value = {"nomad_pending_jobs_by_volume": {"0": 7, "1": 9},
                                       "nomad_running_jobs_by_volume": {"0": 300, "1": 400}}
        mock_active_volumes.return_value = ['0', '1']

        main.update_volume_work_depth(datetime.timedelta(0))
        self.assertEqual(main.VOLUME_WORK_DEPTH, {"0": 307, "1": 409})

        # Ensure that there are at least enough jobs to saturate the desired work depth
        # for both mocked volumes
        NUM_PAGES = 4 + math.ceil(2 * main.DESIRED_WORK_DEPTH / main.PAGE_SIZE)
        for x in range(0, main.PAGE_SIZE * NUM_PAGES):
            job = self.create_downloader_job(str(x))
            job.success = False
            job.save()

        main.retry_failed_downloader_jobs()

        # No jobs actually make it in Nomad queue, but we keep a tally of the last reported work
        # depth plus any new queued jobs, so this should only queue up enough jobs to fill the
        # DESIRED_WORK_DEPTH for every node
        # ((DESIRED_WORK_DEPTH - 307) + (DESIRED_WORK_DEPTH - 409) jobs in total)
        self.assertEqual(len(mock_send_job.mock_calls), 2 * main.DESIRED_WORK_DEPTH - 307 - 409)
        self.assertEqual(main.VOLUME_WORK_DEPTH,
                         {"0": main.DESIRED_WORK_DEPTH, "1": main.DESIRED_WORK_DEPTH})

        jobs = DownloaderJob.objects.order_by('id')
        original_job = jobs[0]
        self.assertTrue(original_job.retried)
        self.assertEqual(original_job.num_retries, 0)
        self.assertFalse(original_job.success)

        retried_job = jobs[main.PAGE_SIZE * NUM_PAGES + 1]
        self.assertEqual(retried_job.num_retries, 1)

    @patch('data_refinery_foreman.foreman.main.send_job')
    @patch('data_refinery_foreman.foreman.main.Nomad')
    def test_retrying_hung_downloader_jobs(self, mock_nomad, mock_send_job):
        mock_send_job.return_value = True

        def mock_init_nomad(host, port=0, timeout=0):
            ret_value = MagicMock()
            ret_value.job = MagicMock()
            ret_value.job.get_job = MagicMock()
            ret_value.job.get_job.side_effect = lambda _: {"Status": "dead"}
            return ret_value

        mock_nomad.side_effect = mock_init_nomad

        job = self.create_downloader_job()
        job.start_time = timezone.now()
        job.save()

        main.retry_hung_downloader_jobs()
        self.assertEqual(len(mock_send_job.mock_calls), 1)

        jobs = DownloaderJob.objects.order_by('id')
        original_job = jobs[0]
        self.assertTrue(original_job.retried)
        self.assertEqual(original_job.num_retries, 0)
        self.assertFalse(original_job.success)

        retried_job = jobs[1]
        self.assertEqual(retried_job.num_retries, 1)

    @patch('data_refinery_foreman.foreman.main.send_job')
    @patch('data_refinery_foreman.foreman.main.Nomad')
    def test_not_retrying_hung_downloader_jobs(self, mock_nomad, mock_send_job):
        """Tests that we don't restart downloader jobs that are still running."""
        mock_send_job.return_value = True

        def mock_init_nomad(host, port=0, timeout=0):
            ret_value = MagicMock()
            ret_value.job = MagicMock()
            ret_value.job.get_job = MagicMock()
            ret_value.job.get_job.side_effect = lambda _: {"Status": "running"}
            return ret_value

        mock_nomad.side_effect = mock_init_nomad

        job = self.create_downloader_job()
        job.start_time = timezone.now()
        job.save()

        main.retry_hung_downloader_jobs()
        self.assertEqual(len(mock_send_job.mock_calls), 0)

        jobs = DownloaderJob.objects.order_by('id')
        original_job = jobs[0]
        self.assertFalse(original_job.retried)
        self.assertEqual(original_job.num_retries, 0)
        self.assertEqual(original_job.success, None)
        self.assertEqual(jobs.count(), 1)

    @patch('data_refinery_foreman.foreman.main.send_job')
    @patch('data_refinery_foreman.foreman.main.Nomad')
    def test_retrying_lost_downloader_jobs(self, mock_nomad, mock_send_job):
        mock_send_job.return_value = True

        def mock_init_nomad(host, port=0, timeout=0):
            ret_value = MagicMock()
            ret_value.job = MagicMock()
            ret_value.job.get_job = MagicMock()
            ret_value.job.get_job.side_effect = lambda _: {"Status": "dead"}
            return ret_value

        mock_nomad.side_effect = mock_init_nomad

        job = self.create_downloader_job()
        job.created_at = timezone.now()
        job.save()

        main.retry_lost_downloader_jobs()
        self.assertEqual(len(mock_send_job.mock_calls), 1)

        jobs = DownloaderJob.objects.order_by('id')
        original_job = jobs[0]
        self.assertTrue(original_job.retried)
        self.assertEqual(original_job.num_retries, 0)
        self.assertFalse(original_job.success)

        retried_job = jobs[1]
        self.assertEqual(retried_job.num_retries, 1)

    @patch('data_refinery_foreman.foreman.main.send_job')
    @patch('data_refinery_foreman.foreman.main.Nomad')
    def test_not_retrying_old_downloader_jobs(self, mock_nomad, mock_send_job):
        """Makes sure temporary logic to limit the Foreman's scope works."""
        mock_send_job.return_value = True

        def mock_init_nomad(host, port=0, timeout=0):
            ret_value = MagicMock()
            ret_value.job = MagicMock()
            ret_value.job.get_job = MagicMock()
            ret_value.job.get_job.side_effect = lambda _: {"Status": "dead"}
            return ret_value

        mock_nomad.side_effect = mock_init_nomad
job = self.create_downloader_job()
job.created_at = DAY_BEFORE_JOB_CUTOFF
job.save()
main.retry_lost_downloader_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 0)
self.assertEqual(DownloaderJob.objects.count(), 1)
@patch('data_refinery_foreman.foreman.main.send_job')
def test_retrying_lost_downloader_jobs_time(self, mock_send_job):
mock_send_job.return_value = True
job = self.create_downloader_job()
job.created_at = timezone.now() - (main.MIN_LOOP_TIME + datetime.timedelta(minutes=1))
job.save()
main.retry_lost_downloader_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = DownloaderJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
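The cutoff this test exercises, a job created more than MIN_LOOP_TIME ago that never started, reduces to a single comparison. The timedelta value below is an assumption for illustration; only the comparison's shape comes from the test.

```python
import datetime

MIN_LOOP_TIME = datetime.timedelta(minutes=2)  # assumed value for the sketch

def is_lost(created_at, now):
    """A never-started job older than MIN_LOOP_TIME is considered lost."""
    return created_at < now - MIN_LOOP_TIME
```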
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_not_retrying_lost_downloader_jobs(self, mock_nomad, mock_send_job):
"""Make sure that we don't retry downloader jobs we shouldn't."""
mock_send_job.return_value = True
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "pending"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_downloader_job()
job.created_at = timezone.now()
job.save()
main.retry_lost_downloader_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 0)
jobs = DownloaderJob.objects.order_by('id')
original_job = jobs[0]
self.assertFalse(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertEqual(original_job.success, None)
# Make sure no additional job was created.
self.assertEqual(jobs.count(), 1)
def create_processor_job(self, pipeline="AFFY_TO_PCL", ram_amount=2048, start_time=None):
job = ProcessorJob(pipeline_applied=pipeline,
nomad_job_id="PROCESSOR/dispatch-1528945054-e8eaf540",
ram_amount=ram_amount,
num_retries=0,
volume_index="1",
success=None,
start_time=start_time)
job.save()
og_file = OriginalFile()
og_file.source_filename = "doesn't matter"
og_file.filename = "this either"
og_file.absolute_file_path = "nor this"
og_file.save()
assoc1 = ProcessorJobOriginalFileAssociation()
assoc1.original_file = og_file
assoc1.processor_job = job
assoc1.save()
og_file = OriginalFile()
og_file.source_filename = "doesn't matter"
og_file.filename = "this either"
og_file.absolute_file_path = "nor this"
og_file.save()
assoc = ProcessorJobOriginalFileAssociation()
assoc.original_file = og_file
assoc.processor_job = job
assoc.save()
return job
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
def test_requeuing_processor_job(self, mock_send_job, mock_get_active_volumes):
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
job = self.create_processor_job()
main.requeue_processor_job(job)
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
def test_requeuing_processor_job_no_volume(self, mock_send_job, mock_get_active_volumes):
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
job = self.create_processor_job()
job.volume_index = None
job.save()
self.env = EnvironmentVarGuard()
self.env.set('RUNNING_IN_CLOUD', 'True')
with self.settings(RUNNING_IN_CLOUD=True):
main.requeue_processor_job(job)
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
self.assertIn(retried_job.volume_index, ["1", "2", "3"])
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
def test_requeuing_compendia_job_no_volume(self, mock_send_job, mock_get_active_volumes):
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
job = self.create_processor_job()
job.volume_index = None
job.pipeline_applied = "CREATE_COMPENDIA"
job.save()
self.env = EnvironmentVarGuard()
self.env.set('RUNNING_IN_CLOUD', 'True')
with self.settings(RUNNING_IN_CLOUD=True):
main.requeue_processor_job(job)
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
self.assertEqual(retried_job.volume_index, None)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
def test_requeuing_processor_job_w_more_ram(self, mock_send_job, mock_get_active_volumes):
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
job = self.create_processor_job(pipeline="SALMON", ram_amount=16384, start_time=timezone.now())
main.requeue_processor_job(job)
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
self.assertEqual(original_job.ram_amount, 16384)
self.assertEqual(retried_job.ram_amount, 32768)
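The RAM escalation asserted above (16384 → 32768 for a retried SALMON job) amounts to stepping up to the next allocation size. A hypothetical tier-based sketch; the tier list, function name, and capping behavior are assumptions, not the Foreman's actual requeue rule.

```python
# Illustrative fixed tiers; the assertion above only pins down 16384 -> 32768.
RAM_TIERS = [2048, 4096, 8192, 16384, 32768]

def escalate_ram(ram_amount):
    """Return the next RAM tier above the failed attempt's allocation."""
    for tier in RAM_TIERS:
        if tier > ram_amount:
            return tier
    # Already at (or above) the largest tier: stay there.
    return RAM_TIERS[-1]
```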
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
def test_repeated_processor_failures(self, mock_send_job, mock_get_active_volumes):
"""Jobs will be repeatedly retried."""
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
job = self.create_processor_job()
for i in range(main.MAX_NUM_RETRIES):
main.handle_processor_jobs([job])
self.assertEqual(i + 1, len(mock_send_job.mock_calls))
jobs = ProcessorJob.objects.all().order_by("-id")
previous_job = jobs[1]
self.assertTrue(previous_job.retried)
self.assertEqual(previous_job.num_retries, i)
self.assertFalse(previous_job.success)
job = jobs[0]
self.assertFalse(job.retried)
self.assertEqual(job.num_retries, i + 1)
# Once MAX_NUM_RETRIES has been hit handle_repeated_failure
# should be called.
main.handle_processor_jobs([job])
last_job = ProcessorJob.objects.all().order_by("-id")[0]
self.assertTrue(last_job.retried)
self.assertEqual(last_job.num_retries, main.MAX_NUM_RETRIES)
self.assertFalse(last_job.success)
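The retry ceiling this test (and test_repeated_survey_failures below) exercises reduces to one decision: requeue while num_retries is below MAX_NUM_RETRIES, otherwise treat the job as a repeated failure. A sketch with illustrative names and an assumed limit:

```python
MAX_NUM_RETRIES = 2  # assumed value for the sketch

def next_action(num_retries):
    """Decide what the Foreman does with a failed job."""
    if num_retries >= MAX_NUM_RETRIES:
        # Mirrors handle_repeated_failure being called once the limit is hit.
        return "handle_repeated_failure"
    return "requeue"
```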
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
def test_retrying_failed_processor_jobs(self, mock_send_job, mock_get_active_volumes):
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
job = self.create_processor_job()
job.success = False
job.save()
main.retry_failed_processor_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
def test_not_retrying_wrong_volume_index(self, mock_send_job, mock_get_active_volumes):
"""If a volume isn't mounted then we shouldn't queue jobs for it."""
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"2", "3"}
job = self.create_processor_job()
job.success = False
job.save()
main.retry_failed_processor_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 0)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertFalse(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
self.assertEqual(len(jobs), 1)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_retrying_hung_processor_jobs(self, mock_nomad, mock_send_job, mock_get_active_volumes):
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "dead"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_processor_job()
job.start_time = timezone.now()
job.save()
main.retry_hung_processor_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_not_retrying_hung_processor_jobs(self, mock_nomad, mock_send_job, mock_get_active_volumes):
"""Tests that we don't restart processor jobs that are still running."""
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "running"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_processor_job()
job.start_time = timezone.now()
job.save()
main.retry_hung_processor_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 0)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertFalse(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertEqual(original_job.success, None)
self.assertEqual(jobs.count(), 1)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_retrying_lost_processor_jobs(self, mock_nomad, mock_send_job, mock_get_active_volumes):
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "dead"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_processor_job()
job.created_at = timezone.now()
job.save()
main.retry_lost_processor_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_retrying_lost_smasher_jobs(self, mock_nomad, mock_send_job, mock_get_active_volumes):
"""Make sure that the smasher jobs will get retried even though they
don't have a volume_index.
"""
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "dead"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_processor_job(pipeline="SMASHER")
job.volume_index = None # Smasher jobs won't have a volume_index.
job.created_at = timezone.now()
job.save()
main.retry_lost_processor_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_not_retrying_old_processor_jobs(self, mock_nomad, mock_send_job):
"""Makes sure temporary logic to limit the Foreman's scope works."""
mock_send_job.return_value = True
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "dead"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_processor_job()
job.created_at = DAY_BEFORE_JOB_CUTOFF
job.save()
main.retry_lost_processor_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 0)
self.assertEqual(ProcessorJob.objects.count(), 1)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_not_retrying_lost_processor_jobs(self, mock_nomad, mock_send_job, mock_get_active_volumes):
"""Make sure that we don't retry processor jobs we shouldn't."""
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "pending"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_processor_job()
job.created_at = timezone.now()
job.save()
main.retry_lost_processor_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 0)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertFalse(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertEqual(original_job.success, None)
# Make sure no additional job was created.
self.assertEqual(jobs.count(), 1)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
def test_retrying_lost_processor_jobs_time(self, mock_send_job, mock_get_active_volumes):
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
job = self.create_processor_job()
job.created_at = timezone.now() - (main.MIN_LOOP_TIME + datetime.timedelta(minutes=1))
job.save()
main.retry_lost_processor_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = ProcessorJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
def test_not_retrying_janitor_jobs(self, mock_send_job, mock_get_active_volumes):
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
job = self.create_processor_job(pipeline="JANITOR")
job.created_at = timezone.now() - (main.MIN_LOOP_TIME + datetime.timedelta(minutes=1))
job.save()
main.retry_lost_processor_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 0)
jobs = ProcessorJob.objects.order_by('id')
self.assertEqual(len(jobs), 1)
def create_survey_job(self):
job = SurveyJob(source_type="SRA",
nomad_job_id="SURVEYOR/dispatch-1528945054-e8eaf540",
num_retries=0, success=None)
job.save()
sjkv = SurveyJobKeyValue()
sjkv.key = "experiment_accession_code"
sjkv.value = "RJ-1234-XYZ"
sjkv.survey_job = job
sjkv.save()
return job
@patch('data_refinery_foreman.foreman.main.send_job')
def test_requeuing_survey_job(self, mock_send_job):
mock_send_job.return_value = True
job = self.create_survey_job()
main.requeue_survey_job(job)
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = SurveyJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.send_job')
def test_repeated_survey_failures(self, mock_send_job):
"""Jobs will be repeatedly retried."""
mock_send_job.return_value = True
job = self.create_survey_job()
for i in range(main.MAX_NUM_RETRIES):
main.handle_survey_jobs([job])
self.assertEqual(i + 1, len(mock_send_job.mock_calls))
jobs = SurveyJob.objects.all().order_by("-id")
previous_job = jobs[1]
self.assertTrue(previous_job.retried)
self.assertEqual(previous_job.num_retries, i)
self.assertFalse(previous_job.success)
job = jobs[0]
self.assertFalse(job.retried)
self.assertEqual(job.num_retries, i + 1)
# Once MAX_NUM_RETRIES has been hit handle_repeated_failure
# should be called.
main.handle_survey_jobs([job])
last_job = SurveyJob.objects.all().order_by("-id")[0]
self.assertTrue(last_job.retried)
self.assertEqual(last_job.num_retries, main.MAX_NUM_RETRIES)
self.assertFalse(last_job.success)
# MAX TOTAL tests
self.env = EnvironmentVarGuard()
self.env.set('MAX_TOTAL_JOBS', '0')
with self.env:
job = self.create_survey_job()
result = main.handle_survey_jobs([job])
self.assertFalse(result)
self.env.set('MAX_TOTAL_JOBS', '1000')
with self.env:
job = self.create_survey_job()
result = main.requeue_survey_job(job)
self.assertTrue(result)
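The MAX_TOTAL_JOBS checks above boil down to an environment-driven throttle: with the limit at 0 nothing is handled, with a large limit jobs go through. A sketch of that guard; the helper name and default are assumptions.

```python
import os

def under_job_limit(current_job_count):
    """True when another job may be queued under the MAX_TOTAL_JOBS cap."""
    max_total_jobs = int(os.environ.get("MAX_TOTAL_JOBS", "1000"))
    return current_job_count < max_total_jobs
```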
@patch('data_refinery_foreman.foreman.main.send_job')
def test_retrying_failed_survey_jobs(self, mock_send_job):
mock_send_job.return_value = True
job = self.create_survey_job()
job.success = False
job.save()
main.retry_failed_survey_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = SurveyJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_retrying_hung_survey_jobs(self, mock_nomad, mock_send_job):
mock_send_job.return_value = True
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "dead"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_survey_job()
job.start_time = timezone.now()
job.save()
main.retry_hung_survey_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = SurveyJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_not_retrying_hung_survey_jobs(self, mock_nomad, mock_send_job):
"""Tests that we don't restart survey jobs that are still running."""
mock_send_job.return_value = True
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "running"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_survey_job()
job.start_time = timezone.now()
job.save()
main.retry_hung_survey_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 0)
jobs = SurveyJob.objects.order_by('id')
original_job = jobs[0]
self.assertFalse(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertEqual(original_job.success, None)
self.assertEqual(jobs.count(), 1)
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_retrying_lost_survey_jobs(self, mock_nomad, mock_send_job):
mock_send_job.return_value = True
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "dead"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_survey_job()
job.created_at = timezone.now()
job.save()
main.retry_lost_survey_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = SurveyJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_not_retrying_old_survey_jobs(self, mock_nomad, mock_send_job):
"""Makes sure temporary logic to limit the Foreman's scope works."""
mock_send_job.return_value = True
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "dead"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_survey_job()
job.created_at = DAY_BEFORE_JOB_CUTOFF
job.save()
main.retry_lost_survey_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 0)
self.assertEqual(SurveyJob.objects.count(), 1)
@patch('data_refinery_foreman.foreman.main.send_job')
@patch('data_refinery_foreman.foreman.main.Nomad')
def test_not_retrying_lost_survey_jobs(self, mock_nomad, mock_send_job):
"""Make sure that we don't retry survey jobs we shouldn't."""
mock_send_job.return_value = True
def mock_init_nomad(host, port=0, timeout=0):
ret_value = MagicMock()
ret_value.job = MagicMock()
ret_value.job.get_job = MagicMock()
ret_value.job.get_job.side_effect = lambda _: {"Status": "pending"}
return ret_value
mock_nomad.side_effect = mock_init_nomad
job = self.create_survey_job()
job.created_at = timezone.now()
job.save()
main.retry_lost_survey_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 0)
jobs = SurveyJob.objects.order_by('id')
original_job = jobs[0]
self.assertFalse(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertEqual(original_job.success, None)
# Make sure no additional job was created.
self.assertEqual(jobs.count(), 1)
@patch('data_refinery_foreman.foreman.main.send_job')
def test_retrying_lost_survey_jobs_time(self, mock_send_job):
mock_send_job.return_value = True
job = self.create_survey_job()
job.created_at = timezone.now() - (main.MIN_LOOP_TIME + datetime.timedelta(minutes=1))
job.save()
main.retry_lost_survey_jobs()
self.assertEqual(len(mock_send_job.mock_calls), 1)
jobs = SurveyJob.objects.order_by('id')
original_job = jobs[0]
self.assertTrue(original_job.retried)
self.assertEqual(original_job.num_retries, 0)
self.assertFalse(original_job.success)
retried_job = jobs[1]
self.assertEqual(retried_job.num_retries, 1)
@patch('data_refinery_foreman.foreman.main.get_active_volumes')
@patch('data_refinery_foreman.foreman.main.send_job')
def test_janitor(self, mock_send_job, mock_get_active_volumes):
mock_send_job.return_value = True
mock_get_active_volumes.return_value = {"1", "2", "3"}
for p in ["1", "2", "3"]:
pj = ProcessorJob()
pj.volume_index = p
pj.save()
main.send_janitor_jobs()
self.assertEqual(ProcessorJob.objects.all().count(), 7)
self.assertEqual(ProcessorJob.objects.filter(pipeline_applied="JANITOR").count(), 4)
# Make sure that the janitors are dispatched to the correct volumes.
ixs = ["1", "2", "3", None]
for p in ProcessorJob.objects.filter(pipeline_applied="JANITOR"):
self.assertTrue(p.volume_index in ixs)
ixs.remove(p.volume_index)
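The dispatch pattern asserted above, one janitor per active volume plus one with no volume_index, can be sketched as a pure function. This is illustrative only: the test implies this shape, but the real send_janitor_jobs may bind or order the jobs differently.

```python
def janitor_volume_indexes(active_volumes):
    """One janitor per mounted volume, plus one unbound janitor (None)."""
    return sorted(active_volumes) + [None]
```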
class CleanDatabaseTestCase(TransactionTestCase):
def test_cleandb(self):
sample = Sample()
sample.save()
result = ComputationalResult()
result.save()
good_file = ComputedFile()
good_file.s3_bucket = "my_cool_bucket"
good_file.s3_key = "my_sweet_key"
good_file.size_in_bytes = 1337
good_file.result = result
good_file.is_public = True
good_file.is_smashable = True
good_file.save()
sca = SampleComputedFileAssociation()
sca.sample = sample
sca.computed_file = good_file
sca.save()
bad_file = ComputedFile()
bad_file.s3_bucket = None
bad_file.s3_key = None
bad_file.result = result
bad_file.size_in_bytes = 7331
bad_file.is_public = True
bad_file.is_smashable = True
bad_file.save()
sca = SampleComputedFileAssociation()
sca.sample = sample
sca.computed_file = bad_file
sca.save()
self.assertEqual(sample.computed_files.count(), 2)
self.assertEqual(sample.get_most_recent_smashable_result_file().id, bad_file.id)
main.clean_database()
self.assertEqual(sample.get_most_recent_smashable_result_file().id, good_file.id)
# class JobPrioritizationTestCase(TestCase):
# def setUp(self):
# """Create a lot of resources that could be associated with either
# ProcessorJobs or DownloaderJobs. Since the logic of when to actually
# queue these is the same, we can use these for testing both. However
# The actual jobs that will be queued need to be created by the job-type
# specific functions.
# """
# human = Organism(name="HOMO_SAPIENS", taxonomy_id=9606, is_scientific_name=True)
# human.save()
# zebrafish = Organism(name="DANIO_RERIO", taxonomy_id=1337, is_scientific_name=True)
# zebrafish.save()
# # Salmon experiment that is 50% complete.
# experiment = Experiment(accession_code='ERP036000')
# experiment.save()
# ## First sample, this one has been processed.
# pj = ProcessorJob()
# pj.accession_code = "ERR036000"
# pj.pipeline_applied = "SALMON"
# pj.success = True
# pj.save()
# og = OriginalFile()
# og.filename = "ERR036000.fastq.gz"
# og.source_filename = "ERR036000.fastq.gz"
# og.source_url = "ftp://ftp.sra.ebi.ac.uk/vol1/fastq/ERR036/ERR036000/ERR036000_1.fastq.gz"
# og.is_archive = True
# og.save()
# sample = Sample()
# sample.accession_code = 'ERR036000'
# sample.organism = human
# sample.save()
# assoc = OriginalFileSampleAssociation()
# assoc.sample = sample
# assoc.original_file = og
# assoc.save()
# assoc = ProcessorJobOriginalFileAssociation()
# assoc.processor_job = pj
# assoc.original_file = og
# assoc.save()
# assoc = ExperimentSampleAssociation()
# assoc.sample = sample
# assoc.experiment = experiment
# assoc.save()
# ## Second sample, this one hasn't been processed.
# self.in_progress_salmon_og = OriginalFile()
# self.in_progress_salmon_og.filename = "ERR036001.fastq.gz"
# self.in_progress_salmon_og.source_filename = "ERR036001.fastq.gz"
# self.in_progress_salmon_og.source_url = "ftp://ftp.sra.ebi.ac.uk/vol1/fastq/ERR036/ERR036001/ERR036001_1.fastq.gz"
# self.in_progress_salmon_og.is_archive = True
# self.in_progress_salmon_og.save()
# self.in_progress_salmon_sample = Sample()
# self.in_progress_salmon_sample.accession_code = 'ERR036001'
# self.in_progress_salmon_sample.organism = human
# self.in_progress_salmon_sample.save()
# assoc = OriginalFileSampleAssociation()
# assoc.sample = self.in_progress_salmon_sample
# assoc.original_file = self.in_progress_salmon_og
# assoc.save()
# assoc = ExperimentSampleAssociation()
# assoc.sample = self.in_progress_salmon_sample
# assoc.experiment = experiment
# assoc.save()
# # Salmon experiment that is 0% complete.
# experiment = Experiment(accession_code='ERP037000')
# experiment.save()
# self.unstarted_salmon_og = OriginalFile()
# self.unstarted_salmon_og.filename = "ERR037001.fastq.gz"
# self.unstarted_salmon_og.source_filename = "ERR037001.fastq.gz"
# self.unstarted_salmon_og.source_url = "ftp://ftp.sra.ebi.ac.uk/vol1/fastq/ERR037/ERR037001/ERR037001_1.fastq.gz"
# self.unstarted_salmon_og.is_archive = True
# self.unstarted_salmon_og.save()
# self.unstarted_salmon_sample = Sample()
# self.unstarted_salmon_sample.accession_code = 'ERR037001'
# self.unstarted_salmon_sample.organism = human
# self.unstarted_salmon_sample.save()
# assoc = OriginalFileSampleAssociation()
# assoc.sample = self.unstarted_salmon_sample
# assoc.original_file = self.unstarted_salmon_og
# assoc.save()
# assoc = ExperimentSampleAssociation()
# assoc.sample = self.unstarted_salmon_sample
# assoc.experiment = experiment
# assoc.save()
# # Zebrafish experiment.
# experiment = Experiment(accession_code='ERP038000')
# experiment.save()
# self.zebrafish_og = OriginalFile()
# self.zebrafish_og.source_filename = "ERR038001.fastq.gz"
# self.zebrafish_og.source_url = "ftp://ftp.sra.ebi.ac.uk/vol1/fastq/ERR038/ERR038001/ERR038001_1.fastq.gz"
# self.zebrafish_og.is_archive = True
# self.zebrafish_og.save()
# self.zebrafish_sample = Sample()
# self.zebrafish_sample.accession_code = 'ERR038001'
# self.zebrafish_sample.organism = zebrafish
# self.zebrafish_sample.save()
# assoc = OriginalFileSampleAssociation()
# assoc.sample = self.zebrafish_sample
# assoc.original_file = self.zebrafish_og
# assoc.save()
# assoc = ExperimentSampleAssociation()
# assoc.sample = self.zebrafish_sample
# assoc.experiment = experiment
# assoc.save()
# # Pediatric experiment.
# experiment = Experiment(accession_code='GSE100568')
# experiment.save()
# self.pediatric_og = OriginalFile()
# self.pediatric_og.source_url = "https://www.ncbi.nlm.nih.gov/geo/download/?acc=GSE100568&format=file"
# self.pediatric_og.is_archive = True
# self.pediatric_og.save()
# self.pediatric_sample = Sample()
# self.pediatric_sample.accession_code = 'GSM2687180'
# self.pediatric_sample.organism = human
# self.pediatric_sample.save()
# assoc = OriginalFileSampleAssociation()
# assoc.sample = self.pediatric_sample
# assoc.original_file = self.pediatric_og
# assoc.save()
# assoc = ExperimentSampleAssociation()
# assoc.sample = self.pediatric_sample
# assoc.experiment = experiment
# assoc.save()
# # hgu133plus2 experiment.
# experiment = Experiment(accession_code='GSE100014')
# experiment.save()
# self.hgu133plus2_og = OriginalFile()
# self.hgu133plus2_og.source_url = "https://www.ncbi.nlm.nih.gov/geo/download/?acc=GSE100014&format=file"
# self.hgu133plus2_og.is_archive = True
# self.hgu133plus2_og.save()
# self.hgu133plus2_sample = Sample()
# self.hgu133plus2_sample.accession_code = 'GSM2667926'
# self.hgu133plus2_sample.organism = human
# self.hgu133plus2_sample.save()
# assoc = OriginalFileSampleAssociation()
# assoc.sample = self.hgu133plus2_sample
# assoc.original_file = self.hgu133plus2_og
# assoc.save()
# assoc = ExperimentSampleAssociation()
# assoc.sample = self.hgu133plus2_sample
# assoc.experiment = experiment
# assoc.save()
# @patch('data_refinery_foreman.foreman.main.Nomad')
# @patch('data_refinery_foreman.foreman.main.requeue_downloader_job')
# def test_handle_downloader_jobs(self, mock_requeue_downloader_job, mock_nomad):
# """Tests the prioritization of downloader jobs.
# We want zebrafish jobs to be first, then jobs for hgu133plus2,
# then jobs for pediatric cancer; finally, salmon jobs should be
# prioritized based on how close to completion they are."""
# def mock_init_nomad(host, port=0, timeout=0):
# ret_value = MagicMock()
# ret_value.jobs = MagicMock()
# ret_value.jobs.get_jobs = MagicMock()
# ret_value.jobs.get_jobs.side_effect = lambda: []
# return ret_value
# mock_nomad.side_effect = mock_init_nomad
# unstarted_salmon_job = DownloaderJob()
# unstarted_salmon_job.accession_code = self.unstarted_salmon_sample.accession_code
# unstarted_salmon_job.save()
# assoc = DownloaderJobOriginalFileAssociation()
# assoc.downloader_job = unstarted_salmon_job
# assoc.original_file = self.unstarted_salmon_og
# assoc.save()
# in_progress_salmon_job = DownloaderJob()
# in_progress_salmon_job.accession_code = self.in_progress_salmon_sample.accession_code
# in_progress_salmon_job.save()
# assoc = DownloaderJobOriginalFileAssociation()
# assoc.downloader_job = in_progress_salmon_job
# assoc.original_file = self.in_progress_salmon_og
# assoc.save()
# zebrafish_job = DownloaderJob()
# zebrafish_job.accession_code = self.zebrafish_sample.accession_code
# zebrafish_job.save()
# assoc = DownloaderJobOriginalFileAssociation()
# assoc.downloader_job = zebrafish_job
# assoc.original_file = self.zebrafish_og
# assoc.save()
# pediatric_job = DownloaderJob()
# pediatric_job.accession_code = self.pediatric_sample.accession_code
# pediatric_job.save()
# assoc = DownloaderJobOriginalFileAssociation()
# assoc.downloader_job = pediatric_job
# assoc.original_file = self.pediatric_og
# assoc.save()
# hgu133plus2_job = DownloaderJob()
# hgu133plus2_job.accession_code = self.hgu133plus2_sample.accession_code
# hgu133plus2_job.save()
# assoc = DownloaderJobOriginalFileAssociation()
# assoc.downloader_job = hgu133plus2_job
# assoc.original_file = self.hgu133plus2_og
# assoc.save()
# jobs = [unstarted_salmon_job,
# in_progress_salmon_job,
# hgu133plus2_job,
# zebrafish_job,
# pediatric_job
# ]
# jobs_in_correct_order = [zebrafish_job,
# hgu133plus2_job,
# pediatric_job,
# in_progress_salmon_job,
# unstarted_salmon_job
# ]
# main.handle_downloader_jobs(jobs)
# for count, job in enumerate(jobs_in_correct_order):
# # Each entry of mock_calls is a call object, which behaves like a
# # (name, args, kwargs) tuple. Index 1 of a call object is the
# # positional-arguments tuple; we're interested in its first element.
# job_called_at_count = mock_requeue_downloader_job.mock_calls[count][1][0]
# self.assertEqual(job.id, job_called_at_count.id)
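The comment above describes indexing into `mock_calls`; a minimal standalone sketch (plain `unittest.mock`, independent of the test fixtures here) of that tuple structure:

```python
from unittest.mock import MagicMock

requeue = MagicMock()
requeue("job-1")
requeue("job-2", reason="retry")

# Each entry of mock_calls behaves like a (name, args, kwargs) tuple,
# so index 1 is the positional-arguments tuple of that call.
first_call_arg = requeue.mock_calls[0][1][0]
second_call_arg = requeue.mock_calls[1][1][0]
```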
# @patch('data_refinery_foreman.foreman.main.Nomad')
# @patch('data_refinery_foreman.foreman.main.requeue_processor_job')
# def test_handle_processor_jobs(self, mock_requeue_processor_job, mock_nomad):
# """Tests the prioritization of processor jobs.
# We want zebrafish jobs to be first, then jobs for hgu133plus2,
# then jobs for pediatric cancer; finally, salmon jobs should be
# prioritized based on how close to completion they are."""
# def mock_init_nomad(host, port=0, timeout=0):
# ret_value = MagicMock()
# ret_value.jobs = MagicMock()
# ret_value.jobs.get_jobs = MagicMock()
# ret_value.jobs.get_jobs.side_effect = lambda: []
# return ret_value
# mock_nomad.side_effect = mock_init_nomad
# unstarted_salmon_job = ProcessorJob()
# unstarted_salmon_job.accession_code = self.unstarted_salmon_sample.accession_code
# unstarted_salmon_job.pipeline_applied = "SALMON"
# unstarted_salmon_job.save()
# assoc = ProcessorJobOriginalFileAssociation()
# assoc.processor_job = unstarted_salmon_job
# assoc.original_file = self.unstarted_salmon_og
# assoc.save()
# in_progress_salmon_job = ProcessorJob()
# in_progress_salmon_job.accession_code = self.in_progress_salmon_sample.accession_code
# in_progress_salmon_job.pipeline_applied = "SALMON"
# in_progress_salmon_job.save()
# assoc = ProcessorJobOriginalFileAssociation()
# assoc.processor_job = in_progress_salmon_job
# assoc.original_file = self.in_progress_salmon_og
# assoc.save()
# zebrafish_job = ProcessorJob()
# zebrafish_job.accession_code = self.zebrafish_sample.accession_code
# zebrafish_job.pipeline_applied = "SALMON"
# zebrafish_job.save()
# assoc = ProcessorJobOriginalFileAssociation()
# assoc.processor_job = zebrafish_job
# assoc.original_file = self.zebrafish_og
# assoc.save()
# pediatric_job = ProcessorJob()
# pediatric_job.accession_code = self.pediatric_sample.accession_code
# pediatric_job.pipeline_applied = "SALMON"
# pediatric_job.save()
# assoc = ProcessorJobOriginalFileAssociation()
# assoc.processor_job = pediatric_job
# assoc.original_file = self.pediatric_og
# assoc.save()
# hgu133plus2_job = ProcessorJob()
# hgu133plus2_job.accession_code = self.hgu133plus2_sample.accession_code
# hgu133plus2_job.pipeline_applied = "SALMON"
# hgu133plus2_job.save()
# assoc = ProcessorJobOriginalFileAssociation()
# assoc.processor_job = hgu133plus2_job
# assoc.original_file = self.hgu133plus2_og
# assoc.save()
# jobs = [unstarted_salmon_job,
# in_progress_salmon_job,
# hgu133plus2_job,
# zebrafish_job,
# pediatric_job
# ]
# jobs_in_correct_order = [zebrafish_job,
# hgu133plus2_job,
# pediatric_job,
# in_progress_salmon_job,
# unstarted_salmon_job
# ]
# main.handle_processor_jobs(jobs)
# for count, job in enumerate(jobs_in_correct_order):
# # Each entry of mock_calls is a call object, which behaves like a
# # (name, args, kwargs) tuple. Index 1 of a call object is the
# # positional-arguments tuple; we're interested in its first element.
# job_called_at_count = mock_requeue_processor_job.mock_calls[count][1][0]
# self.assertEqual(job.id, job_called_at_count.id)
| 38.213599 | 124 | 0.660957 | 6,778 | 55,639 | 5.110947 | 0.058867 | 0.028087 | 0.033024 | 0.054789 | 0.845246 | 0.813435 | 0.784943 | 0.755672 | 0.731251 | 0.721985 | 0 | 0.017019 | 0.240677 | 55,639 | 1,455 | 125 | 38.239863 | 0.802949 | 0.27186 | 0 | 0.790303 | 0 | 0 | 0.097003 | 0.080957 | 0 | 0 | 0 | 0 | 0.224242 | 1 | 0.066667 | false | 0 | 0.010909 | 0 | 0.10303 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d4433314d30163c673c6376c88fa9c308bd45026 | 97 | py | Python | sibyl/util/__init__.py | fahminlb33/sibyl_eeg | dadb8e52d25ba51a66870d0296cc3e1af0ec0f37 | [
"MIT"
] | 1 | 2021-11-16T06:37:09.000Z | 2021-11-16T06:37:09.000Z | sibyl/util/__init__.py | fahminlb33/sibyl_eeg | dadb8e52d25ba51a66870d0296cc3e1af0ec0f37 | [
"MIT"
] | null | null | null | sibyl/util/__init__.py | fahminlb33/sibyl_eeg | dadb8e52d25ba51a66870d0296cc3e1af0ec0f37 | [
"MIT"
] | null | null | null | from sibyl.util.DownloadProgressBar import DownloadProgressBar
from sibyl.util import filesystem
| 32.333333 | 62 | 0.886598 | 11 | 97 | 7.818182 | 0.545455 | 0.209302 | 0.302326 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.082474 | 97 | 2 | 63 | 48.5 | 0.966292 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d45bd80184c41597d4db87bb1ba3e8960ecef584 | 11,231 | py | Python | test/structure/test_tetrahedron_method.py | ladyteam/phonopy | 455ef61dfa15c01fb6b516461b52f15aefbf92b3 | [
"BSD-3-Clause"
] | 127 | 2015-01-21T17:50:58.000Z | 2020-02-04T13:46:13.000Z | test/structure/test_tetrahedron_method.py | ladyteam/phonopy | 455ef61dfa15c01fb6b516461b52f15aefbf92b3 | [
"BSD-3-Clause"
] | 100 | 2015-02-07T15:32:50.000Z | 2020-02-23T02:09:08.000Z | test/structure/test_tetrahedron_method.py | ladyteam/phonopy | 455ef61dfa15c01fb6b516461b52f15aefbf92b3 | [
"BSD-3-Clause"
] | 122 | 2015-02-07T15:39:28.000Z | 2020-02-10T22:33:16.000Z | """Tests for routines in tetrahedron_method.py."""
import numpy as np
from phonopy.structure.tetrahedron_method import (
get_all_tetrahedra_relative_grid_address,
get_tetrahedra_integration_weight,
)
rel_ga_ref = [
0,
0,
0,
1,
0,
0,
1,
1,
0,
1,
1,
1,
0,
0,
0,
1,
0,
0,
1,
0,
1,
1,
1,
1,
0,
0,
0,
0,
1,
0,
1,
1,
0,
1,
1,
1,
0,
0,
0,
0,
1,
0,
0,
1,
1,
1,
1,
1,
0,
0,
0,
0,
0,
1,
1,
0,
1,
1,
1,
1,
0,
0,
0,
0,
0,
1,
0,
1,
1,
1,
1,
1,
0,
0,
0,
0,
1,
0,
0,
1,
1,
-1,
0,
0,
0,
0,
0,
0,
0,
1,
0,
1,
1,
-1,
0,
0,
0,
0,
0,
1,
0,
0,
1,
0,
1,
0,
-1,
0,
0,
0,
0,
0,
0,
1,
1,
0,
1,
0,
-1,
0,
0,
0,
0,
0,
0,
1,
-1,
-1,
0,
0,
-1,
0,
0,
0,
0,
0,
0,
1,
-1,
-1,
0,
-1,
0,
0,
0,
0,
0,
1,
0,
0,
1,
1,
0,
0,
0,
-1,
0,
0,
0,
0,
1,
0,
1,
1,
0,
0,
0,
-1,
0,
0,
0,
0,
1,
0,
-1,
0,
-1,
0,
0,
-1,
0,
0,
0,
0,
1,
0,
-1,
0,
-1,
-1,
0,
0,
0,
0,
0,
1,
0,
0,
0,
-1,
-1,
0,
0,
-1,
0,
0,
0,
1,
0,
0,
0,
-1,
-1,
0,
-1,
0,
0,
0,
0,
-1,
-1,
-1,
0,
-1,
-1,
0,
0,
-1,
0,
0,
0,
-1,
-1,
-1,
0,
-1,
-1,
0,
-1,
0,
0,
0,
0,
-1,
-1,
-1,
-1,
0,
-1,
0,
0,
-1,
0,
0,
0,
-1,
-1,
-1,
-1,
0,
-1,
-1,
0,
0,
0,
0,
0,
-1,
-1,
-1,
-1,
-1,
0,
0,
-1,
0,
0,
0,
0,
-1,
-1,
-1,
-1,
-1,
0,
-1,
0,
0,
0,
0,
0,
1,
0,
0,
0,
1,
0,
0,
1,
1,
0,
0,
0,
1,
0,
0,
0,
0,
1,
0,
1,
1,
0,
0,
0,
-1,
1,
0,
-1,
1,
1,
-1,
0,
0,
0,
0,
0,
-1,
0,
1,
-1,
1,
1,
-1,
0,
0,
0,
0,
0,
-1,
1,
0,
0,
1,
0,
-1,
1,
1,
0,
0,
0,
0,
1,
0,
-1,
1,
1,
0,
1,
1,
0,
0,
0,
-1,
0,
1,
0,
0,
1,
-1,
1,
1,
0,
0,
0,
0,
0,
1,
-1,
1,
1,
0,
1,
1,
0,
0,
0,
0,
0,
1,
0,
-1,
0,
1,
-1,
0,
0,
0,
0,
1,
0,
0,
0,
0,
1,
1,
-1,
0,
0,
0,
0,
-1,
0,
1,
0,
-1,
0,
-1,
0,
0,
0,
0,
0,
-1,
0,
1,
0,
0,
1,
0,
-1,
0,
0,
0,
0,
0,
1,
0,
0,
0,
-1,
1,
0,
-1,
0,
0,
0,
1,
0,
0,
0,
1,
0,
1,
0,
-1,
0,
0,
0,
-1,
1,
0,
0,
0,
-1,
-1,
0,
0,
0,
0,
0,
-1,
1,
0,
0,
1,
0,
0,
0,
-1,
0,
0,
0,
0,
-1,
-1,
1,
-1,
-1,
0,
0,
-1,
0,
0,
0,
0,
-1,
-1,
1,
-1,
-1,
0,
-1,
0,
0,
0,
0,
1,
-1,
-1,
0,
0,
-1,
1,
0,
-1,
0,
0,
0,
1,
0,
0,
1,
-1,
-1,
1,
0,
-1,
0,
0,
0,
1,
-1,
-1,
0,
-1,
0,
1,
-1,
0,
0,
0,
0,
1,
0,
0,
1,
-1,
-1,
1,
-1,
0,
0,
0,
0,
0,
-1,
-1,
0,
0,
-1,
-1,
0,
0,
0,
0,
0,
0,
-1,
-1,
0,
-1,
0,
-1,
0,
0,
0,
0,
0,
1,
0,
0,
0,
1,
0,
1,
0,
1,
0,
0,
0,
0,
1,
0,
0,
0,
1,
1,
0,
1,
0,
0,
0,
-1,
1,
0,
0,
0,
1,
-1,
0,
0,
0,
0,
0,
-1,
1,
0,
0,
1,
0,
0,
0,
1,
0,
0,
0,
1,
-1,
1,
0,
-1,
0,
1,
-1,
0,
0,
0,
0,
0,
-1,
1,
1,
-1,
1,
0,
-1,
0,
0,
0,
0,
1,
0,
0,
1,
-1,
1,
1,
-1,
0,
0,
0,
0,
1,
0,
0,
1,
-1,
1,
1,
0,
1,
0,
0,
0,
0,
-1,
1,
1,
-1,
1,
0,
0,
1,
0,
0,
0,
1,
-1,
1,
0,
0,
1,
1,
0,
1,
0,
0,
0,
0,
-1,
1,
0,
-1,
0,
-1,
0,
0,
0,
0,
0,
0,
-1,
1,
0,
0,
1,
-1,
0,
0,
0,
0,
0,
1,
0,
0,
0,
0,
-1,
0,
1,
-1,
0,
0,
0,
1,
0,
0,
0,
1,
0,
0,
1,
-1,
0,
0,
0,
-1,
0,
-1,
0,
0,
-1,
-1,
1,
-1,
0,
0,
0,
-1,
0,
-1,
-1,
1,
-1,
-1,
0,
0,
0,
0,
0,
0,
0,
-1,
-1,
1,
-1,
0,
1,
-1,
0,
0,
0,
0,
1,
0,
-1,
1,
-1,
0,
1,
-1,
0,
0,
0,
-1,
1,
0,
-1,
1,
-1,
-1,
0,
0,
0,
0,
0,
-1,
1,
0,
0,
1,
0,
-1,
1,
-1,
0,
0,
0,
0,
0,
-1,
0,
-1,
0,
1,
-1,
0,
0,
0,
0,
1,
0,
0,
0,
0,
-1,
1,
-1,
0,
0,
0,
0,
-1,
0,
-1,
0,
0,
-1,
0,
-1,
0,
0,
0,
0,
-1,
0,
-1,
0,
-1,
0,
-1,
0,
0,
0,
0,
0,
1,
0,
0,
1,
1,
0,
0,
0,
1,
0,
0,
0,
0,
1,
0,
1,
1,
0,
0,
0,
1,
0,
0,
0,
0,
1,
0,
-1,
0,
1,
-1,
0,
0,
0,
0,
0,
0,
1,
0,
-1,
0,
1,
0,
0,
1,
0,
0,
0,
1,
0,
0,
0,
-1,
1,
0,
-1,
0,
0,
0,
0,
1,
0,
0,
0,
-1,
1,
0,
0,
1,
0,
0,
0,
-1,
-1,
1,
-1,
-1,
0,
0,
-1,
0,
0,
0,
0,
-1,
-1,
1,
-1,
-1,
0,
-1,
0,
0,
0,
0,
0,
-1,
-1,
1,
0,
-1,
1,
0,
-1,
0,
0,
0,
0,
-1,
-1,
1,
-1,
0,
1,
-1,
0,
0,
0,
0,
0,
-1,
-1,
1,
0,
-1,
1,
0,
0,
1,
0,
0,
0,
-1,
-1,
1,
-1,
0,
1,
0,
0,
1,
0,
0,
0,
0,
0,
-1,
1,
0,
-1,
1,
1,
-1,
0,
0,
0,
0,
0,
-1,
0,
1,
-1,
1,
1,
-1,
0,
0,
0,
1,
0,
0,
1,
0,
-1,
1,
1,
-1,
0,
0,
0,
0,
1,
0,
0,
1,
-1,
1,
1,
-1,
0,
0,
0,
1,
0,
0,
1,
1,
0,
1,
1,
-1,
0,
0,
0,
0,
1,
0,
1,
1,
0,
1,
1,
-1,
0,
0,
0,
0,
0,
-1,
0,
1,
-1,
-1,
0,
0,
0,
0,
0,
0,
1,
0,
0,
1,
-1,
-1,
0,
0,
0,
0,
0,
0,
0,
-1,
1,
0,
-1,
0,
-1,
0,
0,
0,
0,
1,
0,
0,
1,
0,
-1,
0,
-1,
0,
0,
0,
0,
0,
0,
-1,
-1,
-1,
0,
0,
-1,
0,
0,
0,
0,
0,
0,
-1,
-1,
-1,
0,
-1,
0,
0,
]
freqs = [7.75038996, 8.45225776]
tetra_freqs = [
[8.31845176, 8.69248151, 8.78939432, 8.66179133],
[8.31845176, 8.69248151, 8.57211855, 8.66179133],
[8.31845176, 8.3073908, 8.78939432, 8.66179133],
[8.31845176, 8.3073908, 8.16360975, 8.66179133],
[8.31845176, 8.15781566, 8.57211855, 8.66179133],
[8.31845176, 8.15781566, 8.16360975, 8.66179133],
[8.31845176, 8.3073908, 8.16360975, 7.23665561],
[8.31845176, 8.15781566, 8.16360975, 7.23665561],
[8.31845176, 8.69248151, 8.57211855, 8.25247917],
[8.31845176, 8.15781566, 8.57211855, 8.25247917],
[8.31845176, 8.15781566, 7.40609306, 8.25247917],
[8.31845176, 8.15781566, 7.40609306, 7.23665561],
[8.31845176, 8.69248151, 8.78939432, 8.55165578],
[8.31845176, 8.3073908, 8.78939432, 8.55165578],
[8.31845176, 8.3073908, 7.56474684, 8.55165578],
[8.31845176, 8.3073908, 7.56474684, 7.23665561],
[8.31845176, 8.69248151, 8.60076148, 8.55165578],
[8.31845176, 8.69248151, 8.60076148, 8.25247917],
[8.31845176, 7.72920193, 8.60076148, 8.55165578],
[8.31845176, 7.72920193, 8.60076148, 8.25247917],
[8.31845176, 7.72920193, 7.56474684, 8.55165578],
[8.31845176, 7.72920193, 7.56474684, 7.23665561],
[8.31845176, 7.72920193, 7.40609306, 8.25247917],
[8.31845176, 7.72920193, 7.40609306, 7.23665561],
]
iw_I_ref = [0.37259443, 1.79993056]
iw_J_ref = [0.05740597, 0.76331859]
def test_get_all_tetrahedra_relative_grid_address():
"""Test of get_all_tetrahedra_relative_grid_address."""
rel_ga = get_all_tetrahedra_relative_grid_address()
# for i, line in enumerate(rel_ga.reshape(-1, 12)):
# print("%03d: " % i + "".join(["%d, " % v for v in line]))
np.testing.assert_array_equal(rel_ga.ravel(), np.array(rel_ga_ref).ravel())
def test_get_tetrahedra_integration_weight():
"""Test of get_tetrahedra_integration_weight."""
iw_I = get_tetrahedra_integration_weight(freqs, tetra_freqs, function="I")
iw_J = get_tetrahedra_integration_weight(freqs, tetra_freqs, function="J")
np.testing.assert_allclose(iw_I_ref, iw_I, atol=1e-5)
np.testing.assert_allclose(iw_J_ref, iw_J, atol=1e-5)
def test_get_tetrahedra_integration_weight_one_freq():
"""Test of get_tetrahedra_integration_weight with float as first parameter."""
iw_I = []
iw_J = []
for i in range(2):
iw_I.append(
get_tetrahedra_integration_weight(freqs[i], tetra_freqs, function="I")
)
iw_J.append(
get_tetrahedra_integration_weight(freqs[i], tetra_freqs, function="J")
)
np.testing.assert_allclose(iw_I_ref, iw_I, atol=1e-5)
np.testing.assert_allclose(iw_J_ref, iw_J, atol=1e-5)
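The `assert_allclose` checks above pass when every element agrees within the absolute tolerance `atol`; a stdlib sketch of that comparison (illustration only — numpy's version also applies a relative tolerance by default):

```python
import math

def allclose(ref, got, atol=1e-5):
    # True when each pair of elements differs by at most atol.
    return len(ref) == len(got) and all(
        math.isclose(r, g, abs_tol=atol, rel_tol=0.0) for r, g in zip(ref, got)
    )

ok = allclose([0.05740597, 0.76331859], [0.05740600, 0.76331860])
```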
| 9.168163 | 82 | 0.283501 | 1,589 | 11,231 | 1.944619 | 0.054122 | 0.266019 | 0.240777 | 0.168285 | 0.86699 | 0.86699 | 0.735599 | 0.617799 | 0.468608 | 0.468608 | 0 | 0.414522 | 0.553646 | 11,231 | 1,224 | 83 | 9.175654 | 0.201875 | 0.02876 | 0 | 0.956954 | 0 | 0 | 0.000367 | 0 | 0 | 0 | 0 | 0 | 0.004139 | 1 | 0.002483 | false | 0 | 0.001656 | 0 | 0.004139 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
00f717f4ddf30b5662ec522966180077e24fc1c9 | 6,934 | py | Python | nets.py | ashaw596/squeezenas | bdb279854bbed6cc7790a3d0faafb4f7c6c5f01e | [
"MIT"
] | 65 | 2019-09-03T06:12:56.000Z | 2021-09-07T11:52:29.000Z | nets.py | ashaw596/squeezenas | bdb279854bbed6cc7790a3d0faafb4f7c6c5f01e | [
"MIT"
] | 9 | 2019-09-05T02:28:31.000Z | 2020-09-18T10:39:03.000Z | nets.py | ashaw596/squeezenas | bdb279854bbed6cc7790a3d0faafb4f7c6c5f01e | [
"MIT"
] | 10 | 2019-10-01T21:42:29.000Z | 2021-01-11T18:30:32.000Z | import torch
from arch.hyperparameters import get_cityscapes_hyperparams_small, get_cityscapes_hyperparams_large, \
get_cityscapes_hyperparams_xlarge
from arch.model_cityscapes import SqueezeNASNetCityscapes
from arch.operations import Ops
def get_squeezenas_mac_small():
# noinspection PyPep8
genotype = [Ops.inverse_residual_k3_e1_g2, Ops.inverse_residual_k3_e1_g1, Ops.inverse_residual_k3_e1_g1, Ops.inverse_residual_k3_e1_g2, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e1_g1, Ops.inverse_residual_k3_e1_g1, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e6_g1_d2, Ops.residual_skipish, Ops.inverse_residual_k3_e1_g2, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e3_g1_d2, Ops.inverse_residual_k3_e1_g1_d2, Ops.inverse_residual_k3_e1_g1_d2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e1_g2_d2]
weight_path = "weights/mac_small.pth"
hyperparameters = get_cityscapes_hyperparams_small()
model = SqueezeNASNetCityscapes(hyperparameters, genotype, lr_aspp=True)
state_dict = torch.load(weight_path, map_location=torch.device('cpu'))
model.load_state_dict(state_dict)
return model
def get_squeezenas_mac_large():
# noinspection PyPep8
genotype = [Ops.inverse_residual_k3_e1_g2, Ops.inverse_residual_k3_e6_g1, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e1_g2, Ops.inverse_residual_k3_e1_g2, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k5_e1_g1, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e1_g1, Ops.inverse_residual_k5_e1_g1, Ops.inverse_residual_k3_e3_g1_d2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e1_g1_d2]
weight_path = "weights/mac_large.pth"
hyperparameters = get_cityscapes_hyperparams_large()
model = SqueezeNASNetCityscapes(hyperparameters, genotype, lr_aspp=True)
state_dict = torch.load(weight_path, map_location=torch.device('cpu'))
model.load_state_dict(state_dict)
return model
def get_squeezenas_mac_xlarge():
# noinspection PyPep8
genotype = [Ops.inverse_residual_k3_e1_g2, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k3_e1_g2, Ops.inverse_residual_k5_e1_g1, Ops.inverse_residual_k3_e3_g1_d2, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k5_e1_g2, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e3_g1, Ops.residual_skipish, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e1_g1, Ops.inverse_residual_k5_e6_g1, Ops.residual_skipish, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e3_g1_d2]
weight_path = "weights/mac_xlarge.pth"
hyperparameters = get_cityscapes_hyperparams_xlarge()
model = SqueezeNASNetCityscapes(hyperparameters, genotype, lr_aspp=False)
state_dict = torch.load(weight_path, map_location=torch.device('cpu'))
model.load_state_dict(state_dict)
return model
def get_squeezenas_lat_small():
# noinspection PyPep8
genotype = [Ops.inverse_residual_k3_e1_g1, Ops.residual_skipish, Ops.residual_skipish, Ops.residual_skipish, Ops.inverse_residual_k3_e6_g1, Ops.inverse_residual_k3_e6_g1, Ops.residual_skipish, Ops.residual_skipish, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k5_e1_g2, Ops.inverse_residual_k3_e1_g2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e1_g1_d2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e3_g1_d2, Ops.inverse_residual_k3_e3_g1_d2, Ops.inverse_residual_k3_e3_g1_d2]
weight_path = "weights/lat_small.pth"
hyperparameters = get_cityscapes_hyperparams_small()
model = SqueezeNASNetCityscapes(hyperparameters, genotype, lr_aspp=True)
state_dict = torch.load(weight_path, map_location=torch.device('cpu'))
model.load_state_dict(state_dict)
return model
def get_squeezenas_lat_large():
# noinspection PyPep8
genotype = [Ops.residual_skipish, Ops.inverse_residual_k3_e6_g1, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k3_e1_g1_d2, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e6_g1, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k5_e1_g1, Ops.inverse_residual_k5_e1_g1, Ops.inverse_residual_k5_e1_g2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e1_g1, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e6_g1_d2]
weight_path = "weights/lat_large.pth"
hyperparameters = get_cityscapes_hyperparams_large()
model = SqueezeNASNetCityscapes(hyperparameters, genotype, lr_aspp=True)
state_dict = torch.load(weight_path, map_location=torch.device('cpu'))
model.load_state_dict(state_dict)
return model
def get_squeezenas_lat_xlarge():
# noinspection PyPep8
genotype = [Ops.residual_skipish, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k3_e1_g1_d2, Ops.inverse_residual_k3_e6_g1, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k5_e1_g2, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e6_g1, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e6_g1, Ops.inverse_residual_k5_e6_g1, Ops.inverse_residual_k3_e3_g1_d2, Ops.inverse_residual_k3_e1_g2_d2, Ops.inverse_residual_k3_e3_g1, Ops.inverse_residual_k3_e6_g1_d2, Ops.inverse_residual_k3_e3_g1_d2, Ops.inverse_residual_k5_e1_g1, Ops.inverse_residual_k3_e3_g1_d2, Ops.inverse_residual_k3_e6_g1_d2]
weight_path = "weights/lat_xlarge.pth"
hyperparameters = get_cityscapes_hyperparams_xlarge()
model = SqueezeNASNetCityscapes(hyperparameters, genotype, lr_aspp=False)
state_dict = torch.load(weight_path, map_location=torch.device('cpu'))
model.load_state_dict(state_dict)
return model
SQUEEZENAS_NETWORKS = {
'squeezenas_mac_small': get_squeezenas_mac_small,
'squeezenas_mac_large': get_squeezenas_mac_large,
'squeezenas_mac_xlarge': get_squeezenas_mac_xlarge,
'squeezenas_lat_small': get_squeezenas_lat_small,
'squeezenas_lat_large': get_squeezenas_lat_large,
'squeezenas_lat_xlarge': get_squeezenas_lat_xlarge
}
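SQUEEZENAS_NETWORKS maps model names to zero-argument constructor functions, so callers can build a network by name. A minimal stdlib sketch of the same registry pattern (hypothetical toy constructors, no weights involved):

```python
def build_small():
    # Stand-in for a real constructor such as get_squeezenas_mac_small.
    return {"name": "small", "blocks": 20}

def build_large():
    return {"name": "large", "blocks": 22}

NETWORKS = {
    "small": build_small,
    "large": build_large,
}

# Look the constructor up by name, then call it to instantiate the model.
model = NETWORKS["large"]()
```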
| 83.542169 | 721 | 0.850303 | 1,133 | 6,934 | 4.616946 | 0.044131 | 0.225578 | 0.406041 | 0.340279 | 0.907857 | 0.899828 | 0.894858 | 0.894858 | 0.889505 | 0.85758 | 0 | 0.064496 | 0.078742 | 6,934 | 82 | 722 | 84.560976 | 0.754383 | 0.017162 | 0 | 0.491803 | 0 | 0 | 0.039365 | 0.024971 | 0 | 0 | 0 | 0 | 0 | 1 | 0.098361 | false | 0 | 0.065574 | 0 | 0.262295 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
cf00eb814e6bed11f0e826ff57a39605321325a0 | 137 | py | Python | andes/models/dynload/__init__.py | cuihantao/Andes | 6cdc057986c4a8382194ef440b6e92b8dfb77e25 | [
"Apache-2.0"
] | 16 | 2017-06-16T14:21:04.000Z | 2018-08-18T08:52:27.000Z | andes/models/dynload/__init__.py | cuihantao/Andes | 6cdc057986c4a8382194ef440b6e92b8dfb77e25 | [
"Apache-2.0"
] | 1 | 2017-12-12T07:51:16.000Z | 2017-12-12T07:51:16.000Z | andes/models/dynload/__init__.py | cuihantao/Andes | 6cdc057986c4a8382194ef440b6e92b8dfb77e25 | [
"Apache-2.0"
] | 7 | 2017-12-10T07:32:36.000Z | 2018-09-19T16:38:30.000Z | """
Module for dynamic loads.
"""
from andes.models.dynload.fload import FLoad # NOQA
from andes.models.dynload.zip import ZIP # NOQA
| 19.571429 | 52 | 0.737226 | 20 | 137 | 5.05 | 0.6 | 0.178218 | 0.29703 | 0.435644 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153285 | 137 | 6 | 53 | 22.833333 | 0.87069 | 0.262774 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
cf0fa64294444cf4615b82449c33c4a6596da5eb | 80 | py | Python | commands/util/__init__.py | kinpa200296/cmdpy | 3ce1e2c2c8803ad296d9b7c3ac0be5100938632e | [
"MIT"
] | null | null | null | commands/util/__init__.py | kinpa200296/cmdpy | 3ce1e2c2c8803ad296d9b7c3ac0be5100938632e | [
"MIT"
] | null | null | null | commands/util/__init__.py | kinpa200296/cmdpy | 3ce1e2c2c8803ad296d9b7c3ac0be5100938632e | [
"MIT"
] | null | null | null | from work_dir import print_dir, help_print_dir
from echo import echo, help_echo
| 26.666667 | 46 | 0.85 | 15 | 80 | 4.2 | 0.466667 | 0.253968 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 80 | 2 | 47 | 40 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0.5 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 7 |
cf6422db716cb46fab0f2cd1752e995b69de9616 | 87,975 | py | Python | fpn/core/loader.py | chi3x10/RepMet | d5b13e01940bbb7ed59dd1ff073e03c0808f76c0 | [
"Apache-2.0"
] | 103 | 2019-08-16T11:55:04.000Z | 2022-03-04T16:47:57.000Z | fpn/core/loader.py | chi3x10/RepMet | d5b13e01940bbb7ed59dd1ff073e03c0808f76c0 | [
"Apache-2.0"
] | 33 | 2019-05-25T08:42:06.000Z | 2022-03-08T21:32:10.000Z | fpn/core/loader.py | chi3x10/RepMet | d5b13e01940bbb7ed59dd1ff073e03c0808f76c0 | [
"Apache-2.0"
] | 18 | 2019-09-14T07:35:39.000Z | 2021-11-25T04:25:20.000Z | # --------------------------------------------------------
# Deformable Convolutional Networks
# Copyright (c) 2016 by Contributors
# Copyright (c) 2017 Microsoft
# Copyright (c) 2019 IBM Corp
# Licensed under The Apache-2.0 License [see LICENSE for details]
# Modified by Haozhi Qi
# --------------------------------------------------------
import os
import numpy as np
import mxnet as mx
from mxnet.executor_manager import _split_input_slice
import cPickle
from config.config import config
from rpn.rpn import get_rpn_testbatch, get_rpn_batch, assign_pyramid_anchor
from rcnn import get_rcnn_testbatch
def par_assign_anchor_wrapper(cfg, iroidb, feat_sym, feat_strides, anchor_scales, anchor_ratios, allowed_border):
# get a training batch for multigpu
data, rpn_label, img_fname = get_rpn_batch(iroidb, cfg)
data_shape = {k: v.shape for k, v in data.items()}
del data_shape['im_info']
# add gt_boxes to data for e2e
data['gt_boxes'] = rpn_label['gt_boxes'][np.newaxis, :, :]
if not cfg.network.base_net_lock:
feat_shape = [y[1] for y in [x.infer_shape(**data_shape) for x in feat_sym]]
label = assign_pyramid_anchor(feat_shape, rpn_label['gt_boxes'], data['im_info'], cfg,
feat_strides, anchor_scales, anchor_ratios, allowed_border)
else:
label = None
return {'data': data, 'label': label, 'img_fname': img_fname}
class TestLoader(mx.io.DataIter):
def __init__(self, roidb, config, batch_size=1, shuffle=False,
has_rpn=False):
super(TestLoader, self).__init__()
# save parameters as properties
self.cfg = config
self.roidb = roidb
self.batch_size = batch_size
self.shuffle = shuffle
self.has_rpn = has_rpn
# infer properties from roidb
self.size = len(self.roidb)
self.index = np.arange(self.size)
# decide data and label names (only for training)
if has_rpn:
self.data_name = ['data', 'im_info']
else:
self.data_name = ['data', 'rois']
self.label_name = None
# status variable for synchronization between get_data and get_label
self.cur = 0
self.data = None
self.label = []
self.im_info = None
# get first batch to fill in provide_data and provide_label
self.reset()
self.get_batch()
@property
def provide_data(self):
return [[(k, v.shape) for k, v in zip(self.data_name, idata)] for idata in self.data]
@property
def provide_label(self):
return [None for _ in range(len(self.data))]
@property
def provide_data_single(self):
return [(k, v.shape) for k, v in zip(self.data_name, self.data[0])]
@property
def provide_label_single(self):
return None
def reset(self):
self.cur = 0
if self.shuffle:
np.random.shuffle(self.index)
# self.filter_logic()
def filter_logic(self):
sel_set = []
for ix, cur in enumerate(self.index):
roi = self.roidb[cur]
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
cats = cats[cats < (self.cfg.dataset.NUM_CLASSES-1)]
if not cats.size:
continue
sel_set.append(cur)
sel_set = np.array(sel_set)
if self.shuffle:
p = np.random.permutation(np.arange(len(sel_set)))
sel_set = sel_set[p]
self.index = sel_set
self.size = len(self.index)
print('total size {0}'.format(self.size))
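`filter_logic` above keeps only roidb entries that still contain at least one in-range foreground class after subtracting the background offset, optionally shuffling the survivors. A simplified stdlib sketch of that selection (hypothetical plain-list inputs in place of roidb records):

```python
import random

def filter_indices(gt_class_lists, num_classes, shuffle=False, seed=901):
    # gt class ids are 1-based (0 is background); an entry survives when it
    # has at least one id that maps below num_classes - 1 after the -1 shift.
    kept = [
        i for i, classes in enumerate(gt_class_lists)
        if any(c - 1 < num_classes - 1 for c in classes)
    ]
    if shuffle:
        random.Random(seed).shuffle(kept)
    return kept

kept = filter_indices([[1, 2], [5], [], [3]], num_classes=4)
```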
def iter_next(self):
return self.cur < self.size
def next(self):
if self.iter_next():
self.get_batch()
self.cur += self.batch_size
return self.im_info, mx.io.DataBatch(data=self.data, label=self.label,
pad=self.getpad(), index=self.getindex(),
provide_data=self.provide_data, provide_label=self.provide_label)
else:
raise StopIteration
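`next()` above advances a cursor by `batch_size` and raises StopIteration once the dataset is exhausted; a minimal sketch of the same cursor protocol, unrelated to mxnet's DataIter internals:

```python
class CursorIter:
    def __init__(self, size, batch_size):
        self.size = size
        self.batch_size = batch_size
        self.cur = 0

    def __iter__(self):
        return self

    def __next__(self):
        # Yield the start index of each batch until the cursor passes size.
        if self.cur < self.size:
            start = self.cur
            self.cur += self.batch_size
            return start
        raise StopIteration

starts = list(CursorIter(10, 4))
```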
def getindex(self):
return self.cur / self.batch_size
def getpad(self):
if self.cur + self.batch_size > self.size:
return self.cur + self.batch_size - self.size
else:
return 0
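`getpad` above computes how many dummy samples the final batch needs when the dataset size is not a multiple of the batch size; the same arithmetic as a standalone function:

```python
def getpad(cur, batch_size, size):
    # When the current cursor plus one batch overruns the dataset,
    # the overrun is exactly the amount of padding required.
    if cur + batch_size > size:
        return cur + batch_size - size
    return 0

last_batch_pad = getpad(8, 4, 10)
```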
def get_batch(self):
cur_from = self.cur
cur_to = min(cur_from + self.batch_size, self.size)
roidb = [self.roidb[self.index[i]] for i in range(cur_from, cur_to)]
#print(roidb[0]['image'])
#print(roidb[0]['gt_names'])
if self.has_rpn:
data, label, im_info = get_rpn_testbatch(roidb, self.cfg)
else:
data, label, im_info = get_rcnn_testbatch(roidb, self.cfg)
self.data = [[mx.nd.array(idata[name]) for name in self.data_name] for idata in data]
self.im_info = im_info
def get_batch_individual(self):
cur_from = self.cur
cur_to = min(cur_from + self.batch_size, self.size)
roidb = [self.roidb[self.index[i]] for i in range(cur_from, cur_to)]
if self.has_rpn:
data, label, im_info = get_rpn_testbatch(roidb, self.cfg)
else:
data, label, im_info = get_rcnn_testbatch(roidb, self.cfg)
self.data = [mx.nd.array(data[name]) for name in self.data_name]
self.im_info = im_info
class PyramidAnchorIterator(mx.io.DataIter):
# pool = Pool(processes=4)
def __init__(self, feat_sym, roidb, cfg, batch_size=1, shuffle=False, ctx=None, work_load_list=None,
feat_strides=(4, 8, 16, 32, 64), anchor_scales=(8, ), anchor_ratios=(0.5, 1, 2), allowed_border=0,
aspect_grouping=False):
"""
This iterator provides RoI data to the Fast R-CNN network
:param feat_sym: to infer shape of assign_output
:param roidb: must be preprocessed
:param batch_size: must divide BATCH_SIZE(128)
:param shuffle: bool
:param ctx: list of contexts
:param work_load_list: list of work load
:param aspect_grouping: group images with similar aspects
:return: PyramidAnchorIterator
"""
super(PyramidAnchorIterator, self).__init__()
# save parameters as properties
self.feat_sym = feat_sym
import random
random.seed(901)
from random import shuffle
self.roidb = roidb
shuffle(self.roidb)
self.cfg = cfg
self.batch_size = batch_size
self.shuffle = shuffle
self.ctx = ctx
if self.ctx is None:
self.ctx = [mx.cpu()]
self.work_load_list = work_load_list
self.feat_strides = feat_strides
self.anchor_scales = anchor_scales
self.anchor_ratios = anchor_ratios
self.allowed_border = allowed_border
self.aspect_grouping = aspect_grouping
# infer properties from roidb
self.size = len(roidb)
self.index = np.arange(self.size)
# decide data and label names
if self.cfg.TRAIN.END2END:
self.data_name = ['data', 'im_info', 'gt_boxes']
else:
self.data_name = ['data']
self.feat_pyramid_level = np.log2(self.cfg.network.RPN_FEAT_STRIDE).astype(int)
# self.label_name = ['label_p' + str(x) for x in self.feat_pyramid_level] +\
# ['bbox_target_p' + str(x) for x in self.feat_pyramid_level] +\
# ['bbox_weight_p' + str(x) for x in self.feat_pyramid_level]
if self.cfg.network.base_net_lock:
self.label_name = []
else:
self.label_name = ['label', 'bbox_target', 'bbox_weight']
# status variable for synchronization between get_data and get_label
self.cur = 0
self.batch = None
self.data = None
self.label = None
self.img_fname= None
# get first batch to fill in provide_data and provide_label
self.reset()
self.get_batch_parallel()
@property
def provide_data(self):
return [[(k, v.shape) for k, v in zip(self.data_name, self.data[i])] for i in xrange(len(self.data))]
@property
def provide_label(self):
return [[(k, v.shape) for k, v in zip(self.label_name, self.label[i])] for i in xrange(len(self.data))]
@property
def provide_data_single(self):
return [(k, v.shape) for k, v in zip(self.data_name, self.data[0])]
@property
def provide_label_single(self):
return [(k, v.shape) for k, v in zip(self.label_name, self.label[0])]
def reset(self):
self.size = len(self.roidb)
self.cur = 0
if self.shuffle:
if self.aspect_grouping:
widths = np.array([r['width'] for r in self.roidb])
heights = np.array([r['height'] for r in self.roidb])
horz = (widths >= heights)
vert = np.logical_not(horz)
horz_inds = np.where(horz)[0]
vert_inds = np.where(vert)[0]
inds = np.hstack((np.random.permutation(horz_inds), np.random.permutation(vert_inds)))
extra = inds.shape[0] % self.batch_size
inds_ = np.reshape(inds[:-extra], (-1, self.batch_size))
row_perm = np.random.permutation(np.arange(inds_.shape[0]))
inds[:-extra] = np.reshape(inds_[row_perm, :], (-1,))
self.index = inds
else:
np.random.shuffle(self.index)
#self.apply_index_constraints()
if self.cfg.dataset.order_classes_incrementally:
self.order_classes_incrementally()
if self.cfg.dataset.balance_classes:
self.balance_classes()
def balance_classes(self):
num_ex_per_class = self.cfg.dataset.num_ex_per_class
cnts = np.zeros((10000))
sel_set=[]
sel_set_cats=[]
if config.dataset.cls_filter_files is not None:
fls = config.dataset.cls_filter_files.split(':')
with open(fls[0],'rb') as f:
cls2id_map = cPickle.load(f)
with open(fls[1]) as f:
classes2use = [x.strip().lower() for x in f.readlines()]
#classes2use = [x.strip() for x in f.readlines()]
clsIds2use = set()
for cls in classes2use:
clsIds2use.add(cls2id_map[cls])
self.cfg.dataset.clsIds2use = clsIds2use.copy()
self.cfg.dataset.clsIds2use.add(0)
for ix, cur in enumerate(self.index):
roi = self.roidb[cur]
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
if config.dataset.cls_filter_files is not None:
cats = np.array([x for x in cats if (x+1) in clsIds2use])
# else:
# cats = cats[cats < (self.cfg.dataset.NUM_CLASSES-1)]
if not cats.size:
continue
ix = np.argmin(cnts[cats])
if cnts[cats[ix]] < num_ex_per_class:
cnts[cats[ix]] += 1
else:
continue #not adding more examples, each epoch runs in random order of this
sel_set.append(cur)
sel_set_cats.append(cats)
sel_set=np.array(sel_set)
p = np.random.permutation(np.arange(len(sel_set)))
sel_set = sel_set[p]
self.index = sel_set
self.size = len(self.index)
print('total size {0}'.format(self.size))
def order_classes_incrementally(self):
num_ex_per_class = self.cfg.dataset.num_ex_per_class
num_ex_between_extras = self.cfg.dataset.num_ex_between_extras
cls=[x['gt_classes'] for x in self.roidb]
base_set=[]
num_classes = np.max([np.max(x['gt_classes']) for x in self.roidb])
base_flags = np.zeros((num_classes,),dtype=bool)
if self.cfg.dataset.num_ex_base_limit > 0:
base_cnts = np.zeros((10000))
for ix, cur in enumerate(self.index):
roi = self.roidb[cur]
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
is_base = True
if 'per_category_epoch_max' in roi:
m = float(roi['per_category_epoch_max'])
if m>0: # zero means disabled
is_base = False
base_flags[cats] = is_base
if is_base:
if self.cfg.dataset.num_ex_base_limit > 0:
ix = np.argmin(base_cnts[cats])
if base_cnts[cats[ix]] < self.cfg.dataset.num_ex_base_limit:
base_cnts[cats[ix]] += 1
else:
continue #not adding more examples, each epoch runs in random order of this
base_set.append(cur)
base_set=np.array(base_set)
inds=[]
extra_cat_inds=[i for i in range(len(base_flags)) if not base_flags[i]]
for iC, C in enumerate(extra_cat_inds):
print(C)
if iC > self.cfg.dataset.max_num_extra_classes:
break
base_set_ind = 0
cat_ix = np.array([i for i in range(len(cls)) if C+1 in cls[i]])
p = np.random.permutation(np.arange(len(cat_ix)))
cat_ix = cat_ix[p]
for iE in range(num_ex_per_class):
inds.append(np.array([cat_ix[iE]]))
inds.append(base_set[base_set_ind:base_set_ind+num_ex_between_extras])
if base_set_ind >= (len(base_set)-num_ex_between_extras):
base_set_ind = 0
else:
base_set_ind += num_ex_between_extras
base_set = np.concatenate((base_set,cat_ix[0:num_ex_per_class]))
p = np.random.permutation(np.arange(len(base_set)))
base_set = base_set[p]
inds=np.concatenate(inds)
self.index = inds
self.size = len(self.index)
print('total size {0}'.format(self.size))
def apply_index_constraints(self):
# self.roidb, per_category_epoch_max
# self.index
valid = np.ones(self.index.shape,dtype=bool)
num_classes = np.max([np.max(x['gt_classes']) for x in self.roidb])
cls_counts = np.zeros((num_classes,))
for ix, cur in enumerate(self.index):
roi = self.roidb[cur]
if 'per_category_epoch_max' in roi:
m = float(roi['per_category_epoch_max'])
if m>0: # zero means disabled
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
if np.any(cls_counts[cats] < m):
cls_counts[cats] += 1
else:
valid[ix] = False
self.index = self.index[valid]
self.size = len(self.index)
def iter_next(self):
return self.cur + self.batch_size <= self.size
def next(self):
if self.iter_next():
self.get_batch_parallel()
# self.get_batch()
self.cur += self.batch_size
return mx.io.DataBatch(data=self.data, label=self.label,
pad=self.getpad(), index=self.getindex(),
provide_data=self.provide_data, provide_label=self.provide_label)
else:
raise StopIteration
def getindex(self):
return self.cur / self.batch_size
def getpad(self):
if self.cur + self.batch_size > self.size:
return self.cur + self.batch_size - self.size
else:
return 0
def infer_shape(self, max_data_shape=None, max_label_shape=None):
""" Return maximum data and label shape for single gpu """
if max_data_shape is None:
max_data_shape = []
if max_label_shape is None:
max_label_shape = []
max_shapes = dict(max_data_shape + max_label_shape)
input_batch_size = max_shapes['data'][0]
im_info = [[max_shapes['data'][2], max_shapes['data'][3], 1.0]]
feat_shape = [y[1] for y in [x.infer_shape(**max_shapes) for x in self.feat_sym]]
label = assign_pyramid_anchor(feat_shape, np.zeros((0, 5)), im_info, self.cfg,
self.feat_strides, self.anchor_scales, self.anchor_ratios, self.allowed_border)
label = [label[k] for k in self.label_name]
label_shape = [(k, tuple([input_batch_size] + list(v.shape[1:]))) for k, v in zip(self.label_name, label)]
return max_data_shape, label_shape
def get_batch_parallel(self):
cur_from = self.cur
cur_to = min(cur_from + self.batch_size, self.size)
roidb = [self.roidb[self.index[i]] for i in range(cur_from, cur_to)]
# if len(roidb)>0:
# print('index '+str(self.index[cur_from]) )
# for entry in roidb:
# print(entry['image'])
# print('width '+ str(entry['width']))
# print('height ' + str(entry['height']))
# decide multi device slice
work_load_list = self.work_load_list
ctx = self.ctx
if work_load_list is None:
work_load_list = [1] * len(ctx)
assert isinstance(work_load_list, list) and len(work_load_list) == len(ctx), \
"Invalid settings for work load. "
slices = _split_input_slice(self.batch_size, work_load_list)
rst = []
for idx, islice in enumerate(slices):
iroidb = [roidb[i] for i in range(islice.start, islice.stop)]
rst.append(par_assign_anchor_wrapper(self.cfg, iroidb, self.feat_sym, self.feat_strides, self.anchor_scales,
self.anchor_ratios, self.allowed_border))
all_data = [_['data'] for _ in rst]
all_label = [_['label'] for _ in rst]
all_img_fname = [_['img_fname'] for _ in rst]
self.data = [[mx.nd.array(data[key]) for key in self.data_name] for data in all_data]
self.label = [[mx.nd.array(label[key]) for key in self.label_name] for label in all_label]
self.img_fname = all_img_fname
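
# Self-contained sketch (hypothetical helper, not used by the classes above) of the
# aspect-grouping shuffle performed in reset(): horizontal and vertical images are
# permuted separately, concatenated, then whole batches are permuted so each batch
# stays single-orientation. Includes the guard against the empty inds[:-0] slice.
import numpy as np

def _aspect_group_order(widths, heights, batch_size):
    widths = np.asarray(widths)
    heights = np.asarray(heights)
    horz_inds = np.where(widths >= heights)[0]
    vert_inds = np.where(widths < heights)[0]
    inds = np.hstack((np.random.permutation(horz_inds), np.random.permutation(vert_inds)))
    extra = inds.shape[0] % batch_size
    cut = inds.shape[0] - extra  # avoid the empty slice when extra == 0
    inds_ = np.reshape(inds[:cut], (-1, batch_size))
    row_perm = np.random.permutation(np.arange(inds_.shape[0]))
    inds[:cut] = np.reshape(inds_[row_perm, :], (-1,))
    return inds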
class PyramidAnchorIterator_resumable(mx.io.DataIter):

    # pool = Pool(processes=4)

    def __init__(self, feat_sym, roidb, cfg, batch_size=1, shuffle=False, ctx=None, work_load_list=None,
                 feat_strides=(4, 8, 16, 32, 64), anchor_scales=(8,), anchor_ratios=(0.5, 1, 2), allowed_border=0,
                 aspect_grouping=False, order=None):
        """
        This Iter will provide roi data to Fast R-CNN network
        :param feat_sym: to infer shape of assign_output
        :param roidb: must be preprocessed
        :param batch_size: must divide BATCH_SIZE(128)
        :param shuffle: bool
        :param ctx: list of contexts
        :param work_load_list: list of work load
        :param aspect_grouping: group images with similar aspects
        :param order: fixed roidb permutation for resuming (a random one is drawn if None)
        :return: AnchorLoader
        """
        super(PyramidAnchorIterator_resumable, self).__init__()

        # save parameters as properties
        self.feat_sym = feat_sym
        import random
        random.seed(901)
        self.roidb = roidb
        # reorder roidb by a stored permutation so iteration can be resumed later
        self.order = np.random.permutation(len(roidb)) if order is None else order
        self.roidb = [self.roidb[idx] for idx in self.order]
        self.cfg = cfg
        self.batch_size = batch_size
        self.shuffle = shuffle
        self.ctx = ctx
        if self.ctx is None:
            self.ctx = [mx.cpu()]
        self.work_load_list = work_load_list
        self.feat_strides = feat_strides
        self.anchor_scales = anchor_scales
        self.anchor_ratios = anchor_ratios
        self.allowed_border = allowed_border
        self.aspect_grouping = aspect_grouping

        # infer properties from roidb
        self.size = len(roidb)
        self.index = np.arange(self.size)

        # decide data and label names
        if self.cfg.TRAIN.END2END:
            self.data_name = ['data', 'im_info', 'gt_boxes']
        else:
            self.data_name = ['data']
        self.feat_pyramid_level = np.log2(self.cfg.network.RPN_FEAT_STRIDE).astype(int)
        # self.label_name = ['label_p' + str(x) for x in self.feat_pyramid_level] +\
        #                   ['bbox_target_p' + str(x) for x in self.feat_pyramid_level] +\
        #                   ['bbox_weight_p' + str(x) for x in self.feat_pyramid_level]
        if self.cfg.network.base_net_lock:
            self.label_name = []
        else:
            self.label_name = ['label', 'bbox_target', 'bbox_weight']

        # status variable for synchronization between get_data and get_label
        self.cur = 0
        self.batch = None
        self.data = None
        self.label = None
        self.img_fname = None

        # get first batch to fill in provide_data and provide_label
        self.reset()
        self.get_batch_parallel()

    @property
    def provide_data(self):
        return [[(k, v.shape) for k, v in zip(self.data_name, self.data[i])] for i in xrange(len(self.data))]

    @property
    def provide_label(self):
        return [[(k, v.shape) for k, v in zip(self.label_name, self.label[i])] for i in xrange(len(self.data))]

    @property
    def provide_data_single(self):
        return [(k, v.shape) for k, v in zip(self.data_name, self.data[0])]

    @property
    def provide_label_single(self):
        return [(k, v.shape) for k, v in zip(self.label_name, self.label[0])]

    def reset(self):
        self.size = len(self.roidb)
        self.cur = 0
        if self.shuffle:
            if self.aspect_grouping:
                widths = np.array([r['width'] for r in self.roidb])
                heights = np.array([r['height'] for r in self.roidb])
                horz = (widths >= heights)
                vert = np.logical_not(horz)
                horz_inds = np.where(horz)[0]
                vert_inds = np.where(vert)[0]
                inds = np.hstack((np.random.permutation(horz_inds), np.random.permutation(vert_inds)))
                extra = inds.shape[0] % self.batch_size
                cut = inds.shape[0] - extra  # inds[:-extra] would be an empty slice when extra == 0
                inds_ = np.reshape(inds[:cut], (-1, self.batch_size))
                row_perm = np.random.permutation(np.arange(inds_.shape[0]))
                inds[:cut] = np.reshape(inds_[row_perm, :], (-1,))
                self.index = inds
            else:
                np.random.shuffle(self.index)
        # self.apply_index_constraints()
        if self.cfg.dataset.order_classes_incrementally:
            self.order_classes_incrementally()
        if self.cfg.dataset.balance_classes:
            self.balance_classes()

    def balance_classes(self):
        num_ex_per_class = self.cfg.dataset.num_ex_per_class
        cnts = np.zeros(10000)
        sel_set = []
        sel_set_cats = []
        if config.dataset.cls_filter_files is not None:
            fls = config.dataset.cls_filter_files.split(':')
            with open(fls[0], 'rb') as f:
                cls2id_map = cPickle.load(f)
            with open(fls[1]) as f:
                classes2use = [x.strip().lower() for x in f.readlines()]
                # classes2use = [x.strip() for x in f.readlines()]
            clsIds2use = set()
            for cls in classes2use:
                clsIds2use.add(cls2id_map[cls])
            self.cfg.dataset.clsIds2use = clsIds2use.copy()
            self.cfg.dataset.clsIds2use.add(0)
        for ix, cur in enumerate(self.index):
            roi = self.roidb[cur]
            cats = roi['gt_classes'] - 1  # minus 1 for excluding BG
            if config.dataset.cls_filter_files is not None:
                cats = np.array([x for x in cats if (x + 1) in clsIds2use])
            # else:
            #     cats = cats[cats < (self.cfg.dataset.NUM_CLASSES - 1)]
            if not cats.size:
                continue
            ix = np.argmin(cnts[cats])
            if cnts[cats[ix]] < num_ex_per_class:
                cnts[cats[ix]] += 1
            else:
                continue  # not adding more examples; each epoch runs in random order of this
            sel_set.append(cur)
            sel_set_cats.append(cats)
        sel_set = np.array(sel_set)
        p = np.random.permutation(np.arange(len(sel_set)))
        sel_set = sel_set[p]
        self.index = sel_set
        self.size = len(self.index)
        print('total size {0}'.format(self.size))

    def order_classes_incrementally(self):
        num_ex_per_class = self.cfg.dataset.num_ex_per_class
        num_ex_between_extras = self.cfg.dataset.num_ex_between_extras
        cls = [x['gt_classes'] for x in self.roidb]
        base_set = []
        num_classes = np.max([np.max(x['gt_classes']) for x in self.roidb])
        base_flags = np.zeros((num_classes,), dtype=bool)
        if self.cfg.dataset.num_ex_base_limit > 0:
            base_cnts = np.zeros(10000)
        for ix, cur in enumerate(self.index):
            roi = self.roidb[cur]
            cats = roi['gt_classes'] - 1  # minus 1 for excluding BG
            is_base = True
            if 'per_category_epoch_max' in roi:
                m = float(roi['per_category_epoch_max'])
                if m > 0:  # zero means disabled
                    is_base = False
            base_flags[cats] = is_base
            if is_base:
                if self.cfg.dataset.num_ex_base_limit > 0:
                    ix = np.argmin(base_cnts[cats])
                    if base_cnts[cats[ix]] < self.cfg.dataset.num_ex_base_limit:
                        base_cnts[cats[ix]] += 1
                    else:
                        continue  # not adding more examples; each epoch runs in random order of this
                base_set.append(cur)
        base_set = np.array(base_set)
        inds = []
        extra_cat_inds = [i for i in range(len(base_flags)) if not base_flags[i]]
        for iC, C in enumerate(extra_cat_inds):
            print(C)
            if iC > self.cfg.dataset.max_num_extra_classes:
                break
            base_set_ind = 0
            cat_ix = np.array([i for i in range(len(cls)) if C + 1 in cls[i]])
            p = np.random.permutation(np.arange(len(cat_ix)))
            cat_ix = cat_ix[p]
            for iE in range(num_ex_per_class):
                inds.append(np.array([cat_ix[iE]]))
                inds.append(base_set[base_set_ind:base_set_ind + num_ex_between_extras])
                if base_set_ind >= (len(base_set) - num_ex_between_extras):
                    base_set_ind = 0
                else:
                    base_set_ind += num_ex_between_extras
            base_set = np.concatenate((base_set, cat_ix[0:num_ex_per_class]))
            p = np.random.permutation(np.arange(len(base_set)))
            base_set = base_set[p]
        inds = np.concatenate(inds)
        self.index = inds
        self.size = len(self.index)
        print('total size {0}'.format(self.size))

    def apply_index_constraints(self):
        # self.roidb, per_category_epoch_max
        # self.index
        valid = np.ones(self.index.shape, dtype=bool)
        num_classes = np.max([np.max(x['gt_classes']) for x in self.roidb])
        cls_counts = np.zeros((num_classes,))
        for ix, cur in enumerate(self.index):
            roi = self.roidb[cur]
            if 'per_category_epoch_max' in roi:
                m = float(roi['per_category_epoch_max'])
                if m > 0:  # zero means disabled
                    cats = roi['gt_classes'] - 1  # minus 1 for excluding BG
                    if np.any(cls_counts[cats] < m):
                        cls_counts[cats] += 1
                    else:
                        valid[ix] = False
        self.index = self.index[valid]
        self.size = len(self.index)

    def iter_next(self):
        return self.cur + self.batch_size <= self.size

    def next(self):
        if self.iter_next():
            self.get_batch_parallel()
            # self.get_batch()
            self.cur += self.batch_size
            return mx.io.DataBatch(data=self.data, label=self.label,
                                   pad=self.getpad(), index=self.getindex(),
                                   provide_data=self.provide_data, provide_label=self.provide_label)
        else:
            raise StopIteration

    def getindex(self):
        return self.cur // self.batch_size  # integer division; `/` yields a float on Python 3

    def getpad(self):
        if self.cur + self.batch_size > self.size:
            return self.cur + self.batch_size - self.size
        else:
            return 0

    def infer_shape(self, max_data_shape=None, max_label_shape=None):
        """ Return maximum data and label shape for single gpu """
        if max_data_shape is None:
            max_data_shape = []
        if max_label_shape is None:
            max_label_shape = []
        max_shapes = dict(max_data_shape + max_label_shape)
        input_batch_size = max_shapes['data'][0]
        im_info = [[max_shapes['data'][2], max_shapes['data'][3], 1.0]]
        feat_shape = [y[1] for y in [x.infer_shape(**max_shapes) for x in self.feat_sym]]
        label = assign_pyramid_anchor(feat_shape, np.zeros((0, 5)), im_info, self.cfg,
                                      self.feat_strides, self.anchor_scales, self.anchor_ratios, self.allowed_border)
        label = [label[k] for k in self.label_name]
        label_shape = [(k, tuple([input_batch_size] + list(v.shape[1:]))) for k, v in zip(self.label_name, label)]
        return max_data_shape, label_shape

    def get_batch_parallel(self):
        cur_from = self.cur
        cur_to = min(cur_from + self.batch_size, self.size)
        roidb = [self.roidb[self.index[i]] for i in range(cur_from, cur_to)]
        # if len(roidb) > 0:
        #     print('index ' + str(self.index[cur_from]))
        #     for entry in roidb:
        #         print(entry['image'])
        #         print('width ' + str(entry['width']))
        #         print('height ' + str(entry['height']))
        # decide multi device slice
        work_load_list = self.work_load_list
        ctx = self.ctx
        if work_load_list is None:
            work_load_list = [1] * len(ctx)
        assert isinstance(work_load_list, list) and len(work_load_list) == len(ctx), \
            "Invalid settings for work load. "
        slices = _split_input_slice(self.batch_size, work_load_list)
        rst = []
        for idx, islice in enumerate(slices):
            iroidb = [roidb[i] for i in range(islice.start, islice.stop)]
            rst.append(par_assign_anchor_wrapper(self.cfg, iroidb, self.feat_sym, self.feat_strides,
                                                 self.anchor_scales, self.anchor_ratios, self.allowed_border))
        all_data = [_['data'] for _ in rst]
        all_label = [_['label'] for _ in rst]
        all_img_fname = [_['img_fname'] for _ in rst]
        self.data = [[mx.nd.array(data[key]) for key in self.data_name] for data in all_data]
        self.label = [[mx.nd.array(label[key]) for key in self.label_name] for label in all_label]
        self.img_fname = all_img_fname
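
# Self-contained sketch (hypothetical helper, not used above) of the per-class
# capping applied by balance_classes(): walk the index and keep an example only
# while its least-represented class is still under the cap. The final random
# permutation from balance_classes() is omitted here to keep the sketch
# deterministic.
import numpy as np

def _cap_per_class(index, cats_per_roi, cap):
    cnts = np.zeros(10000)
    kept = []
    for cur in index:
        cats = np.asarray(cats_per_roi[cur])
        if not cats.size:
            continue
        j = np.argmin(cnts[cats])  # least-represented class present in this image
        if cnts[cats[j]] < cap:
            cnts[cats[j]] += 1
            kept.append(cur)
    return np.array(kept)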
# -----------------------------------------------------------------------------------
# PFP
# -----------------------------------------------------------------------------------

def par_assign_anchor_wrapper_pre_1(cfg, iroidb, feat_strides, anchor_scales, anchor_ratios, allowed_border):
    # get testing data for multigpu
    data, rpn_label, img_fname = get_rpn_batch(iroidb, cfg)
    _, im_fname = os.path.split(img_fname)
    rps_fname = os.path.join(cfg.work_root, 'precomputed_data', im_fname.replace('.jpg', '_feat.pkl'))
    with open(rps_fname, 'rb') as fid:
        state_data = cPickle.load(fid)
    fc_new_1_relu_pre = state_data['fc_new_1_relu_pre']
    label = state_data['label_pre']
    rois_pre = state_data['rois_pre']
    bbox_weight_pre = state_data['bbox_weight_pre']
    bbox_target_pre = state_data['bbox_target_pre']
    del data['data']
    data['fc_new_1_relu_pre'] = fc_new_1_relu_pre
    del data['im_info']
    data_shape = {k: v.shape for k, v in data.items()}
    # add gt_boxes to data for e2e
    # data['gt_boxes'] = rpn_label['gt_boxes'][np.newaxis, :, :]
    data['rois_pre'] = rois_pre
    data['bbox_weight_pre'] = bbox_weight_pre
    data['bbox_target_pre'] = bbox_target_pre
    # if not cfg.network.base_net_lock:
    #     feat_shape = [y[1] for y in [x.infer_shape(**data_shape) for x in feat_sym]]
    #     label = assign_pyramid_anchor(feat_shape, rpn_label['gt_boxes'], data['im_info'], cfg,
    #                                   feat_strides, anchor_scales, anchor_ratios, allowed_border)
    # else:
    #     label = None
    return {'data': data, 'label': label, 'img_fname': img_fname}


def par_assign_anchor_wrapper_pre_2(cfg, iroidb, feat_sym, feat_strides, anchor_scales, anchor_ratios, allowed_border,
                                    data_names):
    # get testing data for multigpu
    data, rpn_label, img_fname = get_rpn_batch(iroidb, cfg)
    # data_shape = {k: v.shape for k, v in data.items()}
    del data['data']
    # del data['im_info']
    _, im_fname = os.path.split(img_fname)
    rps_fname = os.path.join(cfg.work_root, 'precomputed_data', im_fname.replace('.jpg', '_feat.pkl'))
    with open(rps_fname, 'rb') as fid:
        state_data = cPickle.load(fid)
    data['fpn_p2_pre'] = state_data['fpn_p2_pre']
    data['fpn_p3_pre'] = state_data['fpn_p3_pre']
    data['fpn_p4_pre'] = state_data['fpn_p4_pre']
    data['fpn_p5_pre'] = state_data['fpn_p5_pre']
    data['fpn_p6_pre'] = state_data['fpn_p6_pre']
    data['label_pre'] = state_data['label_pre']
    # data_names = ['fpn_p2_pre', 'fpn_p3_pre', 'fpn_p4_pre', 'fpn_p5_pre', 'fpn_p6_pre']
    data_shape = {k: v.shape for k, v in data.items()}
    # add gt_boxes to data for e2e
    data['gt_boxes'] = rpn_label['gt_boxes'][np.newaxis, :, :]
    if not cfg.network.base_net_lock:
        # feat_shape = [y[1] for y in [x.infer_shape(**data_shape) for x in feat_sym]]
        feat_shape = [y[1] for y in [x.infer_shape(**{k: data[k].shape}) for x, k in zip(feat_sym, data_names)]]
        label = assign_pyramid_anchor(feat_shape, rpn_label['gt_boxes'], data['im_info'], cfg,
                                      feat_strides, anchor_scales, anchor_ratios, allowed_border)
    else:
        label = None
    return {'data': data, 'label': label, 'img_fname': img_fname}
class PyramidAnchorIterator_pre_1(mx.io.DataIter):

    # pool = Pool(processes=4)

    def __init__(self, roidb, cfg, batch_size=1, shuffle=False, ctx=None, work_load_list=None,
                 feat_strides=(4, 8, 16, 32, 64), anchor_scales=(8,), anchor_ratios=(0.5, 1, 2), allowed_border=0,
                 aspect_grouping=False):
        """
        This Iter will provide roi data to Fast R-CNN network
        :param roidb: must be preprocessed
        :param batch_size: must divide BATCH_SIZE(128)
        :param shuffle: bool
        :param ctx: list of contexts
        :param work_load_list: list of work load
        :param aspect_grouping: group images with similar aspects
        :return: AnchorLoader
        """
        super(PyramidAnchorIterator_pre_1, self).__init__()

        # save parameters as properties
        # self.feat_sym = feat_sym
        import random
        random.seed(901)
        self.roidb = roidb
        # call random.shuffle directly: `from random import shuffle` would shadow the
        # `shuffle` argument and make self.shuffle always truthy
        random.shuffle(self.roidb)
        self.cfg = cfg
        self.batch_size = batch_size
        self.shuffle = shuffle
        self.ctx = ctx
        if self.ctx is None:
            self.ctx = [mx.cpu()]
        self.work_load_list = work_load_list
        self.feat_strides = feat_strides
        self.anchor_scales = anchor_scales
        self.anchor_ratios = anchor_ratios
        self.allowed_border = allowed_border
        self.aspect_grouping = aspect_grouping

        # infer properties from roidb
        self.size = len(roidb)
        self.index = np.arange(self.size)

        # decide data and label names
        if self.cfg.TRAIN.END2END:
            self.data_name = ['fc_new_1_relu_pre', 'rois_pre', 'bbox_weight_pre', 'bbox_target_pre']  # 'gt_boxes',
        else:
            self.data_name = ['fc_new_1_relu_pre']
        self.feat_pyramid_level = np.log2(self.cfg.network.RPN_FEAT_STRIDE).astype(int)
        # self.label_name = ['label_p' + str(x) for x in self.feat_pyramid_level] +\
        #                   ['bbox_target_p' + str(x) for x in self.feat_pyramid_level] +\
        #                   ['bbox_weight_p' + str(x) for x in self.feat_pyramid_level]
        if self.cfg.network.base_net_lock:
            self.label_name = []
        else:
            self.label_name = ['label_pre', 'bbox_target_pre', 'bbox_weight_pre']
            # self.label_name = ['label_pre']

        # status variable for synchronization between get_data and get_label
        self.cur = 0
        self.batch = None
        self.data = None
        self.label = None
        self.img_fname = None

        # get first batch to fill in provide_data and provide_label
        self.reset()
        self.get_batch_parallel()

    @property
    def provide_data(self):
        return [[(k, v.shape) for k, v in zip(self.data_name, self.data[i])] for i in xrange(len(self.data))]

    @property
    def provide_label(self):
        return [[(k, v.shape) for k, v in zip(self.label_name, self.label[0])]]
        # return [[(k, v.shape) for k, v in zip(self.label_name, self.label[i])] for i in xrange(len(self.label))]

    @property
    def provide_data_single(self):
        return [(k, v.shape) for k, v in zip(self.data_name, self.data[0])]

    @property
    def provide_label_single(self):
        return [(k, v.shape) for k, v in zip(self.label_name, self.label[0])]

    def reset(self):
        self.size = len(self.roidb)
        self.cur = 0
        if self.shuffle:
            if self.aspect_grouping:
                widths = np.array([r['width'] for r in self.roidb])
                heights = np.array([r['height'] for r in self.roidb])
                horz = (widths >= heights)
                vert = np.logical_not(horz)
                horz_inds = np.where(horz)[0]
                vert_inds = np.where(vert)[0]
                inds = np.hstack((np.random.permutation(horz_inds), np.random.permutation(vert_inds)))
                extra = inds.shape[0] % self.batch_size
                cut = inds.shape[0] - extra  # inds[:-extra] would be an empty slice when extra == 0
                inds_ = np.reshape(inds[:cut], (-1, self.batch_size))
                row_perm = np.random.permutation(np.arange(inds_.shape[0]))
                inds[:cut] = np.reshape(inds_[row_perm, :], (-1,))
                self.index = inds
            else:
                np.random.shuffle(self.index)
        # self.apply_index_constraints()
        if self.cfg.dataset.order_classes_incrementally:
            self.order_classes_incrementally()
        if self.cfg.dataset.balance_classes:
            self.balance_classes()

    def balance_classes(self):
        num_ex_per_class = self.cfg.dataset.num_ex_per_class
        cnts = np.zeros(10000)
        sel_set = []
        sel_set_cats = []
        if config.dataset.cls_filter_files is not None:
            import cPickle
            fls = config.dataset.cls_filter_files.split(':')
            with open(fls[0], 'rb') as f:
                cls2id_map = cPickle.load(f)
            with open(fls[1]) as f:
                classes2use = [x.strip().lower() for x in f.readlines()]
                # classes2use = [x.strip() for x in f.readlines()]
            clsIds2use = set()
            for cls in classes2use:
                clsIds2use.add(cls2id_map[cls])
            self.cfg.dataset.clsIds2use = clsIds2use.copy()
            self.cfg.dataset.clsIds2use.add(0)
        for ix, cur in enumerate(self.index):
            roi = self.roidb[cur]
            cats = roi['gt_classes'] - 1  # minus 1 for excluding BG
            if config.dataset.cls_filter_files is None:
                cats = cats[cats < (self.cfg.dataset.NUM_CLASSES - 1)]
            else:
                cats = np.array([x for x in cats if (x + 1) in clsIds2use])
            if not cats.size:
                continue
            ix = np.argmin(cnts[cats])
            if cnts[cats[ix]] < num_ex_per_class:
                cnts[cats[ix]] += 1
            else:
                continue  # not adding more examples; each epoch runs in random order of this
            sel_set.append(cur)
            sel_set_cats.append(cats)
        sel_set = np.array(sel_set)
        p = np.random.permutation(np.arange(len(sel_set)))
        sel_set = sel_set[p]
        self.index = sel_set
        self.size = len(self.index)
        print('total size {0}'.format(self.size))

    def order_classes_incrementally(self):
        num_ex_per_class = self.cfg.dataset.num_ex_per_class
        num_ex_between_extras = self.cfg.dataset.num_ex_between_extras
        cls = [x['gt_classes'] for x in self.roidb]
        base_set = []
        num_classes = np.max([np.max(x['gt_classes']) for x in self.roidb])
        base_flags = np.zeros((num_classes,), dtype=bool)
        if self.cfg.dataset.num_ex_base_limit > 0:
            base_cnts = np.zeros(10000)
        for ix, cur in enumerate(self.index):
            roi = self.roidb[cur]
            cats = roi['gt_classes'] - 1  # minus 1 for excluding BG
            is_base = True
            if 'per_category_epoch_max' in roi:
                m = float(roi['per_category_epoch_max'])
                if m > 0:  # zero means disabled
                    is_base = False
            base_flags[cats] = is_base
            if is_base:
                if self.cfg.dataset.num_ex_base_limit > 0:
                    ix = np.argmin(base_cnts[cats])
                    if base_cnts[cats[ix]] < self.cfg.dataset.num_ex_base_limit:
                        base_cnts[cats[ix]] += 1
                    else:
                        continue  # not adding more examples; each epoch runs in random order of this
                base_set.append(cur)
        base_set = np.array(base_set)
        inds = []
        extra_cat_inds = [i for i in range(len(base_flags)) if not base_flags[i]]
        for iC, C in enumerate(extra_cat_inds):
            print(C)
            if iC > self.cfg.dataset.max_num_extra_classes:
                break
            base_set_ind = 0
            cat_ix = np.array([i for i in range(len(cls)) if C + 1 in cls[i]])
            p = np.random.permutation(np.arange(len(cat_ix)))
            cat_ix = cat_ix[p]
            for iE in range(num_ex_per_class):
                inds.append(np.array([cat_ix[iE]]))
                inds.append(base_set[base_set_ind:base_set_ind + num_ex_between_extras])
                if base_set_ind >= (len(base_set) - num_ex_between_extras):
                    base_set_ind = 0
                else:
                    base_set_ind += num_ex_between_extras
            base_set = np.concatenate((base_set, cat_ix[0:num_ex_per_class]))
            p = np.random.permutation(np.arange(len(base_set)))
            base_set = base_set[p]
        inds = np.concatenate(inds)
        self.index = inds
        self.size = len(self.index)
        print('total size {0}'.format(self.size))

    def apply_index_constraints(self):
        # self.roidb, per_category_epoch_max
        # self.index
        valid = np.ones(self.index.shape, dtype=bool)
        num_classes = np.max([np.max(x['gt_classes']) for x in self.roidb])
        cls_counts = np.zeros((num_classes,))
        for ix, cur in enumerate(self.index):
            roi = self.roidb[cur]
            if 'per_category_epoch_max' in roi:
                m = float(roi['per_category_epoch_max'])
                if m > 0:  # zero means disabled
                    cats = roi['gt_classes'] - 1  # minus 1 for excluding BG
                    if np.any(cls_counts[cats] < m):
                        cls_counts[cats] += 1
                    else:
                        valid[ix] = False
        self.index = self.index[valid]
        self.size = len(self.index)

    def iter_next(self):
        return self.cur + self.batch_size <= self.size

    def next(self):
        if self.iter_next():
            self.get_batch_parallel()
            # self.get_batch()
            self.cur += self.batch_size
            return mx.io.DataBatch(data=self.data, label=self.label,
                                   pad=self.getpad(), index=self.getindex(),
                                   provide_data=self.provide_data, provide_label=self.provide_label)
        else:
            raise StopIteration

    def getindex(self):
        return self.cur // self.batch_size  # integer division; `/` yields a float on Python 3

    def getpad(self):
        if self.cur + self.batch_size > self.size:
            return self.cur + self.batch_size - self.size
        else:
            return 0

    def infer_shape(self, max_data_shape=None, max_label_shape=None):
        """ Return maximum data and label shape for single gpu """
        if max_data_shape is None:
            max_data_shape = []
        if max_label_shape is None:
            max_label_shape = []
        max_shapes = dict(max_data_shape + max_label_shape)
        input_batch_size = max_shapes['data'][0]
        feat_info = [[max_shapes['data'][2], max_shapes['data'][3], 1.0]]
        return max_data_shape, max_label_shape

    def get_batch_parallel(self):
        cur_from = self.cur
        cur_to = min(cur_from + self.batch_size, self.size)
        roidb = [self.roidb[self.index[i]] for i in range(cur_from, cur_to)]
        # if len(roidb) > 0:
        #     print('index ' + str(self.index[cur_from]))
        #     for entry in roidb:
        #         print(entry['image'])
        #         print('width ' + str(entry['width']))
        #         print('height ' + str(entry['height']))
        # decide multi device slice
        work_load_list = self.work_load_list
        ctx = self.ctx
        if work_load_list is None:
            work_load_list = [1] * len(ctx)
        assert isinstance(work_load_list, list) and len(work_load_list) == len(ctx), \
            "Invalid settings for work load. "
        slices = _split_input_slice(self.batch_size, work_load_list)
        rst = []
        for idx, islice in enumerate(slices):
            iroidb = [roidb[i] for i in range(islice.start, islice.stop)]
            rst.append(par_assign_anchor_wrapper_pre_1(self.cfg, iroidb, self.feat_strides, self.anchor_scales,
                                                       self.anchor_ratios, self.allowed_border))
        all_data = [_['data'] for _ in rst]
        all_label = [_['label'] for _ in rst]
        all_img_fname = [_['img_fname'] for _ in rst]
        self.data = [[mx.nd.array(data[key]) for key in self.data_name] for data in all_data]
        self.label = [[mx.nd.array(label)] for label in all_label]
        self.img_fname = all_img_fname
class PyramidAnchorIterator_pre_2(mx.io.DataIter):
# pool = Pool(processes=4)
def __init__(self, feat_sym, roidb, cfg, batch_size=1, shuffle=False, ctx=None, work_load_list=None,
feat_strides=(4, 8, 16, 32, 64), anchor_scales=(8, ), anchor_ratios=(0.5, 1, 2), allowed_border=0,
aspect_grouping=False):
"""
This Iter will provide roi data to Fast R-CNN network
:param feat_sym: to infer shape of assign_output
:param roidb: must be preprocessed
:param batch_size: must divide BATCH_SIZE(128)
:param shuffle: bool
:param ctx: list of contexts
:param work_load_list: list of work load
:param aspect_grouping: group images with similar aspects
:return: AnchorLoader
"""
super(PyramidAnchorIterator_pre_2, self).__init__()
# save parameters as properties
self.feat_sym = feat_sym
import random
random.seed(901)
from random import shuffle
self.roidb = roidb
shuffle(self.roidb)
self.cfg = cfg
self.batch_size = batch_size
self.shuffle = shuffle
self.ctx = ctx
if self.ctx is None:
self.ctx = [mx.cpu()]
self.work_load_list = work_load_list
self.feat_strides = feat_strides
self.anchor_scales = anchor_scales
self.anchor_ratios = anchor_ratios
self.allowed_border = allowed_border
self.aspect_grouping = aspect_grouping
# infer properties from roidb
self.size = len(roidb)
self.index = np.arange(self.size)
# decide data and label names
if self.cfg.TRAIN.END2END:
self.data_name = ['fpn_p2_pre','fpn_p3_pre','fpn_p4_pre','fpn_p5_pre','fpn_p6_pre','im_info','gt_boxes']
else:
self.data_name = ['data']
self.feat_pyramid_level = np.log2(self.cfg.network.RPN_FEAT_STRIDE).astype(int)
# self.label_name = ['label_p' + str(x) for x in self.feat_pyramid_level] +\
# ['bbox_target_p' + str(x) for x in self.feat_pyramid_level] +\
# ['bbox_weight_p' + str(x) for x in self.feat_pyramid_level]
if self.cfg.network.base_net_lock:
self.label_name = []
else:
self.label_name = ['label', 'bbox_target', 'bbox_weight']
# status variable for synchronization between get_data and get_label
self.cur = 0
self.batch = None
self.data = None
self.label = None
self.img_fname= None
# get first batch to fill in provide_data and provide_label
self.reset()
self.get_batch_parallel()
@property
def provide_data(self):
        return [[(k, v.shape) for k, v in zip(self.data_name, self.data[i])] for i in range(len(self.data))]
@property
def provide_label(self):
        return [[(k, v.shape) for k, v in zip(self.label_name, self.label[i])] for i in range(len(self.data))]
@property
def provide_data_single(self):
return [(k, v.shape) for k, v in zip(self.data_name, self.data[0])]
@property
def provide_label_single(self):
return [(k, v.shape) for k, v in zip(self.label_name, self.label[0])]
def reset(self):
self.size = len(self.roidb)
self.cur = 0
if self.shuffle:
if self.aspect_grouping:
widths = np.array([r['width'] for r in self.roidb])
heights = np.array([r['height'] for r in self.roidb])
horz = (widths >= heights)
vert = np.logical_not(horz)
horz_inds = np.where(horz)[0]
vert_inds = np.where(vert)[0]
inds = np.hstack((np.random.permutation(horz_inds), np.random.permutation(vert_inds)))
                extra = inds.shape[0] % self.batch_size
                # use an explicit end index: 'inds[:-extra]' is an empty slice when extra == 0,
                # which would silently skip the cross-batch row permutation
                keep = inds.shape[0] - extra
                inds_ = np.reshape(inds[:keep], (-1, self.batch_size))
                row_perm = np.random.permutation(np.arange(inds_.shape[0]))
                inds[:keep] = np.reshape(inds_[row_perm, :], (-1,))
self.index = inds
else:
np.random.shuffle(self.index)
#self.apply_index_constraints()
if self.cfg.dataset.order_classes_incrementally:
self.order_classes_incrementally()
if self.cfg.dataset.balance_classes:
self.balance_classes()
def balance_classes(self):
num_ex_per_class = self.cfg.dataset.num_ex_per_class
cnts = np.zeros((10000))
sel_set=[]
sel_set_cats=[]
if config.dataset.cls_filter_files is not None:
fls = config.dataset.cls_filter_files.split(':')
with open(fls[0],'rb') as f:
cls2id_map = cPickle.load(f)
with open(fls[1]) as f:
classes2use = [x.strip().lower() for x in f.readlines()]
#classes2use = [x.strip() for x in f.readlines()]
clsIds2use = set()
for cls in classes2use:
clsIds2use.add(cls2id_map[cls])
self.cfg.dataset.clsIds2use = clsIds2use.copy()
self.cfg.dataset.clsIds2use.add(0)
for ix, cur in enumerate(self.index):
roi = self.roidb[cur]
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
if config.dataset.cls_filter_files is not None:
# cats = cats[cats < (self.cfg.dataset.NUM_CLASSES-1)]
# else:
cats = np.array([x for x in cats if (x+1) in clsIds2use])
if not cats.size:
continue
ix = np.argmin(cnts[cats])
if cnts[cats[ix]] < num_ex_per_class:
cnts[cats[ix]] += 1
else:
continue #not adding more examples, each epoch runs in random order of this
sel_set.append(cur)
sel_set_cats.append(cats)
sel_set=np.array(sel_set)
p = np.random.permutation(np.arange(len(sel_set)))
sel_set = sel_set[p]
self.index = sel_set
self.size = len(self.index)
print('total size {0}'.format(self.size))
def order_classes_incrementally(self):
num_ex_per_class = self.cfg.dataset.num_ex_per_class
num_ex_between_extras = self.cfg.dataset.num_ex_between_extras
cls=[x['gt_classes'] for x in self.roidb]
base_set=[]
num_classes = np.max([np.max(x['gt_classes']) for x in self.roidb])
base_flags = np.zeros((num_classes,),dtype=bool)
if self.cfg.dataset.num_ex_base_limit > 0:
base_cnts = np.zeros((10000))
for ix, cur in enumerate(self.index):
roi = self.roidb[cur]
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
is_base = True
if 'per_category_epoch_max' in roi:
m = float(roi['per_category_epoch_max'])
if m>0: # zero means disabled
is_base = False
base_flags[cats] = is_base
if is_base:
if self.cfg.dataset.num_ex_base_limit > 0:
ix = np.argmin(base_cnts[cats])
if base_cnts[cats[ix]] < self.cfg.dataset.num_ex_base_limit:
base_cnts[cats[ix]] += 1
else:
continue #not adding more examples, each epoch runs in random order of this
base_set.append(cur)
base_set=np.array(base_set)
inds=[]
extra_cat_inds=[i for i in range(len(base_flags)) if not base_flags[i]]
for iC, C in enumerate(extra_cat_inds):
print(C)
if iC > self.cfg.dataset.max_num_extra_classes:
break
base_set_ind = 0
cat_ix = np.array([i for i in range(len(cls)) if C+1 in cls[i]])
p = np.random.permutation(np.arange(len(cat_ix)))
cat_ix = cat_ix[p]
for iE in range(num_ex_per_class):
inds.append(np.array([cat_ix[iE]]))
inds.append(base_set[base_set_ind:base_set_ind+num_ex_between_extras])
if base_set_ind >= (len(base_set)-num_ex_between_extras):
base_set_ind = 0
else:
base_set_ind += num_ex_between_extras
base_set = np.concatenate((base_set,cat_ix[0:num_ex_per_class]))
p = np.random.permutation(np.arange(len(base_set)))
base_set = base_set[p]
inds=np.concatenate(inds)
self.index = inds
self.size = len(self.index)
print('total size {0}'.format(self.size))
def apply_index_constraints(self):
# self.roidb, per_category_epoch_max
# self.index
valid = np.ones(self.index.shape,dtype=bool)
num_classes = np.max([np.max(x['gt_classes']) for x in self.roidb])
cls_counts = np.zeros((num_classes,))
for ix, cur in enumerate(self.index):
roi = self.roidb[cur]
if 'per_category_epoch_max' in roi:
m = float(roi['per_category_epoch_max'])
if m>0: # zero means disabled
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
if np.any(cls_counts[cats] < m):
cls_counts[cats] += 1
else:
valid[ix] = False
self.index = self.index[valid]
self.size = len(self.index)
def iter_next(self):
return self.cur + self.batch_size <= self.size
def next(self):
if self.iter_next():
self.get_batch_parallel()
# self.get_batch()
self.cur += self.batch_size
return mx.io.DataBatch(data=self.data, label=self.label,
pad=self.getpad(), index=self.getindex(),
provide_data=self.provide_data, provide_label=self.provide_label)
else:
raise StopIteration
def getindex(self):
        return self.cur // self.batch_size  # integer batch index (true division under Python 3 would return a float)
def getpad(self):
if self.cur + self.batch_size > self.size:
return self.cur + self.batch_size - self.size
else:
return 0
def infer_shape(self, max_data_shape=None, max_label_shape=None):
""" Return maximum data and label shape for single gpu """
if max_data_shape is None:
max_data_shape = []
if max_label_shape is None:
max_label_shape = []
max_shapes = dict(max_data_shape + max_label_shape)
        input_batch_size = max_shapes['data'][0]  # computed but unused in this trimmed version
        feat_info = [[max_shapes['data'][2], max_shapes['data'][3], 1.0]]  # computed but unused in this trimmed version
return max_data_shape, max_label_shape
# def infer_shape(self, max_data_shape=None, max_label_shape=None):
# """ Return maximum data and label shape for single gpu """
# if max_data_shape is None:
# max_data_shape = []
# if max_label_shape is None:
# max_label_shape = []
# max_shapes = dict(max_data_shape + max_label_shape)
# #input_batch_size = max_shapes['data'][0]
# #im_info = [[max_shapes['data'][2], max_shapes['data'][3], 1.0]]
#
# feat_shape = [y[1] for y in [x.infer_shape(**max_shapes) for x in self.feat_sym]]
# label = assign_pyramid_anchor(feat_shape, np.zeros((0, 5)), im_info, self.cfg,
# self.feat_strides, self.anchor_scales, self.anchor_ratios, self.allowed_border)
# label = [label[k] for k in self.label_name]
# label_shape = [(k, tuple([input_batch_size] + list(v.shape[1:]))) for k, v in zip(self.label_name, label)]
#
# return max_data_shape, label_shape
def get_batch_parallel(self):
cur_from = self.cur
cur_to = min(cur_from + self.batch_size, self.size)
roidb = [self.roidb[self.index[i]] for i in range(cur_from, cur_to)]
# if len(roidb)>0:
# print('index '+str(self.index[cur_from]) )
# for entry in roidb:
# print(entry['image'])
# print('width '+ str(entry['width']))
# print('height ' + str(entry['height']))
# decide multi device slice
work_load_list = self.work_load_list
ctx = self.ctx
if work_load_list is None:
work_load_list = [1] * len(ctx)
assert isinstance(work_load_list, list) and len(work_load_list) == len(ctx), \
"Invalid settings for work load. "
slices = _split_input_slice(self.batch_size, work_load_list)
rst = []
for idx, islice in enumerate(slices):
iroidb = [roidb[i] for i in range(islice.start, islice.stop)]
rst.append(par_assign_anchor_wrapper_pre_2(self.cfg, iroidb, self.feat_sym, self.feat_strides, self.anchor_scales,
self.anchor_ratios, self.allowed_border,self.data_name))
all_data = [_['data'] for _ in rst]
all_label = [_['label'] for _ in rst]
all_img_fname = [_['img_fname'] for _ in rst]
self.data = [[mx.nd.array(data[key]) for key in self.data_name] for data in all_data]
self.label = [[mx.nd.array(label[key]) for key in self.label_name] for label in all_label]
self.img_fname = all_img_fname
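All three iterators in this file share the same cursor arithmetic in `iter_next`, `next`, and `getpad`. A toy, dependency-free walk-through of that logic (hypothetical sizes, not part of the original module):

```python
# Standalone illustration of the shared cursor logic with made-up sizes.
size, batch_size = 10, 4
cur = 0
batch_starts = []
while cur + batch_size <= size:      # iter_next(): only full batches are served
    batch_starts.append(cur)
    cur += batch_size                # next() advances the cursor afterwards
# getpad(): overhang if one more batch were forced out past the end
pad = cur + batch_size - size if cur + batch_size > size else 0
print(batch_starts, pad)
```

Because `iter_next` requires a full batch, the trailing `size % batch_size` examples are skipped each epoch; `getpad` reports how many slots would be padding if a partial batch were emitted.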
#-----------------------------------------------------------------------------------
# scene
#-----------------------------------------------------------------------------------
from rpn.rpn import get_rpn_batch_scene,get_rpn_batch_scene2
class PyramidAnchorIterator_scene(mx.io.DataIter):
# pool = Pool(processes=4)
def __init__(self, feat_sym, scenedb, cfg, batch_size=1, shuffle=False, ctx=None, work_load_list=None,
feat_strides=(4, 8, 16, 32, 64), anchor_scales=(8, ), anchor_ratios=(0.5, 1, 2), allowed_border=0,
aspect_grouping=False):
"""
This Iter will provide roi data to Fast R-CNN network
:param feat_sym: to infer shape of assign_output
:param scenedb: must be preprocessed
:param batch_size: must divide BATCH_SIZE(128)
:param shuffle: bool
:param ctx: list of contexts
:param work_load_list: list of work load
:param aspect_grouping: group images with similar aspects. Not implemented.
:return: AnchorLoader
"""
super(PyramidAnchorIterator_scene, self).__init__()
# save parameters as properties
self.feat_sym = feat_sym
import random
random.seed(901)
from random import shuffle
self.scenedb = scenedb
shuffle(self.scenedb)
self.cfg = cfg
self.batch_size = batch_size
self.shuffle = shuffle
self.ctx = ctx
if self.ctx is None:
self.ctx = [mx.cpu()]
self.work_load_list = work_load_list
self.feat_strides = feat_strides
self.anchor_scales = anchor_scales
self.anchor_ratios = anchor_ratios
self.allowed_border = allowed_border
# infer properties from scenedb
self.size = len(scenedb)
self.index = np.arange(self.size)
views_list = scenedb[0]['boxes_views'].keys()
if False: # 3D tfm lab
calib_file ='/dccstor/jsdata1/dev/RepMet/data/JES_pilot/cam_setup.txt'
from utils.JES3D_transform import JES3D_transform
Htfm = JES3D_transform(calib_file)
box_top = scenedb[0]['boxes_views']['top'][4]
pt_top = [box_top[0], box_top[1]]
pt_left = Htfm.trans_rot(pt_top, 2,0)
print(pt_left)
pt_right = Htfm.trans_rot(pt_top, 2, 1)
print(pt_right)
pt_top = [box_top[2], box_top[3]]
pt_left = Htfm.trans_rot(pt_top, 2,0)
print(pt_left)
pt_right = Htfm.trans_rot(pt_top, 2, 1)
print(pt_right)
a = 1
# decide data and label names
self.data_name =[]
if self.cfg.TRAIN.END2END:
#for view in views_list:
#self.data_name.append('data_' + view)
#self.data_name.append('im_info_' + view)
self.data_name.append('data')
self.data_name.append('im_info_top')
self.data_name.append('gt_boxes')
#self.data_name.append('homog_data')
else:
self.data_name.append('data')
# self.data_name.append(['data_' + view])
self.feat_pyramid_level = np.log2(self.cfg.network.RPN_FEAT_STRIDE).astype(int)
# self.label_name = ['label_p' + str(x) for x in self.feat_pyramid_level] +\
# ['bbox_target_p' + str(x) for x in self.feat_pyramid_level] +\
# ['bbox_weight_p' + str(x) for x in self.feat_pyramid_level]
self.label_name = []
if not self.cfg.network.base_net_lock:
# for view in views_list:
# self.label_name.append('label_' + view, 'bbox_target_' + view, 'bbox_weight_' + view])
self.label_name.extend(['label', 'bbox_target', 'bbox_weight'])
# status variable for synchronization between get_data and get_label
self.cur = 0
self.batch = None
self.data = None
self.label = None
self.img_fname= None
# get first batch to fill in provide_data and provide_label
self.reset()
self.get_batch_parallel()
@property
def provide_data(self):
        return [[(k, v.shape) for k, v in zip(self.data_name, self.data[i])] for i in range(len(self.data))]
@property
def provide_label(self):
        return [[(k, v.shape) for k, v in zip(self.label_name, self.label[i])] for i in range(len(self.data))]
@property
def provide_data_single(self):
return [(k, v.shape) for k, v in zip(self.data_name, self.data[0])]
@property
def provide_label_single(self):
return [(k, v.shape) for k, v in zip(self.label_name, self.label[0])]
def reset(self):
self.size = len(self.scenedb)
self.cur = 0
if self.shuffle:
np.random.shuffle(self.index)
#self.apply_index_constraints()
if self.cfg.dataset.order_classes_incrementally:
self.order_classes_incrementally()
if self.cfg.dataset.balance_classes:
self.balance_classes()
def balance_classes(self):
num_ex_per_class = self.cfg.dataset.num_ex_per_class
cnts = np.zeros((10000))
sel_set=[]
sel_set_cats=[]
if config.dataset.cls_filter_files is not None:
fls = config.dataset.cls_filter_files.split(':')
with open(fls[0],'rb') as f:
cls2id_map = cPickle.load(f)
with open(fls[1]) as f:
classes2use = [x.strip().lower() for x in f.readlines()]
#classes2use = [x.strip() for x in f.readlines()]
clsIds2use = set()
for cls in classes2use:
clsIds2use.add(cls2id_map[cls])
self.cfg.dataset.clsIds2use = clsIds2use.copy()
self.cfg.dataset.clsIds2use.add(0)
for ix, cur in enumerate(self.index):
roi = self.scenedb[cur]
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
if config.dataset.cls_filter_files is not None:
cats = np.array([x for x in cats if (x+1) in clsIds2use])
# else:
# cats = cats[cats < (self.cfg.dataset.NUM_CLASSES-1)]
if not cats.size:
continue
ix = np.argmin(cnts[cats])
if cnts[cats[ix]] < num_ex_per_class:
cnts[cats[ix]] += 1
else:
continue #not adding more examples, each epoch runs in random order of this
sel_set.append(cur)
sel_set_cats.append(cats)
sel_set=np.array(sel_set)
p = np.random.permutation(np.arange(len(sel_set)))
sel_set = sel_set[p]
self.index = sel_set
self.size = len(self.index)
print('total size {0}'.format(self.size))
def order_classes_incrementally(self):
num_ex_per_class = self.cfg.dataset.num_ex_per_class
num_ex_between_extras = self.cfg.dataset.num_ex_between_extras
cls=[x['gt_classes'] for x in self.scenedb]
base_set=[]
num_classes = np.max([np.max(x['gt_classes']) for x in self.scenedb])
base_flags = np.zeros((num_classes,),dtype=bool)
if self.cfg.dataset.num_ex_base_limit > 0:
base_cnts = np.zeros((10000))
for ix, cur in enumerate(self.index):
roi = self.scenedb[cur]
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
is_base = True
if 'per_category_epoch_max' in roi:
m = float(roi['per_category_epoch_max'])
if m>0: # zero means disabled
is_base = False
base_flags[cats] = is_base
if is_base:
if self.cfg.dataset.num_ex_base_limit > 0:
ix = np.argmin(base_cnts[cats])
if base_cnts[cats[ix]] < self.cfg.dataset.num_ex_base_limit:
base_cnts[cats[ix]] += 1
else:
continue #not adding more examples, each epoch runs in random order of this
base_set.append(cur)
base_set=np.array(base_set)
inds=[]
extra_cat_inds=[i for i in range(len(base_flags)) if not base_flags[i]]
for iC, C in enumerate(extra_cat_inds):
print(C)
if iC > self.cfg.dataset.max_num_extra_classes:
break
base_set_ind = 0
cat_ix = np.array([i for i in range(len(cls)) if C+1 in cls[i]])
p = np.random.permutation(np.arange(len(cat_ix)))
cat_ix = cat_ix[p]
for iE in range(num_ex_per_class):
inds.append(np.array([cat_ix[iE]]))
inds.append(base_set[base_set_ind:base_set_ind+num_ex_between_extras])
if base_set_ind >= (len(base_set)-num_ex_between_extras):
base_set_ind = 0
else:
base_set_ind += num_ex_between_extras
base_set = np.concatenate((base_set,cat_ix[0:num_ex_per_class]))
p = np.random.permutation(np.arange(len(base_set)))
base_set = base_set[p]
inds=np.concatenate(inds)
self.index = inds
self.size = len(self.index)
print('total size {0}'.format(self.size))
def apply_index_constraints(self):
# self.scenedb, per_category_epoch_max
# self.index
valid = np.ones(self.index.shape,dtype=bool)
num_classes = np.max([np.max(x['gt_classes']) for x in self.scenedb])
cls_counts = np.zeros((num_classes,))
for ix, cur in enumerate(self.index):
roi = self.scenedb[cur]
if 'per_category_epoch_max' in roi:
m = float(roi['per_category_epoch_max'])
if m>0: # zero means disabled
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
if np.any(cls_counts[cats] < m):
cls_counts[cats] += 1
else:
valid[ix] = False
self.index = self.index[valid]
self.size = len(self.index)
def iter_next(self):
return self.cur + self.batch_size <= self.size
def next(self):
if self.iter_next():
self.get_batch_parallel()
# self.get_batch()
self.cur += self.batch_size
return mx.io.DataBatch(data=self.data, label=self.label,
pad=self.getpad(), index=self.getindex(),
provide_data=self.provide_data, provide_label=self.provide_label)
else:
raise StopIteration
def getindex(self):
        return self.cur // self.batch_size  # integer batch index (true division under Python 3 would return a float)
def getpad(self):
if self.cur + self.batch_size > self.size:
return self.cur + self.batch_size - self.size
else:
return 0
def infer_shape(self, max_data_shape=None, max_label_shape=None):
""" Return maximum data and label shape for single gpu """
if max_data_shape is None:
max_data_shape = []
if max_label_shape is None:
max_label_shape = []
max_shapes = dict(max_data_shape + max_label_shape)
input_batch_size = max_shapes['data'][0]
im_info = [[max_shapes['data'][2], max_shapes['data'][3], 1.0]]
feat_shape = [y[1] for y in [x.infer_shape(**max_shapes) for x in self.feat_sym]]
label = assign_pyramid_anchor(feat_shape, np.zeros((0, 5)), im_info, self.cfg,
self.feat_strides, self.anchor_scales, self.anchor_ratios, self.allowed_border)
label = [label[k] for k in self.label_name]
label_shape = [(k, tuple([input_batch_size] + list(v.shape[1:]))) for k, v in zip(self.label_name, label)]
return max_data_shape, label_shape
def get_batch_parallel(self):
cur_from = self.cur
cur_to = min(cur_from + self.batch_size, self.size)
scenedb = [self.scenedb[self.index[i]] for i in range(cur_from, cur_to)]
# decide multi device slice
work_load_list = self.work_load_list
ctx = self.ctx
if work_load_list is None:
work_load_list = [1] * len(ctx)
assert isinstance(work_load_list, list) and len(work_load_list) == len(ctx), \
"Invalid settings for work load. "
slices = _split_input_slice(self.batch_size, work_load_list)
rst = []
for idx, islice in enumerate(slices):
iscenedb = [scenedb[i] for i in range(islice.start, islice.stop)]
rst.append(par_assign_anchor_wrapper_scene(self.cfg, iscenedb, self.feat_sym, self.feat_strides, self.anchor_scales,
self.anchor_ratios, self.allowed_border))
all_data = [_['data'] for _ in rst]
all_label = [_['label'] for _ in rst]
all_img_fname = [_['img_fname'] for _ in rst]
self.data = [[mx.nd.array(data[key]) for key in self.data_name] for data in all_data]
self.label = [[mx.nd.array(label[key]) for key in self.label_name] for label in all_label]
self.img_fname = all_img_fname
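`balance_classes` (defined identically in each iterator) caps how often any category is selected: an image is kept only while its least-represented ground-truth category is still below `num_ex_per_class`, and only that category's counter is incremented. A miniature, self-contained illustration with made-up per-image category lists:

```python
# Hypothetical data -- a reduced version of the capping rule in balance_classes.
import numpy as np

num_ex_per_class = 2
cnts = np.zeros(5)                       # per-category selection counts
images = [np.array([0]), np.array([0]), np.array([0, 1]),
          np.array([0]), np.array([1])]  # gt categories per image
selected = []
for i, cats in enumerate(images):
    j = np.argmin(cnts[cats])            # least-represented category in this image
    if cnts[cats[j]] < num_ex_per_class:
        cnts[cats[j]] += 1
        selected.append(i)
print(selected)
```

Image 3 is dropped here because category 0 already reached its quota, while image 2 survives via its rarer category 1.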
class PyramidAnchorIterator_scene2(mx.io.DataIter):
# pool = Pool(processes=4)
def __init__(self, feat_sym, roidb, cfg, batch_size=1, shuffle=False, ctx=None, work_load_list=None,
feat_strides=(4, 8, 16, 32, 64), anchor_scales=(8, ), anchor_ratios=(0.5, 1, 2), allowed_border=0,
aspect_grouping=False):
"""
This Iter will provide roi data to Fast R-CNN network
:param feat_sym: to infer shape of assign_output
:param roidb: must be preprocessed
:param batch_size: must divide BATCH_SIZE(128)
:param shuffle: bool
:param ctx: list of contexts
:param work_load_list: list of work load
:param aspect_grouping: group images with similar aspects
:return: AnchorLoader
"""
super(PyramidAnchorIterator_scene2, self).__init__()
# save parameters as properties
self.feat_sym = feat_sym
import random
random.seed(901)
from random import shuffle
self.roidb = roidb
shuffle(self.roidb)
self.cfg = cfg
self.batch_size = batch_size
self.shuffle = shuffle
self.ctx = ctx
if self.ctx is None:
self.ctx = [mx.cpu()]
self.work_load_list = work_load_list
self.feat_strides = feat_strides
self.anchor_scales = anchor_scales
self.anchor_ratios = anchor_ratios
self.allowed_border = allowed_border
self.aspect_grouping = aspect_grouping
# infer properties from roidb
self.size = len(roidb)
self.index = np.arange(self.size)
# decide data and label names
if self.cfg.TRAIN.END2END:
self.data_name = ['data', 'im_info', 'gt_boxes']
else:
self.data_name = ['data']
self.feat_pyramid_level = np.log2(self.cfg.network.RPN_FEAT_STRIDE).astype(int)
# self.label_name = ['label_p' + str(x) for x in self.feat_pyramid_level] +\
# ['bbox_target_p' + str(x) for x in self.feat_pyramid_level] +\
# ['bbox_weight_p' + str(x) for x in self.feat_pyramid_level]
if self.cfg.network.base_net_lock:
self.label_name = []
else:
self.label_name = ['label', 'bbox_target', 'bbox_weight']
# status variable for synchronization between get_data and get_label
self.cur = 0
self.batch = None
self.data = None
self.label = None
self.img_fname= None
# get first batch to fill in provide_data and provide_label
self.reset()
self.get_batch_parallel()
@property
def provide_data(self):
        return [[(k, v.shape) for k, v in zip(self.data_name, self.data[i])] for i in range(len(self.data))]
@property
def provide_label(self):
        return [[(k, v.shape) for k, v in zip(self.label_name, self.label[i])] for i in range(len(self.data))]
@property
def provide_data_single(self):
return [(k, v.shape) for k, v in zip(self.data_name, self.data[0])]
@property
def provide_label_single(self):
return [(k, v.shape) for k, v in zip(self.label_name, self.label[0])]
def reset(self):
self.size = len(self.roidb)
self.cur = 0
if self.shuffle:
if self.aspect_grouping:
widths = np.array([r['width'] for r in self.roidb])
heights = np.array([r['height'] for r in self.roidb])
horz = (widths >= heights)
vert = np.logical_not(horz)
horz_inds = np.where(horz)[0]
vert_inds = np.where(vert)[0]
inds = np.hstack((np.random.permutation(horz_inds), np.random.permutation(vert_inds)))
                extra = inds.shape[0] % self.batch_size
                # use an explicit end index: 'inds[:-extra]' is an empty slice when extra == 0,
                # which would silently skip the cross-batch row permutation
                keep = inds.shape[0] - extra
                inds_ = np.reshape(inds[:keep], (-1, self.batch_size))
                row_perm = np.random.permutation(np.arange(inds_.shape[0]))
                inds[:keep] = np.reshape(inds_[row_perm, :], (-1,))
self.index = inds
else:
np.random.shuffle(self.index)
#self.apply_index_constraints()
if self.cfg.dataset.order_classes_incrementally:
self.order_classes_incrementally()
if self.cfg.dataset.balance_classes:
self.balance_classes()
def balance_classes(self):
num_ex_per_class = self.cfg.dataset.num_ex_per_class
cnts = np.zeros((10000))
sel_set=[]
sel_set_cats=[]
if config.dataset.cls_filter_files is not None:
fls = config.dataset.cls_filter_files.split(':')
with open(fls[0],'rb') as f:
cls2id_map = cPickle.load(f)
with open(fls[1]) as f:
classes2use = [x.strip().lower() for x in f.readlines()]
#classes2use = [x.strip() for x in f.readlines()]
clsIds2use = set()
for cls in classes2use:
clsIds2use.add(cls2id_map[cls])
self.cfg.dataset.clsIds2use = clsIds2use.copy()
self.cfg.dataset.clsIds2use.add(0)
for ix, cur in enumerate(self.index):
roi = self.roidb[cur]
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
if config.dataset.cls_filter_files is not None:
cats = np.array([x for x in cats if (x+1) in clsIds2use])
# else:
# cats = cats[cats < (self.cfg.dataset.NUM_CLASSES-1)]
if not cats.size:
continue
ix = np.argmin(cnts[cats])
if cnts[cats[ix]] < num_ex_per_class:
cnts[cats[ix]] += 1
else:
continue #not adding more examples, each epoch runs in random order of this
sel_set.append(cur)
sel_set_cats.append(cats)
sel_set=np.array(sel_set)
p = np.random.permutation(np.arange(len(sel_set)))
sel_set = sel_set[p]
self.index = sel_set
self.size = len(self.index)
print('total size {0}'.format(self.size))
def order_classes_incrementally(self):
num_ex_per_class = self.cfg.dataset.num_ex_per_class
num_ex_between_extras = self.cfg.dataset.num_ex_between_extras
cls=[x['gt_classes'] for x in self.roidb]
base_set=[]
num_classes = np.max([np.max(x['gt_classes']) for x in self.roidb])
base_flags = np.zeros((num_classes,),dtype=bool)
if self.cfg.dataset.num_ex_base_limit > 0:
base_cnts = np.zeros((10000))
for ix, cur in enumerate(self.index):
roi = self.roidb[cur]
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
is_base = True
if 'per_category_epoch_max' in roi:
m = float(roi['per_category_epoch_max'])
if m>0: # zero means disabled
is_base = False
base_flags[cats] = is_base
if is_base:
if self.cfg.dataset.num_ex_base_limit > 0:
ix = np.argmin(base_cnts[cats])
if base_cnts[cats[ix]] < self.cfg.dataset.num_ex_base_limit:
base_cnts[cats[ix]] += 1
else:
continue #not adding more examples, each epoch runs in random order of this
base_set.append(cur)
base_set=np.array(base_set)
inds=[]
extra_cat_inds=[i for i in range(len(base_flags)) if not base_flags[i]]
for iC, C in enumerate(extra_cat_inds):
print(C)
if iC > self.cfg.dataset.max_num_extra_classes:
break
base_set_ind = 0
cat_ix = np.array([i for i in range(len(cls)) if C+1 in cls[i]])
p = np.random.permutation(np.arange(len(cat_ix)))
cat_ix = cat_ix[p]
for iE in range(num_ex_per_class):
inds.append(np.array([cat_ix[iE]]))
inds.append(base_set[base_set_ind:base_set_ind+num_ex_between_extras])
if base_set_ind >= (len(base_set)-num_ex_between_extras):
base_set_ind = 0
else:
base_set_ind += num_ex_between_extras
base_set = np.concatenate((base_set,cat_ix[0:num_ex_per_class]))
p = np.random.permutation(np.arange(len(base_set)))
base_set = base_set[p]
inds=np.concatenate(inds)
self.index = inds
self.size = len(self.index)
print('total size {0}'.format(self.size))
def apply_index_constraints(self):
# self.roidb, per_category_epoch_max
# self.index
valid = np.ones(self.index.shape,dtype=bool)
num_classes = np.max([np.max(x['gt_classes']) for x in self.roidb])
cls_counts = np.zeros((num_classes,))
for ix, cur in enumerate(self.index):
roi = self.roidb[cur]
if 'per_category_epoch_max' in roi:
m = float(roi['per_category_epoch_max'])
if m>0: # zero means disabled
cats = roi['gt_classes'] - 1 # minus 1 for excluding BG
if np.any(cls_counts[cats] < m):
cls_counts[cats] += 1
else:
valid[ix] = False
self.index = self.index[valid]
self.size = len(self.index)
def iter_next(self):
return self.cur + self.batch_size <= self.size
def next(self):
if self.iter_next():
self.get_batch_parallel()
# self.get_batch()
self.cur += self.batch_size
return mx.io.DataBatch(data=self.data, label=self.label,
pad=self.getpad(), index=self.getindex(),
provide_data=self.provide_data, provide_label=self.provide_label)
else:
raise StopIteration
def getindex(self):
        return self.cur // self.batch_size  # integer batch index (true division under Python 3 would return a float)
def getpad(self):
if self.cur + self.batch_size > self.size:
return self.cur + self.batch_size - self.size
else:
return 0
def infer_shape(self, max_data_shape=None, max_label_shape=None):
""" Return maximum data and label shape for single gpu """
if max_data_shape is None:
max_data_shape = []
if max_label_shape is None:
max_label_shape = []
max_shapes = dict(max_data_shape + max_label_shape)
input_batch_size = max_shapes['data'][0]
im_info = [[max_shapes['data'][2], max_shapes['data'][3], 1.0]]
feat_shape = [y[1] for y in [x.infer_shape(**max_shapes) for x in self.feat_sym]]
label = assign_pyramid_anchor(feat_shape, np.zeros((0, 5)), im_info, self.cfg,
self.feat_strides, self.anchor_scales, self.anchor_ratios, self.allowed_border)
label = [label[k] for k in self.label_name]
label_shape = [(k, tuple([input_batch_size] + list(v.shape[1:]))) for k, v in zip(self.label_name, label)]
return max_data_shape, label_shape
def get_batch_parallel(self):
cur_from = self.cur
cur_to = min(cur_from + self.batch_size, self.size)
roidb = [self.roidb[self.index[i]] for i in range(cur_from, cur_to)]
# if len(roidb)>0:
# print('index '+str(self.index[cur_from]) )
# for entry in roidb:
# print(entry['image'])
# print('width '+ str(entry['width']))
# print('height ' + str(entry['height']))
# decide multi device slice
work_load_list = self.work_load_list
ctx = self.ctx
if work_load_list is None:
work_load_list = [1] * len(ctx)
assert isinstance(work_load_list, list) and len(work_load_list) == len(ctx), \
"Invalid settings for work load. "
slices = _split_input_slice(self.batch_size, work_load_list)
rst = []
for idx, islice in enumerate(slices):
iroidb = [roidb[i] for i in range(islice.start, islice.stop)]
rst.append(par_assign_anchor_wrapper_scene2(self.cfg, iroidb, self.feat_sym, self.feat_strides, self.anchor_scales,
self.anchor_ratios, self.allowed_border))
all_data = [_['data'] for _ in rst]
all_label = [_['label'] for _ in rst]
all_img_fname = [_['img_fname'] for _ in rst]
self.data = [[mx.nd.array(data[key]) for key in self.data_name] for data in all_data]
self.label = [[mx.nd.array(label[key]) for key in self.label_name] for label in all_label]
self.img_fname = all_img_fname
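`order_classes_incrementally` interleaves each example of a new ("extra") class with a chunk of `num_ex_between_extras` base-set examples, wrapping around the base set when it runs out. The core interleaving, reduced to plain lists with hypothetical indices:

```python
# Hypothetical miniature of the interleaving loop in order_classes_incrementally.
base_set = [10, 11, 12, 13]          # indices of base-class examples
new_class_examples = [20, 21]        # indices of one extra class's examples
num_ex_between_extras = 2
inds, base_ind = [], 0
for e in new_class_examples:
    inds.append(e)                   # one new-class example ...
    inds.extend(base_set[base_ind:base_ind + num_ex_between_extras])  # ... then a base chunk
    # wrap around the base set, mirroring the original bookkeeping
    base_ind = 0 if base_ind >= len(base_set) - num_ex_between_extras else base_ind + num_ex_between_extras
print(inds)
```

The resulting epoch order alternates new-class examples with base rehearsal chunks, which is the incremental-learning schedule the method builds.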
def par_assign_anchor_wrapper_scene2(cfg, iroidb, feat_sym, feat_strides, anchor_scales, anchor_ratios, allowed_border):
# get testing data for multigpu
data, rpn_label, img_fname = get_rpn_batch_scene2(iroidb, cfg)
data_shape = {k: v.shape for k, v in data.items()}
del data_shape['im_info']
# add gt_boxes to data for e2e
data['gt_boxes'] = rpn_label['gt_boxes'][np.newaxis, :, :]
if not cfg.network.base_net_lock:
feat_shape = [y[1] for y in [x.infer_shape(**data_shape) for x in feat_sym]]
label = assign_pyramid_anchor(feat_shape, rpn_label['gt_boxes'], data['im_info'], cfg,
feat_strides, anchor_scales, anchor_ratios, allowed_border)
else:
label = None
return {'data': data, 'label': label,'img_fname':img_fname}
def par_assign_anchor_wrapper_scene(cfg, iroidb, feat_sym, feat_strides, anchor_scales, anchor_ratios, allowed_border):
# get testing data for multigpu
data, rpn_label, img_fname = get_rpn_batch_scene(iroidb, cfg)
data_shape = {k: v.shape for k, v in data.items()}
views_list = iroidb[0]['image_views'].keys()
for view in views_list:
del data_shape['im_info_'+view]
# del data_shape['homog_data']
# add gt_boxes to data for e2e
data['gt_boxes'] = rpn_label['gt_boxes'][np.newaxis, :, :]
# del data_shape['data_left']
# del data_shape['data_right']
#data_shape['data'] = data_shape['data_top']
#del data_shape['data_top']
if not cfg.network.base_net_lock:
feat_shape = [y[1] for y in [x.infer_shape(**data_shape) for x in feat_sym]]
label = assign_pyramid_anchor(feat_shape, rpn_label['gt_boxes'], [data['im_info_top'][0]], cfg,
feat_strides, anchor_scales, anchor_ratios, allowed_border)
else:
label = None
return {'data': data, 'label': label,'img_fname':img_fname}
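The wrappers above delegate anchor generation to `assign_pyramid_anchor`. For orientation, here is a minimal NumPy sketch of how base anchors for a single pyramid level are typically enumerated from a stride, a scale, and a set of aspect ratios; this is a simplified stand-in, not the project's implementation:

```python
# Simplified stand-in for base-anchor enumeration at one feature stride.
import numpy as np

def base_anchors(stride, scale, ratios):
    # start from a square of area (stride*scale)^2 centred on the first cell,
    # then reshape it per aspect ratio r = h/w
    size = (stride * scale) ** 2
    anchors = []
    for r in ratios:
        w = np.round(np.sqrt(size / r))
        h = np.round(w * r)
        cx = cy = (stride - 1) / 2.0
        anchors.append([cx - (w - 1) / 2, cy - (h - 1) / 2,
                        cx + (w - 1) / 2, cy + (h - 1) / 2])
    return np.array(anchors)

a = base_anchors(16, 8, (0.5, 1, 2))
print(a.shape)  # one (x1, y1, x2, y2) anchor per ratio
```

Repeating the same recipe per `feat_strides` entry yields the multi-level anchor sets consumed by the FPN heads.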
#-----------------------------------------------------------------------------------
# reg_bench/ode/__init__.py (from Ohjeah/regression-benchmarks, MIT license)
#-----------------------------------------------------------------------------------
] | null | null | null | from .integrate import generate_ode_data
from .not_so_simple_ode import *
from .simple_ode import *
from .simple_ode import all_loaders as simple_ode_loaders
all_loaders = {**simple_ode_loaders}
| 28 | 57 | 0.831633 | 31 | 196 | 4.83871 | 0.387097 | 0.3 | 0.3 | 0.253333 | 0.353333 | 0.353333 | 0.353333 | 0 | 0 | 0 | 0 | 0 | 0.112245 | 196 | 6 | 58 | 32.666667 | 0.862069 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.8 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d8aa1d252083ad0a466d83fd3be509bb4d19d77a | 4,924 | py | Python | krypy/_convenience.py | andrenarchy/krypy | 56f25817194edbe98b30e144986703a2a3137ff9 | [
"MIT"
] | 70 | 2015-01-15T02:22:53.000Z | 2022-02-19T09:52:13.000Z | krypy/_convenience.py | andrenarchy/krypy | 56f25817194edbe98b30e144986703a2a3137ff9 | [
"MIT"
] | 26 | 2015-07-08T22:01:44.000Z | 2020-12-18T11:40:02.000Z | krypy/_convenience.py | andrenarchy/krypy | 56f25817194edbe98b30e144986703a2a3137ff9 | [
"MIT"
] | 24 | 2015-01-15T09:31:45.000Z | 2022-01-03T00:30:23.000Z | import numpy
from .deflation import DeflatedCg, DeflatedGmres, DeflatedMinres
from .linsys import Cg, Gmres, LinearSystem, Minres
# The simplest inner product, `numpy.dot`, should work as an input.
# krypy assumes that the inner product _always_ returns a 2D matrix, which is
# why we need to wrap.
def wrap_inner_product(inner):
def _wrap(a, b):
if a.shape[1] == 0:
return numpy.array([[]])
return numpy.array([[inner(a[:, 0], b[:, 0])]])
return _wrap
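A quick standalone illustration of the wrapper above (restated here so the demo is self-contained, independent of the rest of krypy): wrapping `numpy.dot` turns its scalar result into the `(1, 1)` matrix shape the solvers expect.

```python
import numpy

# Restatement of wrap_inner_product for a self-contained demo: adapt a
# scalar-valued inner product to krypy's 2D-matrix return convention.
def wrap_inner_product(inner):
    def _wrap(a, b):
        if a.shape[1] == 0:
            return numpy.array([[]])  # no columns -> empty inner-product matrix
        return numpy.array([[inner(a[:, 0], b[:, 0])]])
    return _wrap

wrapped_dot = wrap_inner_product(numpy.dot)
a = numpy.array([[1.0], [2.0]])  # column vectors of shape (n, 1)
b = numpy.array([[3.0], [4.0]])
ip = wrapped_dot(a, b)  # a (1, 1) matrix holding the scalar 1*3 + 2*4 = 11.0
```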
def cg(
A,
b,
M=None,
Minv=None,
Ml=None,
Mr=None,
inner_product=None,
exact_solution=None,
x0=None,
U=None,
tol=1e-5,
maxiter=None,
use_explicit_residual=False,
store_arnoldi=False,
):
assert len(A.shape) == 2
assert A.shape[0] == A.shape[1]
assert A.shape[1] == b.shape[0]
if inner_product:
inner_product = wrap_inner_product(inner_product)
# Make sure that the input vectors have two dimensions
if U is not None:
U = U.reshape(U.shape[0], -1)
if x0 is not None:
x0 = x0.reshape(x0.shape[0], -1)
linear_system = LinearSystem(
A=A,
b=b,
M=M,
Minv=Minv,
Ml=Ml,
ip_B=inner_product,
# Setting those to `True` simply avoids a warning.
self_adjoint=True,
positive_definite=True,
exact_solution=exact_solution,
)
if U is None:
out = Cg(
linear_system,
x0=x0,
tol=tol,
maxiter=maxiter,
explicit_residual=use_explicit_residual,
store_arnoldi=store_arnoldi,
)
else:
out = DeflatedCg(
linear_system,
x0=x0,
U=U,
tol=tol,
maxiter=maxiter,
explicit_residual=use_explicit_residual,
store_arnoldi=store_arnoldi,
)
return out.xk.reshape(b.shape) if out.resnorms[-1] < out.tol else None, out
def minres(
A,
b,
M=None,
Minv=None,
Ml=None,
Mr=None,
inner_product=None,
exact_solution=None,
ortho="mgs",
x0=None,
U=None,
tol=1e-5,
maxiter=None,
use_explicit_residual=False,
store_arnoldi=False,
):
assert len(A.shape) == 2
assert A.shape[0] == A.shape[1]
assert A.shape[1] == b.shape[0]
if inner_product:
inner_product = wrap_inner_product(inner_product)
# Make sure that the input vectors have two dimensions
if U is not None:
U = U.reshape(U.shape[0], -1)
if x0 is not None:
x0 = x0.reshape(x0.shape[0], -1)
linear_system = LinearSystem(
A=A,
b=b,
M=M,
Minv=Minv,
Ml=Ml,
ip_B=inner_product,
        # setting self_adjoint=True avoids a warning
self_adjoint=True,
exact_solution=exact_solution,
)
if U is None:
out = Minres(
linear_system,
ortho=ortho,
x0=x0,
tol=tol,
maxiter=maxiter,
explicit_residual=use_explicit_residual,
store_arnoldi=store_arnoldi,
)
else:
out = DeflatedMinres(
linear_system,
ortho=ortho,
x0=x0,
U=U,
tol=tol,
maxiter=maxiter,
explicit_residual=use_explicit_residual,
store_arnoldi=store_arnoldi,
)
return out.xk.reshape(b.shape) if out.resnorms[-1] < out.tol else None, out
def gmres(
A,
b,
M=None,
Minv=None,
Ml=None,
Mr=None,
inner_product=None,
exact_solution=None,
ortho="mgs",
x0=None,
U=None,
tol=1e-5,
maxiter=None,
use_explicit_residual=False,
store_arnoldi=False,
):
assert len(A.shape) == 2
assert A.shape[0] == A.shape[1]
assert A.shape[1] == b.shape[0]
if inner_product:
inner_product = wrap_inner_product(inner_product)
# Make sure that the input vectors have two dimensions
if U is not None:
U = U.reshape(U.shape[0], -1)
if x0 is not None:
x0 = x0.reshape(x0.shape[0], -1)
linear_system = LinearSystem(
A=A,
b=b,
M=M,
Minv=Minv,
Ml=Ml,
ip_B=inner_product,
exact_solution=exact_solution,
)
if U is None:
out = Gmres(
linear_system,
ortho=ortho,
x0=x0,
tol=tol,
maxiter=maxiter,
explicit_residual=use_explicit_residual,
store_arnoldi=store_arnoldi,
)
else:
out = DeflatedGmres(
linear_system,
ortho=ortho,
x0=x0,
U=U,
tol=tol,
maxiter=maxiter,
explicit_residual=use_explicit_residual,
store_arnoldi=store_arnoldi,
)
return out.xk.reshape(b.shape) if out.resnorms[-1] < out.tol else None, out
| 23.673077 | 88 | 0.558692 | 638 | 4,924 | 4.169279 | 0.148903 | 0.094737 | 0.064286 | 0.054135 | 0.825564 | 0.825564 | 0.805263 | 0.805263 | 0.805263 | 0.790977 | 0 | 0.020955 | 0.340983 | 4,924 | 207 | 89 | 23.78744 | 0.798767 | 0.083469 | 0 | 0.844444 | 0 | 0 | 0.001332 | 0 | 0 | 0 | 0 | 0 | 0.05 | 1 | 0.027778 | false | 0 | 0.016667 | 0 | 0.077778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d8e2814e81b1a823f589405c4300ad37ec9f25f6 | 5,629 | py | Python | keepitpossible/common/action_table.py | ChenKuanSun/TheObstacleTowerChallenge | c2de16930dd88949c0bc6a460f378beae3a04204 | [
"Apache-2.0"
] | null | null | null | keepitpossible/common/action_table.py | ChenKuanSun/TheObstacleTowerChallenge | c2de16930dd88949c0bc6a460f378beae3a04204 | [
"Apache-2.0"
] | null | null | null | keepitpossible/common/action_table.py | ChenKuanSun/TheObstacleTowerChallenge | c2de16930dd88949c0bc6a460f378beae3a04204 | [
"Apache-2.0"
] | null | null | null | # [0, 0, 0, 0], 原地不動
# [0, 0, 0, 1], 原地+右平移
# [0, 0, 0, 2], 原地+左平移
# [0, 0, 1, 0], 原地原地跳
# [0, 0, 1, 1], 原地+右平移
# [0, 0, 1, 2], 原地+左平移
# [0, 1, 0, 0], 原地+右轉
# [0, 1, 0, 1], 原地+右轉+右平移
# [0, 1, 0, 2], 原地+右轉+左平移
# [0, 1, 1, 0], 原地+右轉+跳
# [0, 1, 1, 1], 原地+右轉+跳+右平移
# [0, 1, 1, 2], 原地+右轉+跳+左平移
# [0, 2, 0, 0], 原地+左轉
# [0, 2, 0, 1], 原地+左轉+右平移
# [0, 2, 0, 2], 原地+左轉+左平移
# [0, 2, 1, 0], 原地+左轉+跳
# [0, 2, 1, 1], 原地+左轉+跳+右平移
# [0, 2, 1, 2], 原地+左轉+跳+左平移
# [1, 0, 0, 0], 前進
# [1, 0, 0, 1], 前進+右平移
# [1, 0, 0, 2], 前進+左平移
# [1, 0, 1, 0], 前進+跳
# [1, 0, 1, 1], 前進+跳+右平移
# [1, 0, 1, 2], 前進+跳+左平移
# [1, 1, 0, 0], 前進+右轉
# [1, 1, 0, 1], 前進+右轉+右平移
# [1, 1, 0, 2], 前進+右轉+左平移
# [1, 1, 1, 0], 前進+右轉+跳
# [1, 1, 1, 1], 前進+右轉+跳+右平移
# [1, 1, 1, 2], 前進+右轉+跳+左平移
# [1, 2, 0, 0], 前進+左轉
# [1, 2, 0, 1], 前進+左轉+右平移
# [1, 2, 0, 2], 前進+左轉+左平移
# [1, 2, 1, 0], 前進+左轉+跳
# [1, 2, 1, 1], 前進+左轉+跳+右平移
# [1, 2, 1, 2], 前進+左轉+跳+左平移
# [2, 0, 0, 0], 後退
# [2, 0, 0, 1], 後退+右平移
# [2, 0, 0, 2], 後退+左平移
# [2, 0, 1, 0], 後退+跳
# [2, 0, 1, 1], 後退+跳+右平移
# [2, 0, 1, 2], 後退+跳+左平移
# [2, 1, 0, 0], 後退+右轉
# [2, 1, 0, 1], 後退+右轉+右平移
# [2, 1, 0, 2], 後退+右轉+左平移
# [2, 1, 1, 0], 後退+右轉+跳
# [2, 1, 1, 1], 後退+右轉+跳+右平移
# [2, 1, 1, 2], 後退+右轉+跳+左平移
# [2, 2, 0, 0], 後退+左轉
# [2, 2, 0, 1], 後退+左轉+右平移
# [2, 2, 0, 2], 後退+左轉+左平移
# [2, 2, 1, 0], 後退+左轉+跳
# [2, 2, 1, 1], 後退+左轉+跳+右平移
# [2, 2, 1, 2], 後退+左轉+跳+左平移
#跳占用 10偵 每偵為time 5 => 跳一次要花35遊戲時間
def create_action_table():
table_action = []
    # action 0: turn right in place
table_action.append([
[0, 1, 0, 0]
])
    # action 1: turn left in place
table_action.append([
[0, 2, 0, 0]
])
    # action 2: move forward
table_action.append([
[1, 0, 0, 0]
])
    # action 3: forward + turn right
table_action.append([
[1, 1, 0, 0]
])
    # action 4: forward + turn left
table_action.append([
[1, 2, 0, 0]
])
    # action 5: forward + jump (big forward jump)
table_action.append([
[1, 0, 1, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
])
    # action 6: forward + turn right + jump (big forward-right jump), turn arc ~70 degrees
table_action.append([
[1, 1, 1, 0],
[1, 1, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
])
    # action 7: forward + turn left + jump (big forward-left jump), turn arc ~70 degrees
table_action.append([
[1, 2, 1, 0],
[1, 2, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
])
    # action 8: forward + turn right + jump (small forward-right jump), turn arc ~70 degrees
table_action.append([
[1, 1, 1, 0],
[1, 1, 0, 0],
[1, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
])
    # action 9: forward + turn left + jump (small forward-left jump), turn arc ~70 degrees
table_action.append([
[1, 2, 1, 0],
[1, 2, 0, 0],
[1, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
])
    # action 10: stay in place
table_action.append([
[0, 0, 0, 0]
])
return table_action
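The tables above map a discrete action id to a list of per-frame `[move, turn, jump, strafe]` commands. A minimal sketch of how an agent loop might replay one macro action follows; the `env_step` callback and its gym-style `(obs, reward, done, info)` return signature are assumptions for illustration, not part of the original code.

```python
# Hypothetical replay loop for one macro action; env_step is assumed to follow
# a gym-like signature returning (obs, reward, done, info).
table_action = [
    [[0, 1, 0, 0]],                       # turn right in place (1 frame)
    [[1, 0, 1, 0]] + [[1, 0, 0, 0]] * 9,  # forward jump held for 10 frames
]

def run_action(env_step, action_id, table=table_action):
    total_reward = 0.0
    for command in table[action_id]:      # one [move, turn, jump, strafe] per frame
        _obs, reward, done, _info = env_step(command)
        total_reward += reward
        if done:                          # stop replaying if the episode ended
            break
    return total_reward
```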
def create_rainbow_action_table():
    table_action = [[0, 1, 0, 0],  # action 0: turn right in place
                    [0, 2, 0, 0],  # action 1: turn left in place
                    [1, 0, 0, 0],  # action 2: move forward
                    [1, 1, 0, 0],  # action 3: forward + turn right
                    [1, 2, 0, 0],  # action 4: forward + turn left
                    [1, 0, 1, 0],  # action 5: forward jump
                    [0, 0, 0, 1],  # action 6: strafe right
                    [0, 0, 0, 2]]  # action 7: strafe left
return table_action
def create_rainbow_old_action_table():
table_action = []
    # action 0: turn right in place
table_action.append([
[0, 1, 0, 0]
])
    # action 1: turn left in place
table_action.append([
[0, 2, 0, 0]
])
    # action 2: move forward
table_action.append([
[1, 0, 0, 0]
])
    # action 3: forward + turn right
table_action.append([
[1, 1, 0, 0]
])
    # action 4: forward + turn left
table_action.append([
[1, 2, 0, 0]
])
    # action 5: forward + jump (big forward jump)
table_action.append([
[1, 0, 1, 0]
])
    # action 6: forward + turn right + jump (big forward-right jump), turn arc ~70 degrees
table_action.append([
[1, 1, 1, 0],
[1, 1, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
])
    # action 7: forward + turn left + jump (big forward-left jump), turn arc ~70 degrees
table_action.append([
[1, 2, 1, 0],
[1, 2, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
[1, 0, 0, 0],
])
    # action 8: forward + turn right + jump (small forward-right jump), turn arc ~70 degrees
table_action.append([
[1, 1, 1, 0],
[1, 1, 0, 0],
[1, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
])
    # action 9: forward + turn left + jump (small forward-left jump), turn arc ~70 degrees
table_action.append([
[1, 2, 1, 0],
[1, 2, 0, 0],
[1, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
])
    # action 10: stay in place
table_action.append([
[0, 0, 0, 0]
])
return table_action
| 22.97551 | 45 | 0.333985 | 977 | 5,629 | 1.886387 | 0.048106 | 0.284319 | 0.279978 | 0.249593 | 0.642973 | 0.634292 | 0.607705 | 0.607705 | 0.607705 | 0.607705 | 0 | 0.222465 | 0.430627 | 5,629 | 244 | 46 | 23.069672 | 0.352574 | 0.292414 | 0 | 0.95092 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.018405 | false | 0 | 0 | 0 | 0.03681 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
2b12143db907b5d631db3f32ec4b0ce2ebc99481 | 134,792 | py | Python | 001 water well/grid1.py | Liuzkai/3D_Scene | b3dcdfa8d13b8e0e0f2476ada790499d3b3455c2 | [
"MIT"
] | null | null | null | 001 water well/grid1.py | Liuzkai/3D_Scene | b3dcdfa8d13b8e0e0f2476ada790499d3b3455c2 | [
"MIT"
] | null | null | null | 001 water well/grid1.py | Liuzkai/3D_Scene | b3dcdfa8d13b8e0e0f2476ada790499d3b3455c2 | [
"MIT"
] | null | null | null | # Code for /obj/grid1
hou_node = hou.node("/obj/grid1")
hou_parent = hou_node.parent()
hou_node.setSelectableInViewport(True)
hou_node.showOrigin(False)
hou_node.useXray(False)
hou_node.setDisplayFlag(True)
hou_node.hide(False)
hou_node.setSelected(False)
hou_parm_template_group = hou.ParmTemplateGroup()
# Code for parameter template
hou_parm_template = hou.FolderParmTemplate("stdswitcher4", "Transform", folder_type=hou.folderType.Tabs, default_value=0, ends_tab_group=False)
# Code for parameter template
hou_parm_template2 = hou.MenuParmTemplate("xOrd", "Transform Order", menu_items=(["srt","str","rst","rts","tsr","trs"]), menu_labels=(["Scale Rot Trans","Scale Trans Rot","Rot Scale Trans","Rot Trans Scale","Trans Scale Rot","Trans Rot Scale"]), default_value=0, icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False, is_button_strip=False, strip_uses_icons=False)
hou_parm_template2.setJoinWithNext(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.MenuParmTemplate("rOrd", "Rotate Order", menu_items=(["xyz","xzy","yxz","yzx","zxy","zyx"]), menu_labels=(["Rx Ry Rz","Rx Rz Ry","Ry Rx Rz","Ry Rz Rx","Rz Rx Ry","Rz Ry Rx"]), default_value=0, icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False, is_button_strip=False, strip_uses_icons=False)
hou_parm_template2.hideLabel(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("t", "Translate", 3, default_value=([0, 0, 0]), min=0, max=10, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.XYZW)
hou_parm_template2.setTags({"autoscope": "1111111111111111111111111111111", "script_action": "import objecttoolutils\nobjecttoolutils.matchTransform(kwargs, 0)", "script_action_help": "Select an object to match the translation with.", "script_action_icon": "BUTTONS_match_transform"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("r", "Rotate", 3, default_value=([0, 0, 0]), min=0, max=360, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.XYZW)
hou_parm_template2.setTags({"autoscope": "1111111111111111111111111111111", "script_action": "import objecttoolutils\nobjecttoolutils.matchTransform(kwargs, 1)", "script_action_help": "Select an object to match the rotation with.", "script_action_icon": "BUTTONS_match_rotation"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("s", "Scale", 3, default_value=([1, 1, 1]), min=0, max=10, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.XYZW)
hou_parm_template2.setTags({"autoscope": "1111111111111111111111111111111", "script_action": "import objecttoolutils\nobjecttoolutils.matchTransform(kwargs, 2)", "script_action_help": "Select an object to match the scale with.", "script_action_icon": "BUTTONS_match_scale"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("p", "Pivot Translate", 3, default_value=([0, 0, 0]), min=0, max=10, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.XYZW)
hou_parm_template2.setTags({"script_action": "import objecttoolutils\nobjecttoolutils.matchTransform(kwargs, 3)", "script_action_help": "Select an object to match the pivot with.", "script_action_icon": "BUTTONS_match_pivot"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("pr", "Pivot Rotate", 3, default_value=([0, 0, 0]), min=0, max=10, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.XYZW)
hou_parm_template2.setTags({"script_action": "import objecttoolutils\nobjecttoolutils.matchTransform(kwargs, 4)", "script_action_help": "Select an object to match the pivot rotation with.", "script_action_icon": "BUTTONS_match_pivot_rotation"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("scale", "Uniform Scale", 1, default_value=([1]), min=0, max=10, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.Base1)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.MenuParmTemplate("pre_xform", "Modify Pre-Transform", menu_items=(["clean","cleantrans","cleanrot","cleanscales","extract","reset"]), menu_labels=(["Clean Transform","Clean Translates","Clean Rotates","Clean Scales","Extract Pre-transform","Reset Pre-transform"]), default_value=0, icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.StringReplace, menu_use_token=False, is_button_strip=False, strip_uses_icons=False)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.ToggleParmTemplate("keeppos", "Keep Position When Parenting", default_value=False)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.ToggleParmTemplate("childcomp", "Child Compensation", default_value=False)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.ToggleParmTemplate("constraints_on", "Enable Constraints", default_value=False)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.StringParmTemplate("constraints_path", "Constraints", 1, default_value=([""]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.NodeReference, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template2.setConditional(hou.parmCondType.HideWhen, "{ constraints_on == 0 }")
hou_parm_template2.setTags({"opfilter": "!!CHOP", "oprelative": ".", "script_action": "import objecttoolutils\nobjecttoolutils.constraintsMenu(kwargs)", "script_action_help": "", "script_action_icon": "BUTTONS_add_constraints"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.StringParmTemplate("lookatpath", "Look At", 1, default_value=([""]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.NodeReference, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template2.hide(True)
hou_parm_template2.setTags({"opfilter": "!!OBJ!!", "oprelative": "."})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.StringParmTemplate("lookupobjpath", "Look Up Object", 1, default_value=([""]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.NodeReference, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template2.hide(True)
hou_parm_template2.setTags({"opfilter": "!!OBJ!!", "oprelative": "."})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.StringParmTemplate("lookup", "Look At Up Vector", 1, default_value=(["on"]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.Regular, menu_items=(["off","on","quat","pos","obj"]), menu_labels=(["Don't Use Up Vector","Use Up Vector","Use Quaternions","Use Global Position","Use Up Object"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template2.hide(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.StringParmTemplate("pathobjpath", "Path Object", 1, default_value=([""]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.NodeReference, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template2.hide(True)
hou_parm_template2.setTags({"opfilter": "!!SOP!!", "oprelative": "."})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("roll", "Roll", 1, default_value=([0]), min=-360, max=360, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Angle, naming_scheme=hou.parmNamingScheme.Base1)
hou_parm_template2.hide(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("pos", "Position", 1, default_value=([0]), min=0, max=10, min_is_strict=True, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.Base1)
hou_parm_template2.hide(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.MenuParmTemplate("uparmtype", "Parameterization", menu_items=(["uniform","arc"]), menu_labels=(["Uniform","Arc Length"]), default_value=1, icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False, is_button_strip=False, strip_uses_icons=False)
hou_parm_template2.hide(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.IntParmTemplate("pathorient", "Orient Along Path", 1, default_value=([1]), min=0, max=1, min_is_strict=False, max_is_strict=False, naming_scheme=hou.parmNamingScheme.Base1, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False)
hou_parm_template2.hide(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("up", "Orient Up Vector", 3, default_value=([0, 1, 0]), min=0, max=10, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Vector, naming_scheme=hou.parmNamingScheme.XYZW)
hou_parm_template2.hide(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("bank", "Auto-Bank factor", 1, default_value=([0]), min=-1, max=1, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.Base1)
hou_parm_template2.hide(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
hou_parm_template_group.append(hou_parm_template)
# Code for parameter template
hou_parm_template = hou.FolderParmTemplate("stdswitcher4_1", "Render", folder_type=hou.folderType.Tabs, default_value=0, ends_tab_group=False)
# Code for parameter template
hou_parm_template2 = hou.StringParmTemplate("shop_materialpath", "Material", 1, default_value=([""]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.NodeReference, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template2.setTags({"opfilter": "!!CUSTOM/MATERIAL!!", "oprelative": "."})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.MenuParmTemplate("shop_materialopts", "Options", menu_items=([]), menu_labels=([]), default_value=0, icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Mini, menu_use_token=False, is_button_strip=False, strip_uses_icons=False)
hou_parm_template2.hide(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.ToggleParmTemplate("tdisplay", "Display", default_value=False)
hou_parm_template2.setJoinWithNext(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.IntParmTemplate("display", "Display", 1, default_value=([1]), min=0, max=1, min_is_strict=True, max_is_strict=True, naming_scheme=hou.parmNamingScheme.Base1, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.MenuParmTemplate("viewportlod", "Display As", menu_items=(["full","points","box","centroid","hidden","subd"]), menu_labels=(["Full Geometry","Point Cloud","Bounding Box","Centroid","Hidden","Subdivision Surface / Curves"]), default_value=0, icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False, is_button_strip=False, strip_uses_icons=False)
hou_parm_template2.setHelp("Choose how the object's geometry should be rendered in the viewport")
hou_parm_template2.setTags({"spare_category": "Render"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.StringParmTemplate("vm_rendervisibility", "Render Visibility", 1, default_value=(["*"]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.Regular, menu_items=(["*","primary","primary|shadow","-primary","-diffuse","-diffuse&-reflect&-refract",""]), menu_labels=(["Visible to all","Visible only to primary rays","Visible only to primary and shadow rays","Invisible to primary rays (Phantom)","Invisible to diffuse rays","Invisible to secondary rays","Invisible (Unrenderable)"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.StringReplace)
hou_parm_template2.setTags({"mantra_class": "object", "mantra_name": "rendervisibility", "spare_category": "Render"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.ToggleParmTemplate("vm_rendersubd", "Render Polygons As Subdivision (Mantra)", default_value=False)
hou_parm_template2.setTags({"mantra_class": "object", "mantra_name": "rendersubd", "spare_category": "Geometry"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.StringParmTemplate("vm_subdstyle", "Subdivision Style", 1, default_value=(["mantra_catclark"]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.Regular, menu_items=(["mantra_catclark","osd_catclark"]), menu_labels=(["Mantra Catmull-Clark","OpenSubdiv Catmull-Clark"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template2.setConditional(hou.parmCondType.HideWhen, "{ vm_rendersubd == 0 }")
hou_parm_template2.setTags({"mantra_class": "object", "mantra_name": "subdstyle", "spare_category": "Geometry"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.StringParmTemplate("vm_subdgroup", "Subdivision Group", 1, default_value=([""]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.Regular, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template2.setConditional(hou.parmCondType.HideWhen, "{ vm_rendersubd == 0 }")
hou_parm_template2.setTags({"mantra_class": "object", "mantra_name": "subdgroup", "spare_category": "Geometry"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("vm_osd_quality", "Open Subdiv Quality", 1, default_value=([1]), min=0, max=10, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.Base1)
hou_parm_template2.setConditional(hou.parmCondType.HideWhen, "{ vm_rendersubd == 0 vm_subdstyle != osd_catclark }")
hou_parm_template2.setTags({"mantra_class": "object", "mantra_name": "osd_quality", "spare_category": "Geometry"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.IntParmTemplate("vm_osd_vtxinterp", "OSD Vtx Interp", 1, default_value=([2]), min=0, max=10, min_is_strict=False, max_is_strict=False, naming_scheme=hou.parmNamingScheme.Base1, menu_items=(["0","1","2"]), menu_labels=(["No vertex interpolation","Edges only","Edges and Corners"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False)
hou_parm_template2.setConditional(hou.parmCondType.HideWhen, "{ vm_rendersubd == 0 vm_subdstyle != osd_catclark }")
hou_parm_template2.setTags({"mantra_class": "object", "mantra_name": "osd_vtxinterp", "spare_category": "Geometry"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.IntParmTemplate("vm_osd_fvarinterp", "OSD FVar Interp", 1, default_value=([4]), min=0, max=10, min_is_strict=False, max_is_strict=False, naming_scheme=hou.parmNamingScheme.Base1, menu_items=(["0","1","2","3","4","5"]), menu_labels=(["Smooth everywhere","Sharpen corners only","Sharpen edges and corners","Sharpen edges and propagated corners","Sharpen all boundaries","Bilinear interpolation"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False)
hou_parm_template2.setConditional(hou.parmCondType.HideWhen, "{ vm_rendersubd == 0 vm_subdstyle != osd_catclark }")
hou_parm_template2.setTags({"mantra_class": "object", "mantra_name": "osd_fvarinterp", "spare_category": "Geometry"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FolderParmTemplate("folder0", "Shading", folder_type=hou.folderType.Tabs, default_value=0, ends_tab_group=False)
# Code for parameter template
hou_parm_template3 = hou.StringParmTemplate("categories", "Categories", 1, default_value=([""]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.Regular, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template3.setHelp("A list of tags which can be used to select the object")
hou_parm_template3.setTags({"spare_category": "Shading"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.StringParmTemplate("reflectmask", "Reflection Mask", 1, default_value=(["*"]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.NodeReferenceList, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template3.setHelp("Objects that will be reflected on this object.")
hou_parm_template3.setTags({"opexpand": "1", "opfilter": "!!OBJ/GEOMETRY!!", "oprelative": "/obj", "spare_category": "Shading"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.StringParmTemplate("refractmask", "Refraction Mask", 1, default_value=(["*"]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.NodeReferenceList, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template3.setHelp("Objects that will be refracted on this object.")
hou_parm_template3.setTags({"opexpand": "1", "opfilter": "!!OBJ/GEOMETRY!!", "oprelative": "/obj", "spare_category": "Shading"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.StringParmTemplate("lightmask", "Light Mask", 1, default_value=(["*"]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.NodeReferenceList, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template3.setHelp("Lights that illuminate this object.")
hou_parm_template3.setTags({"opexpand": "1", "opfilter": "!!OBJ/LIGHT!!", "oprelative": "/obj", "spare_category": "Shading"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.StringParmTemplate("lightcategories", "Light Selection", 1, default_value=(["*"]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.Regular, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template3.setTags({"spare_category": "Shading"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.StringParmTemplate("vm_lpetag", "LPE Tag", 1, default_value=([""]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.Regular, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "lpetag", "spare_category": "Shading"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.StringParmTemplate("vm_volumefilter", "Volume Filter", 1, default_value=(["box"]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.Regular, menu_items=(["box","gaussian","bartlett","catrom","hanning","blackman","sinc"]), menu_labels=(["Box Filter","Gaussian","Bartlett (triangle)","Catmull-Rom","Hanning","Blackman","Sinc (sharpening)"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "filter", "spare_category": "Shading"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.FloatParmTemplate("vm_volumefilterwidth", "Volume Filter Width", 1, default_value=([1]), min=0.001, max=5, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.Base1)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "filterwidth", "spare_category": "Shading"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_matte", "Matte shading", default_value=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "matte", "spare_category": "Shading"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_rayshade", "Raytrace Shading", default_value=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "rayshade", "spare_category": "Shading"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FolderParmTemplate("folder0_1", "Sampling", folder_type=hou.folderType.Tabs, default_value=0, ends_tab_group=False)
# Code for parameter template
hou_parm_template3 = hou.MenuParmTemplate("geo_velocityblur", "Geometry Velocity Blur", menu_items=(["off","on","accelblur"]), menu_labels=(["No Velocity Blur","Velocity Blur","Acceleration Blur"]), default_value=0, icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False, is_button_strip=False, strip_uses_icons=False)
hou_parm_template3.setConditional(hou.parmCondType.DisableWhen, "{ allowmotionblur == 0 }")
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.StringParmTemplate("geo_accelattribute", "Acceleration Attribute", 1, default_value=(["accel"]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.Regular, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template3.setConditional(hou.parmCondType.HideWhen, "{ geo_velocityblur != accelblur }")
hou_parm_template3.setTags({"spare_category": "Sampling"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FolderParmTemplate("folder0_2", "Dicing", folder_type=hou.folderType.Tabs, default_value=0, ends_tab_group=False)
# Code for parameter template
hou_parm_template3 = hou.FloatParmTemplate("vm_shadingquality", "Shading Quality", 1, default_value=([1]), min=0, max=10, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.Base1)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "shadingquality", "spare_category": "Dicing"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.FloatParmTemplate("vm_flatness", "Dicing Flatness", 1, default_value=([0.05]), min=0, max=1, min_is_strict=False, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.Base1)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "flatness", "spare_category": "Dicing"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.IntParmTemplate("vm_raypredice", "Ray Predicing", 1, default_value=([0]), min=0, max=10, min_is_strict=False, max_is_strict=False, naming_scheme=hou.parmNamingScheme.Base1, menu_items=(["0","1","2"]), menu_labels=(["Disable Predicing","Full Predicing","Precompute Bounds"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "raypredice", "spare_category": "Dicing"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_curvesurface", "Shade Curves As Surfaces", default_value=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "curvesurface", "spare_category": "Dicing"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FolderParmTemplate("folder0_3", "Geometry", folder_type=hou.folderType.Tabs, default_value=0, ends_tab_group=False)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_rmbackface", "Backface Removal", default_value=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "rmbackface", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.StringParmTemplate("shop_geometrypath", "Procedural Shader", 1, default_value=([""]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.NodeReference, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template3.setTags({"opfilter": "!!SHOP/GEOMETRY!!", "oprelative": ".", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_forcegeometry", "Force Procedural Geometry Output", default_value=True)
hou_parm_template3.setTags({"spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_rendersubdcurves", "Render Polygon Curves As Subdivision (Mantra)", default_value=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "rendersubdcurves", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.IntParmTemplate("vm_renderpoints", "Render As Points (Mantra)", 1, default_value=([2]), min=0, max=10, min_is_strict=False, max_is_strict=False, naming_scheme=hou.parmNamingScheme.Base1, menu_items=(["0","1","2"]), menu_labels=(["No Point Rendering","Render Only Points","Render Unconnected Points"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "renderpoints", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.IntParmTemplate("vm_renderpointsas", "Render Points As (Mantra)", 1, default_value=([0]), min=0, max=10, min_is_strict=False, max_is_strict=False, naming_scheme=hou.parmNamingScheme.Base1, menu_items=(["0","1"]), menu_labels=(["Spheres","Circles"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False)
hou_parm_template3.setConditional(hou.parmCondType.DisableWhen, "{ vm_renderpoints == 0 }")
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "renderpointsas", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_usenforpoints", "Use N For Point Rendering", default_value=False)
hou_parm_template3.setConditional(hou.parmCondType.DisableWhen, "{ vm_renderpoints == 0 }")
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "usenforpoints", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.FloatParmTemplate("vm_pointscale", "Point Scale", 1, default_value=([1]), min=0, max=10, min_is_strict=True, max_is_strict=False, look=hou.parmLook.Regular, naming_scheme=hou.parmNamingScheme.Base1)
hou_parm_template3.setConditional(hou.parmCondType.DisableWhen, "{ vm_renderpoints == 0 }")
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "pointscale", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_pscalediameter", "Treat Point Scale as Diameter Instead of Radius", default_value=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "pscalediameter", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_metavolume", "Metaballs as Volume", default_value=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "metavolume", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.IntParmTemplate("vm_coving", "Coving", 1, default_value=([1]), min=0, max=10, min_is_strict=False, max_is_strict=False, naming_scheme=hou.parmNamingScheme.Base1, menu_items=(["0","1","2"]), menu_labels=(["Disable Coving","Coving for displacement/sub-d","Coving for all primitives"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "coving", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.StringParmTemplate("vm_materialoverride", "Material Override", 1, default_value=(["compact"]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.Regular, menu_items=(["none","full","compact"]), menu_labels=(["Disabled","Evaluate for Each Primitive/Point","Evaluate Once"]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal)
hou_parm_template3.setTags({"spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_overridedetail", "Ignore Geometry Attribute Shaders", default_value=False)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "overridedetail", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
# Code for parameter template
hou_parm_template3 = hou.ToggleParmTemplate("vm_procuseroottransform", "Proc Use Root Transform", default_value=True)
hou_parm_template3.setTags({"mantra_class": "object", "mantra_name": "procuseroottransform", "spare_category": "Geometry"})
hou_parm_template2.addParmTemplate(hou_parm_template3)
hou_parm_template.addParmTemplate(hou_parm_template2)
hou_parm_template_group.append(hou_parm_template)
# Code for parameter template
hou_parm_template = hou.FolderParmTemplate("stdswitcher4_2", "Misc", folder_type=hou.folderType.Tabs, default_value=0, ends_tab_group=False)
# Code for parameter template
hou_parm_template2 = hou.ToggleParmTemplate("use_dcolor", "Set Wireframe Color", default_value=False)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.FloatParmTemplate("dcolor", "Wireframe Color", 3, default_value=([1, 1, 1]), min=0, max=1, min_is_strict=True, max_is_strict=True, look=hou.parmLook.ColorSquare, naming_scheme=hou.parmNamingScheme.RGBA)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.ToggleParmTemplate("picking", "Viewport Selecting Enabled", default_value=True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.StringParmTemplate("pickscript", "Select Script", 1, default_value=([""]), naming_scheme=hou.parmNamingScheme.Base1, string_type=hou.stringParmType.FileReference, file_type=hou.fileType.Any, menu_items=([]), menu_labels=([]), icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.StringReplace)
hou_parm_template2.setTags({"filechooser_mode": "read"})
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.ToggleParmTemplate("caching", "Cache Object Transform", default_value=True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.ToggleParmTemplate("vport_shadeopen", "Shade Open Curves In Viewport", default_value=False)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.ToggleParmTemplate("vport_displayassubdiv", "Display as Subdivision in Viewport", default_value=False)
hou_parm_template2.hide(True)
hou_parm_template.addParmTemplate(hou_parm_template2)
# Code for parameter template
hou_parm_template2 = hou.MenuParmTemplate("vport_onionskin", "Onion Skinning", menu_items=(["off","xform","on"]), menu_labels=(["Off","Transform only","Full Deformation"]), default_value=0, icon_names=([]), item_generator_script="", item_generator_script_language=hou.scriptLanguage.Python, menu_type=hou.menuType.Normal, menu_use_token=False, is_button_strip=False, strip_uses_icons=False)
hou_parm_template.addParmTemplate(hou_parm_template2)
hou_parm_template_group.append(hou_parm_template)
hou_node.setParmTemplateGroup(hou_parm_template_group)
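# The parameter blocks below each re-resolve their node by absolute path when
# "hou_node" is not already bound, so every block can also be run on its own.
# A minimal helper capturing that lookup pattern might look like this
# (hypothetical name, shown for illustration only; the generated code below
# does not call it):
def _resolve_node(path):
    """Return the node at `path`, failing loudly if it does not exist."""
    node = hou.node(path)
    if node is None:
        raise RuntimeError("node not found: %s" % path)
    return node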
# Code for /obj/grid1/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.setAutoscope((True, True, True))
# Code for /obj/grid1/r parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1")
hou_parm_tuple = hou_node.parmTuple("r")
hou_parm_tuple.setAutoscope((True, True, True))
# Code for /obj/grid1/s parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1")
hou_parm_tuple = hou_node.parmTuple("s")
hou_parm_tuple.setAutoscope((True, True, True))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Code for /obj/grid1/grid1
hou_node = hou_parent.createNode("grid", "grid1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-4.4917, 10.7011))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/grid1/size parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/grid1")
hou_parm_tuple = hou_node.parmTuple("size")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((1000, 1000))
# Code for /obj/grid1/grid1/rows parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/grid1")
hou_parm = hou_node.parm("rows")
hou_parm.deleteAllKeyframes()
hou_parm.set(50)
# Code for /obj/grid1/grid1/cols parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/grid1")
hou_parm = hou_node.parm("cols")
hou_parm.deleteAllKeyframes()
hou_parm.set(50)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/polyextrude1
hou_node = hou_parent.createNode("polyextrude::2.0", "polyextrude1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-4.4917, 9.57163))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(True)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/polyextrude1/group parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("group")
hou_parm.deleteAllKeyframes()
hou_parm.set("193-195 241-244 290-293 339-342 388-391 437-440 484-489 533-538 582-587 631-636 676-685 725-734 774-783 823-832 872-881 921-930 970-979 1020-1028 1069-1077 1118-1126 1167-1175 1216-1224 1265-1273 1314-1322 1363-1371 1412-1420 1461-1469 1511-1518 1560-1567 1609-1616 1658-1665 1707-1714 1756-1763 1805-1812 1857-1861 1906-1910 1955-1959 2004-2008 2053-2057 2105-2106 2153-2155 2202-2204 2253")
# Code for /obj/grid1/polyextrude1/dist parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("dist")
hou_parm.deleteAllKeyframes()
hou_parm.set(-380.98068237304688)
# Code for /obj/grid1/polyextrude1/divs parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("divs")
hou_parm.deleteAllKeyframes()
hou_parm.set(4)
# Code for /obj/grid1/polyextrude1/thicknessramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("thicknessramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude1/thicknessramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("thicknessramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude1/thicknessramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("thicknessramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude1/thicknessramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("thicknessramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude1/thicknessramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("thicknessramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude1/twistramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("twistramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude1/twistramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("twistramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude1/twistramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("twistramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude1/twistramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("twistramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude1/twistramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude1")
hou_parm = hou_node.parm("twistramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/blast1
hou_node = hou_parent.createNode("blast", "blast1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-4.4917, 8.31272))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(True)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/blast1/group parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/blast1")
hou_parm = hou_node.parm("group")
hou_parm.deleteAllKeyframes()
hou_parm.set("2404-2446 2512-2554 2620-2662 2728-2770")
# Code for /obj/grid1/blast1/grouptype parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/blast1")
hou_parm = hou_node.parm("grouptype")
hou_parm.deleteAllKeyframes()
hou_parm.set("prims")
# Code for /obj/grid1/blast1/removegrp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/blast1")
hou_parm = hou_node.parm("removegrp")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___toolid___", "generic_delete")
hou_node.setUserData("___Version___", "19.0.455")
hou_node.setUserData("___toolcount___", "2")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/mountain1
hou_node = hou_parent.createNode("attribnoise::2.0", "mountain1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-4.4917, 7.30087))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(True)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/mountain1/group parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("group")
hou_parm.deleteAllKeyframes()
hou_parm.set("0-42 46-48 50-92 96-98 100-142 150-192 200-225 227-230 234-242 247-272 287-289 293-318 339-364 385-410 431-456 477-502 521-546 565-590 609-634 653-678 693-712 714-718 733-752 773-793 813-833 853-873 893-913 934-954 975-995 1016-1036 1057-1077 1098-1118 1139-1159 1180-1200 1221-1241 1262-1282 1303-1323 1345-1365 1387-1407 1429-1449 1471-1491 1494-1496 1513-1538 1555-1580 1597-1622 1627-1628 1642-1667 1669-1673 1687-1712 1714-1718 1732-1757 1759-1763 1777-1802 1804-1808 1822-1847 1849-1856 1870-1904 1912 1918-1952 1955-1960 1965-2007 2014-2056 2064-2106 2110-2112 2114-2213")
# Code for /obj/grid1/mountain1/attribs parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("attribs")
hou_parm.deleteAllKeyframes()
hou_parm.set("P")
# Code for /obj/grid1/mountain1/displace parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("displace")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/mountain1/amplitude parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("amplitude")
hou_parm.deleteAllKeyframes()
hou_parm.set(400)
# Code for /obj/grid1/mountain1/enableremap parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("enableremap")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/mountain1/remapramp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("remapramp")
hou_parm.deleteAllKeyframes()
hou_parm.set(3)
# Code for /obj/grid1/mountain1/elementsize parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("elementsize")
hou_parm.deleteAllKeyframes()
hou_parm.set(800)
# Code for /obj/grid1/mountain1/offset parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("offset")
hou_parm.deleteAllKeyframes()
hou_parm.set(28.8)
# Code for /obj/grid1/mountain1/folder6 parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("folder6")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/mountain1/folder4 parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("folder4")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/mountain1/fractal parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("fractal")
hou_parm.deleteAllKeyframes()
hou_parm.set("hmfT")
# Code for /obj/grid1/mountain1/oct parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("oct")
hou_parm.deleteAllKeyframes()
hou_parm.set(8)
# Code for /obj/grid1/mountain1/rough parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("rough")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.4)
# Code for /obj/grid1/mountain1/folder2 parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("folder2")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/mountain1/remapramp1pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("remapramp1pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.41952788829803467)
# Code for /obj/grid1/mountain1/remapramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("remapramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.76287555694580078)
# Code for /obj/grid1/mountain1/remapramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("remapramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.47540983557701111)
# Code for /obj/grid1/mountain1/remapramp3pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("remapramp3pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/mountain1/remapramp3value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/mountain1")
hou_parm = hou_node.parm("remapramp3value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___toolid___", "geometry_mountain")
hou_node.setUserData("___Version___", "")
hou_node.setUserData("___toolcount___", "3")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("")
# Code for /obj/grid1/polyextrude2
hou_node = hou_parent.createNode("polyextrude::2.0", "polyextrude2", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-4.4917, 6.40667))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(True)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/polyextrude2/group parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("group")
hou_parm.deleteAllKeyframes()
hou_parm.set("575-576 618-619 657-658 852-853 891-892 931-932 1131-1132 1171-1172 1211-1212 1414-1415 1455-1456 1496-1497")
# Code for /obj/grid1/polyextrude2/dist parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("dist")
hou_parm.deleteAllKeyframes()
hou_parm.set(115.19532012939453)
# Code for /obj/grid1/polyextrude2/thicknessramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("thicknessramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude2/thicknessramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("thicknessramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude2/thicknessramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("thicknessramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude2/thicknessramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("thicknessramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude2/thicknessramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("thicknessramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude2/twistramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("twistramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude2/twistramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("twistramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude2/twistramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("twistramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude2/twistramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("twistramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude2/twistramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude2")
hou_parm = hou_node.parm("twistramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/polyextrude3
hou_node = hou_parent.createNode("polyextrude::2.0", "polyextrude3", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-4.4917, 5.51247))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(True)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/polyextrude3/group parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("group")
hou_parm.deleteAllKeyframes()
hou_parm.set("576-582 617-623 654-660 847-853 884-890 922-928 1120-1127 1158-1165 1196-1203 1397-1399 1436-1438 1475-1477")
# Code for /obj/grid1/polyextrude3/dist parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("dist")
hou_parm.deleteAllKeyframes()
hou_parm.set(19.667102813720703)
# Code for /obj/grid1/polyextrude3/thicknessramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("thicknessramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude3/thicknessramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("thicknessramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude3/thicknessramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("thicknessramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude3/thicknessramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("thicknessramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude3/thicknessramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("thicknessramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude3/twistramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("twistramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude3/twistramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("twistramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude3/twistramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("twistramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude3/twistramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("twistramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude3/twistramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude3")
hou_parm = hou_node.parm("twistramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/merge1
hou_node = hou_parent.createNode("merge", "merge1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-1.07079, 4.03993))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/tube1
hou_node = hou_parent.createNode("tube", "tube1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-1.06964, 7.40659))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/tube1/type parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/tube1")
hou_parm = hou_node.parm("type")
hou_parm.deleteAllKeyframes()
hou_parm.set("poly")
# Code for /obj/grid1/tube1/orient parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/tube1")
hou_parm = hou_node.parm("orient")
hou_parm.deleteAllKeyframes()
hou_parm.set("x")
# Code for /obj/grid1/tube1/rad parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/tube1")
hou_parm_tuple = hou_node.parmTuple("rad")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((6, 6))
# Code for /obj/grid1/tube1/height parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/tube1")
hou_parm = hou_node.parm("height")
hou_parm.deleteAllKeyframes()
hou_parm.set(100)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform1
hou_node = hou_parent.createNode("xform", "transform1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-1.06964, 6.45364))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform1/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform1")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((159.43642616271973, 9.4653358459472656, 225.41490364074707))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/copy1
hou_node = hou_parent.createNode("copyxform", "copy1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-1.06964, 5.51247))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/copy1/ncy parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/copy1")
hou_parm = hou_node.parm("ncy")
hou_parm.deleteAllKeyframes()
hou_parm.set(4)
# Code for /obj/grid1/copy1/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/copy1")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, 0, -14))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/copy2
hou_node = hou_parent.createNode("copyxform", "copy2", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(1.19047, 6.53713))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/copy2/ncy parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/copy2")
hou_parm = hou_node.parm("ncy")
hou_parm.deleteAllKeyframes()
hou_parm.set(4)
# Code for /obj/grid1/copy2/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/copy2")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, 0, -14))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform2
hou_node = hou_parent.createNode("xform", "transform2", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(1.18274, 9.24511))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform2/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform2")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((223.3526554107666, 9.4653358459472656, 81.356912612915039))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/tube2
hou_node = hou_parent.createNode("tube", "tube2", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(1.18274, 10.1981))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/tube2/type parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/tube2")
hou_parm = hou_node.parm("type")
hou_parm.deleteAllKeyframes()
hou_parm.set("poly")
# Code for /obj/grid1/tube2/orient parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/tube2")
hou_parm = hou_node.parm("orient")
hou_parm.deleteAllKeyframes()
hou_parm.set("x")
# Code for /obj/grid1/tube2/rad parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/tube2")
hou_parm_tuple = hou_node.parmTuple("rad")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((6, 6))
# Code for /obj/grid1/tube2/height parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/tube2")
hou_parm = hou_node.parm("height")
hou_parm.deleteAllKeyframes()
hou_parm.set(250)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/copy3
hou_node = hou_parent.createNode("copyxform", "copy3", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(1.19047, 5.51247))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/copy3/ncy parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/copy3")
hou_parm = hou_node.parm("ncy")
hou_parm.deleteAllKeyframes()
hou_parm.set(3)
# Code for /obj/grid1/copy3/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/copy3")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, 0, -142))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform3
hou_node = hou_parent.createNode("xform", "transform3", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(2.72432, 8.35673))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform3/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform3")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((335.09628677368164, 112.36490249633789, 0))
# Code for /obj/grid1/transform3/r parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform3")
hou_parm_tuple = hou_node.parmTuple("r")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, 0, -90))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/merge2
hou_node = hou_parent.createNode("merge", "merge2", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(1.18932, 7.4928))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/blast2
hou_node = hou_parent.createNode("blast", "blast2", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-1.06964, 2.66436))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(True)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/blast2/group parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/blast2")
hou_parm = hou_node.parm("group")
hou_parm.deleteAllKeyframes()
hou_parm.set("2835-2846 2859-2870 2883-2894 2907-2918")
# Code for /obj/grid1/blast2/grouptype parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/blast2")
hou_parm = hou_node.parm("grouptype")
hou_parm.deleteAllKeyframes()
hou_parm.set("prims")
# Code for /obj/grid1/blast2/removegrp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/blast2")
hou_parm = hou_node.parm("removegrp")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___toolid___", "generic_delete")
hou_node.setUserData("___Version___", "19.0.455")
hou_node.setUserData("___toolcount___", "6")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/edit1
hou_node = hou_parent.createNode("edit", "edit1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-1.06964, 1.77016))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(True)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/edit1/group parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/edit1")
hou_parm = hou_node.parm("group")
hou_parm.deleteAllKeyframes()
hou_parm.set("2672-2675 2680-2683 2688-2691")
# Code for /obj/grid1/edit1/grouptype parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/edit1")
hou_parm = hou_node.parm("grouptype")
hou_parm.deleteAllKeyframes()
hou_parm.set("prims")
# Code for /obj/grid1/edit1/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/edit1")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, -13.604167938232422, 0))
# Code for /obj/grid1/edit1/p parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/edit1")
hou_parm_tuple = hou_node.parmTuple("p")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((255.10202026367188, 19.667102813720703, 61.2244873046875))
# Code for /obj/grid1/edit1/leadislandhint parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/edit1")
hou_parm = hou_node.parm("leadislandhint")
hou_parm.deleteAllKeyframes()
hou_parm.set("2682")
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/box1
hou_node = hou_parent.createNode("box", "box1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-8.30133, 2.07566))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/box1/type parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box1")
hou_parm = hou_node.parm("type")
hou_parm.deleteAllKeyframes()
hou_parm.set("polymesh")
# Code for /obj/grid1/box1/size parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box1")
hou_parm_tuple = hou_node.parmTuple("size")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((60.700000000000003, 121, 100))
# Code for /obj/grid1/box1/divrate parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box1")
hou_parm_tuple = hou_node.parmTuple("divrate")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((2, 2, 2))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/merge3
hou_node = hou_parent.createNode("merge", "merge3", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-2.65957, -7.89169))
hou_node.bypass(False)
hou_node.setDisplayFlag(True)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(True)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform4
hou_node = hou_parent.createNode("xform", "transform4", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-8.30133, 1.31566))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform4/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform4")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((92.115993499755859, 58.824935913085938, 56.311542510986328))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/bound1
hou_node = hou_parent.createNode("bound", "bound1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-8.30133, 0.502855))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/bound1/dodivs parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/bound1")
hou_parm = hou_node.parm("dodivs")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/bound1/divs parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/bound1")
hou_parm_tuple = hou_node.parmTuple("divs")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((8, 11, 10))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/bound2
hou_node = hou_parent.createNode("bound", "bound2", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-6.17013, 0.502855))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/bound2/dodivs parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/bound2")
hou_parm = hou_node.parm("dodivs")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/bound2/divs parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/bound2")
hou_parm_tuple = hou_node.parmTuple("divs")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((8, 11, 10))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform5
hou_node = hou_parent.createNode("xform", "transform5", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-6.17013, 1.31566))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform5/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform5")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((252.48447799682617, 58.824935913085938, 56.311542510986328))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/box2
hou_node = hou_parent.createNode("box", "box2", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-6.17013, 2.07566))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/box2/type parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box2")
hou_parm = hou_node.parm("type")
hou_parm.deleteAllKeyframes()
hou_parm.set("polymesh")
# Code for /obj/grid1/box2/size parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box2")
hou_parm_tuple = hou_node.parmTuple("size")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((174.24290084838867, 27.417694091796875, 100))
# Code for /obj/grid1/box2/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box2")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((7.6534442901611328, -46.791152954101562, 0))
# Code for /obj/grid1/box2/divrate parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box2")
hou_parm_tuple = hou_node.parmTuple("divrate")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((2, 2, 2))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/bound3
hou_node = hou_parent.createNode("bound", "bound3", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-11.035, 0.502855))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/bound3/dodivs parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/bound3")
hou_parm = hou_node.parm("dodivs")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/bound3/divs parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/bound3")
hou_parm_tuple = hou_node.parmTuple("divs")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((8, 11, 10))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform6
hou_node = hou_parent.createNode("xform", "transform6", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-11.035, 1.31566))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform6/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform6")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((92.115993499755859, 58.824935913085938, 198.93928146362305))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/box3
hou_node = hou_parent.createNode("box", "box3", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-11.035, 2.07566))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/box3/type parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box3")
hou_parm = hou_node.parm("type")
hou_parm.deleteAllKeyframes()
hou_parm.set("polymesh")
# Code for /obj/grid1/box3/size parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box3")
hou_parm_tuple = hou_node.parmTuple("size")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((167.83417701721191, 40.406349182128906, 100))
# Code for /obj/grid1/box3/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box3")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((53.56708812713623, -40.296825408935547, 0))
# Code for /obj/grid1/box3/divrate parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box3")
hou_parm_tuple = hou_node.parmTuple("divrate")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((2, 2, 2))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/bound4
hou_node = hou_parent.createNode("bound", "bound4", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(4.32202, -0.0908987))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/bound4/dodivs parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/bound4")
hou_parm = hou_node.parm("dodivs")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/bound4/divs parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/bound4")
hou_parm_tuple = hou_node.parmTuple("divs")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((8, 11, 10))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform7
hou_node = hou_parent.createNode("xform", "transform7", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(4.32202, 0.721901))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform7/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform7")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((382.27402877807617, -78.555267333984375, 57.187503814697266))
# Code for /obj/grid1/transform7/r parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform7")
hou_parm_tuple = hou_node.parmTuple("r")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, 0, -90))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/box4
hou_node = hou_parent.createNode("box", "box4", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(4.32202, 1.4819))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/box4/type parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box4")
hou_parm = hou_node.parm("type")
hou_parm.deleteAllKeyframes()
hou_parm.set("polymesh")
# Code for /obj/grid1/box4/size parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box4")
hou_parm_tuple = hou_node.parmTuple("size")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((174.24290084838867, 27.417694091796875, 100))
# Code for /obj/grid1/box4/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box4")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((7.6534442901611328, -46.791152954101562, 0))
# Code for /obj/grid1/box4/divrate parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box4")
hou_parm_tuple = hou_node.parmTuple("divrate")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((2, 2, 2))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/box5
hou_node = hou_parent.createNode("box", "box5", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-21.2912, 0.0698124))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/box5/type parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box5")
hou_parm = hou_node.parm("type")
hou_parm.deleteAllKeyframes()
hou_parm.set("polymesh")
# Code for /obj/grid1/box5/size parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box5")
hou_parm_tuple = hou_node.parmTuple("size")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((1, 12.146723747253418, 1))
# Code for /obj/grid1/box5/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box5")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, 5.573361873626709, 0))
# Code for /obj/grid1/box5/divrate parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/box5")
hou_parm_tuple = hou_node.parmTuple("divrate")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((2, 2, 2))
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/polyextrude4
hou_node = hou_parent.createNode("polyextrude::2.0", "polyextrude4", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-21.2912, -0.824392))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(True)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/polyextrude4/group parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("group")
hou_parm.deleteAllKeyframes()
hou_parm.set("2")
# Code for /obj/grid1/polyextrude4/dist parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("dist")
hou_parm.deleteAllKeyframes()
hou_parm.set(1.6808929443359375)
# Code for /obj/grid1/polyextrude4/thicknessramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("thicknessramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude4/thicknessramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("thicknessramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude4/thicknessramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("thicknessramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude4/thicknessramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("thicknessramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude4/thicknessramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("thicknessramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude4/twistramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("twistramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude4/twistramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("twistramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude4/twistramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("twistramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude4/twistramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("twistramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude4/twistramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude4")
hou_parm = hou_node.parm("twistramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/polyextrude5
hou_node = hou_parent.createNode("polyextrude::2.0", "polyextrude5", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-21.2912, -1.71859))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(True)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/polyextrude5/group parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("group")
hou_parm.deleteAllKeyframes()
hou_parm.set("7")
# Code for /obj/grid1/polyextrude5/dist parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("dist")
hou_parm.deleteAllKeyframes()
hou_parm.set(11.1996089220047)
# Code for /obj/grid1/polyextrude5/inset parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("inset")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.33400000000000002)
# Code for /obj/grid1/polyextrude5/thicknessramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("thicknessramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude5/thicknessramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("thicknessramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude5/thicknessramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("thicknessramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude5/thicknessramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("thicknessramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude5/thicknessramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("thicknessramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude5/twistramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("twistramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude5/twistramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("twistramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude5/twistramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("twistramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude5/twistramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("twistramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude5/twistramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude5")
hou_parm = hou_node.parm("twistramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/polyextrude6
hou_node = hou_parent.createNode("polyextrude::2.0", "polyextrude6", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-21.2912, -2.61279))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(True)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/polyextrude6/group parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("group")
hou_parm.deleteAllKeyframes()
hou_parm.set("8")
# Code for /obj/grid1/polyextrude6/dist parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("dist")
hou_parm.deleteAllKeyframes()
hou_parm.set(2.5108861923217773)
# Code for /obj/grid1/polyextrude6/inset parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("inset")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.28399999999999997)
# Code for /obj/grid1/polyextrude6/thicknessramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("thicknessramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude6/thicknessramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("thicknessramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude6/thicknessramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("thicknessramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude6/thicknessramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("thicknessramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude6/thicknessramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("thicknessramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude6/twistramp1value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("twistramp1value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude6/twistramp1interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("twistramp1interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
# Code for /obj/grid1/polyextrude6/twistramp2pos parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("twistramp2pos")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
# Code for /obj/grid1/polyextrude6/twistramp2value parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("twistramp2value")
hou_parm.deleteAllKeyframes()
hou_parm.set(0.5)
# Code for /obj/grid1/polyextrude6/twistramp2interp parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polyextrude6")
hou_parm = hou_node.parm("twistramp2interp")
hou_parm.deleteAllKeyframes()
hou_parm.set("catmull-rom")
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform8
hou_node = hou_parent.createNode("xform", "transform8", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-21.2912, -3.80234))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform8/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform8")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, 0, 234.0345516204834))
# Code for /obj/grid1/transform8/scale parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform8")
hou_parm = hou_node.parm("scale")
hou_parm.deleteAllKeyframes()
hou_parm.set(5.0099999999999998)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform9
hou_node = hou_parent.createNode("xform", "transform9", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-18.0149, -3.80234))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform9/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform9")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((100.62657833099365, 0, 142.61710166931152))
# Code for /obj/grid1/transform9/r parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform9")
hou_parm_tuple = hou_node.parmTuple("r")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, -39.206699999999998, 0))
# Code for /obj/grid1/transform9/scale parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform9")
hou_parm = hou_node.parm("scale")
hou_parm.deleteAllKeyframes()
hou_parm.set(7.0499999999999998)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform10
hou_node = hou_parent.createNode("xform", "transform10", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-15.181, -3.80234))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform10/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform10")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((321.594069480896, 0, 120.57526206970215))
# Code for /obj/grid1/transform10/r parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform10")
hou_parm_tuple = hou_node.parmTuple("r")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, 52.791600000000003, 0))
# Code for /obj/grid1/transform10/scale parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform10")
hou_parm = hou_node.parm("scale")
hou_parm.deleteAllKeyframes()
hou_parm.set(3.8799999999999999)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/transform11
hou_node = hou_parent.createNode("xform", "transform11", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-12.7273, -3.80234))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/transform11/t parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform11")
hou_parm_tuple = hou_node.parmTuple("t")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((307.20078945159912, 0, -36.410196304321289))
# Code for /obj/grid1/transform11/r parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform11")
hou_parm_tuple = hou_node.parmTuple("r")
hou_parm_tuple.deleteAllKeyframes()
hou_parm_tuple.set((0, -49.000218893225835, 0))
# Code for /obj/grid1/transform11/scale parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/transform11")
hou_parm = hou_node.parm("scale")
hou_parm.deleteAllKeyframes()
hou_parm.set(4.8799999999999999)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/polywire1
hou_node = hou_parent.createNode("polywire", "polywire1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-11.035, -0.843247))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/polywire1/div parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polywire1")
hou_parm = hou_node.parm("div")
hou_parm.deleteAllKeyframes()
hou_parm.set(8)
# Code for /obj/grid1/polywire1/segscale parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polywire1")
hou_parm_tuple = hou_node.parmTuple("segscale")
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 - 1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
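# The segscale keyframe code above repeats the same frame/expression setup for
# each tuple component. A helper of this shape could set each component once;
# `factory` is a stand-in for constructing a hou.Keyframe and applying
# setExpression to it, so the sketch also runs outside Houdini:
def _set_component_expressions(parm_tuple, expressions, factory, frame=1):
    """Attach one expression keyframe to each component of parm_tuple."""
    for parm, expression in zip(parm_tuple, expressions):
        # One keyframe per component, built by the injected factory.
        keyframe = factory(frame, expression)
        parm.setKeyframe(keyframe)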
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/polywire2
hou_node = hou_parent.createNode("polywire", "polywire2", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-8.06874, -1.0869))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/polywire2/div parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polywire2")
hou_parm = hou_node.parm("div")
hou_parm.deleteAllKeyframes()
hou_parm.set(8)
# Code for /obj/grid1/polywire2/segscale parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polywire2")
hou_parm_tuple = hou_node.parmTuple("segscale")
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 - 1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/polywire3
hou_node = hou_parent.createNode("polywire", "polywire3", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-5.99353, -1.32772))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/polywire3/div parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polywire3")
hou_parm = hou_node.parm("div")
hou_parm.deleteAllKeyframes()
hou_parm.set(8)
# Code for /obj/grid1/polywire3/segscale parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polywire3")
hou_parm_tuple = hou_node.parmTuple("segscale")
# Code for first keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for last keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for first keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 - 1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
# Code for last keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 - 1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 - 1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/polywire5
hou_node = hou_parent.createNode("polywire", "polywire5", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(4.09316, -1.58861))
hou_node.bypass(False)
hou_node.setDisplayFlag(False)
hou_node.hide(False)
hou_node.setHighlightFlag(False)
hou_node.setHardLocked(False)
hou_node.setSoftLocked(False)
hou_node.setSelectableTemplateFlag(False)
hou_node.setSelected(False)
hou_node.setRenderFlag(False)
hou_node.setTemplateFlag(False)
hou_node.setUnloadFlag(False)
# Code for /obj/grid1/polywire5/div parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polywire5")
hou_parm = hou_node.parm("div")
hou_parm.deleteAllKeyframes()
hou_parm.set(8)
# Code for /obj/grid1/polywire5/segscale parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/polywire5")
hou_parm_tuple = hou_node.parmTuple("segscale")
# Code for first keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for last keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for first keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 - 1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
# Code for last keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 - 1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("1.0 - 1.0 / $NSEG", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
hou_node.setUserData("___Version___", "19.0.455")
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code for /obj/grid1/rop_fbx1
hou_node = hou_parent.createNode("rop_fbx", "rop_fbx1", run_init_scripts=False, load_contents=True, exact_type_name=True)
hou_node.move(hou.Vector2(-2.65842, -9.10027))
hou_node.bypass(False)
hou_node.hide(False)
hou_node.setLocked(False)
hou_node.setSelected(False)
# Code for /obj/grid1/rop_fbx1/f parm tuple
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/rop_fbx1")
hou_parm_tuple = hou_node.parmTuple("f")
# Code for first keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("$FSTART", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for last keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("$FSTART", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("$FSTART", hou.exprLanguage.Hscript)
hou_parm_tuple[0].setKeyframe(hou_keyframe)
# Code for first keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("$FEND", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
# Code for last keyframe.
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("$FEND", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
# Code for keyframe.
hou_keyframe = hou.Keyframe()
hou_keyframe.setFrame(1)
hou_keyframe.setExpression("$FEND", hou.exprLanguage.Hscript)
hou_parm_tuple[1].setKeyframe(hou_keyframe)
# Code for /obj/grid1/rop_fbx1/sopoutput parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/rop_fbx1")
hou_parm = hou_node.parm("sopoutput")
hou_parm.deleteAllKeyframes()
hou_parm.set("$HIP/layout.fbx")
# Code for /obj/grid1/rop_fbx1/convertunits parm
if locals().get("hou_node") is None:
    hou_node = hou.node("/obj/grid1/rop_fbx1")
hou_parm = hou_node.parm("convertunits")
hou_parm.deleteAllKeyframes()
hou_parm.set(1)
hou_node.setExpressionLanguage(hou.exprLanguage.Hscript)
if hasattr(hou_node, "syncNodeVersionIfNeeded"):
    hou_node.syncNodeVersionIfNeeded("19.0.455")
# Update the parent node.
hou_parent = hou_node
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# Code to establish connections for /obj/grid1/polyextrude1
hou_node = hou_parent.node("polyextrude1")
if hou_parent.node("grid1") is not None:
    hou_node.setInput(0, hou_parent.node("grid1"), 0)
# Code to establish connections for /obj/grid1/blast1
hou_node = hou_parent.node("blast1")
if hou_parent.node("polyextrude1") is not None:
    hou_node.setInput(0, hou_parent.node("polyextrude1"), 0)
# Code to establish connections for /obj/grid1/mountain1
hou_node = hou_parent.node("mountain1")
if hou_parent.node("blast1") is not None:
    hou_node.setInput(0, hou_parent.node("blast1"), 0)
# Code to establish connections for /obj/grid1/polyextrude2
hou_node = hou_parent.node("polyextrude2")
if hou_parent.node("mountain1") is not None:
    hou_node.setInput(0, hou_parent.node("mountain1"), 0)
# Code to establish connections for /obj/grid1/polyextrude3
hou_node = hou_parent.node("polyextrude3")
if hou_parent.node("polyextrude2") is not None:
    hou_node.setInput(0, hou_parent.node("polyextrude2"), 0)
# Code to establish connections for /obj/grid1/merge1
hou_node = hou_parent.node("merge1")
if hou_parent.node("polyextrude3") is not None:
    hou_node.setInput(0, hou_parent.node("polyextrude3"), 0)
if hou_parent.node("copy1") is not None:
    hou_node.setInput(1, hou_parent.node("copy1"), 0)
if hou_parent.node("copy3") is not None:
    hou_node.setInput(2, hou_parent.node("copy3"), 0)
# Code to establish connections for /obj/grid1/transform1
hou_node = hou_parent.node("transform1")
if hou_parent.node("tube1") is not None:
    hou_node.setInput(0, hou_parent.node("tube1"), 0)
# Code to establish connections for /obj/grid1/copy1
hou_node = hou_parent.node("copy1")
if hou_parent.node("transform1") is not None:
    hou_node.setInput(0, hou_parent.node("transform1"), 0)
# Code to establish connections for /obj/grid1/copy2
hou_node = hou_parent.node("copy2")
if hou_parent.node("merge2") is not None:
    hou_node.setInput(0, hou_parent.node("merge2"), 0)
# Code to establish connections for /obj/grid1/transform2
hou_node = hou_parent.node("transform2")
if hou_parent.node("tube2") is not None:
    hou_node.setInput(0, hou_parent.node("tube2"), 0)
# Code to establish connections for /obj/grid1/copy3
hou_node = hou_parent.node("copy3")
if hou_parent.node("copy2") is not None:
    hou_node.setInput(0, hou_parent.node("copy2"), 0)
# Code to establish connections for /obj/grid1/transform3
hou_node = hou_parent.node("transform3")
if hou_parent.node("transform2") is not None:
    hou_node.setInput(0, hou_parent.node("transform2"), 0)
# Code to establish connections for /obj/grid1/merge2
hou_node = hou_parent.node("merge2")
if hou_parent.node("transform2") is not None:
    hou_node.setInput(0, hou_parent.node("transform2"), 0)
if hou_parent.node("transform3") is not None:
    hou_node.setInput(1, hou_parent.node("transform3"), 0)
# Code to establish connections for /obj/grid1/blast2
hou_node = hou_parent.node("blast2")
if hou_parent.node("merge1") is not None:
    hou_node.setInput(0, hou_parent.node("merge1"), 0)
# Code to establish connections for /obj/grid1/edit1
hou_node = hou_parent.node("edit1")
if hou_parent.node("blast2") is not None:
    hou_node.setInput(0, hou_parent.node("blast2"), 0)
# Code to establish connections for /obj/grid1/merge3
hou_node = hou_parent.node("merge3")
if hou_parent.node("polywire2") is not None:
    hou_node.setInput(0, hou_parent.node("polywire2"), 0)
if hou_parent.node("edit1") is not None:
    hou_node.setInput(1, hou_parent.node("edit1"), 0)
if hou_parent.node("polywire3") is not None:
    hou_node.setInput(2, hou_parent.node("polywire3"), 0)
if hou_parent.node("polywire1") is not None:
    hou_node.setInput(3, hou_parent.node("polywire1"), 0)
if hou_parent.node("polywire5") is not None:
    hou_node.setInput(4, hou_parent.node("polywire5"), 0)
if hou_parent.node("transform8") is not None:
    hou_node.setInput(5, hou_parent.node("transform8"), 0)
if hou_parent.node("transform9") is not None:
    hou_node.setInput(6, hou_parent.node("transform9"), 0)
if hou_parent.node("transform10") is not None:
    hou_node.setInput(7, hou_parent.node("transform10"), 0)
if hou_parent.node("transform11") is not None:
    hou_node.setInput(8, hou_parent.node("transform11"), 0)
# Code to establish connections for /obj/grid1/transform4
hou_node = hou_parent.node("transform4")
if hou_parent.node("box1") is not None:
    hou_node.setInput(0, hou_parent.node("box1"), 0)
# Code to establish connections for /obj/grid1/bound1
hou_node = hou_parent.node("bound1")
if hou_parent.node("transform4") is not None:
    hou_node.setInput(0, hou_parent.node("transform4"), 0)
# Code to establish connections for /obj/grid1/bound2
hou_node = hou_parent.node("bound2")
if hou_parent.node("transform5") is not None:
    hou_node.setInput(0, hou_parent.node("transform5"), 0)
# Code to establish connections for /obj/grid1/transform5
hou_node = hou_parent.node("transform5")
if hou_parent.node("box2") is not None:
    hou_node.setInput(0, hou_parent.node("box2"), 0)
# Code to establish connections for /obj/grid1/bound3
hou_node = hou_parent.node("bound3")
if hou_parent.node("transform6") is not None:
    hou_node.setInput(0, hou_parent.node("transform6"), 0)
# Code to establish connections for /obj/grid1/transform6
hou_node = hou_parent.node("transform6")
if hou_parent.node("box3") is not None:
    hou_node.setInput(0, hou_parent.node("box3"), 0)
# Code to establish connections for /obj/grid1/bound4
hou_node = hou_parent.node("bound4")
if hou_parent.node("transform7") is not None:
    hou_node.setInput(0, hou_parent.node("transform7"), 0)
# Code to establish connections for /obj/grid1/transform7
hou_node = hou_parent.node("transform7")
if hou_parent.node("box4") is not None:
    hou_node.setInput(0, hou_parent.node("box4"), 0)
# Code to establish connections for /obj/grid1/polyextrude4
hou_node = hou_parent.node("polyextrude4")
if hou_parent.node("box5") is not None:
    hou_node.setInput(0, hou_parent.node("box5"), 0)
# Code to establish connections for /obj/grid1/polyextrude5
hou_node = hou_parent.node("polyextrude5")
if hou_parent.node("polyextrude4") is not None:
    hou_node.setInput(0, hou_parent.node("polyextrude4"), 0)
# Code to establish connections for /obj/grid1/polyextrude6
hou_node = hou_parent.node("polyextrude6")
if hou_parent.node("polyextrude5") is not None:
    hou_node.setInput(0, hou_parent.node("polyextrude5"), 0)
# Code to establish connections for /obj/grid1/transform8
hou_node = hou_parent.node("transform8")
if hou_parent.node("polyextrude6") is not None:
    hou_node.setInput(0, hou_parent.node("polyextrude6"), 0)
# Code to establish connections for /obj/grid1/transform9
hou_node = hou_parent.node("transform9")
if hou_parent.node("polyextrude6") is not None:
    hou_node.setInput(0, hou_parent.node("polyextrude6"), 0)
# Code to establish connections for /obj/grid1/transform10
hou_node = hou_parent.node("transform10")
if hou_parent.node("polyextrude6") is not None:
    hou_node.setInput(0, hou_parent.node("polyextrude6"), 0)
# Code to establish connections for /obj/grid1/transform11
hou_node = hou_parent.node("transform11")
if hou_parent.node("polyextrude6") is not None:
    hou_node.setInput(0, hou_parent.node("polyextrude6"), 0)
# Code to establish connections for /obj/grid1/polywire1
hou_node = hou_parent.node("polywire1")
if hou_parent.node("bound3") is not None:
    hou_node.setInput(0, hou_parent.node("bound3"), 0)
# Code to establish connections for /obj/grid1/polywire2
hou_node = hou_parent.node("polywire2")
if hou_parent.node("bound1") is not None:
    hou_node.setInput(0, hou_parent.node("bound1"), 0)
# Code to establish connections for /obj/grid1/polywire3
hou_node = hou_parent.node("polywire3")
if hou_parent.node("bound2") is not None:
    hou_node.setInput(0, hou_parent.node("bound2"), 0)
# Code to establish connections for /obj/grid1/polywire5
hou_node = hou_parent.node("polywire5")
if hou_parent.node("bound4") is not None:
    hou_node.setInput(0, hou_parent.node("bound4"), 0)
# Code to establish connections for /obj/grid1/rop_fbx1
hou_node = hou_parent.node("rop_fbx1")
if hou_parent.node("merge3") is not None:
    hou_node.setInput(0, hou_parent.node("merge3"), 0)
# Restore the parent and current nodes.
hou_parent = hou_parent.parent()
hou_node = hou_node.parent()
# File: scripts/geo-parser/utils/__init__.py (repo: enaky/covid-visualizer, license: Apache-2.0)
from .web_utils import get_json_from_web
# File: store/migrations/0001_initial.py (repo: dayraliz99/gmeBox, license: Apache-2.0)
# Generated by Django 3.0.7 on 2020-09-28 15:10
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('people', '0001_initial'),
    ]

    operations = [
        migrations.CreateModel(
            name='Categoria',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('nombre', models.CharField(max_length=255, verbose_name='Nombre')),
                ('descripcion', models.TextField(verbose_name='Descripción')),
            ],
        ),
        migrations.CreateModel(
            name='Cliente',
            fields=[
                ('persona_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='people.Persona')),
            ],
            bases=('people.persona',),
        ),
        migrations.CreateModel(
            name='Compra',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('fechaCompra', models.DateField(verbose_name='Fecha de compra')),
                ('subtotal', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='SubTotal')),
                ('impuesto', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Impuesto')),
                ('total', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Total')),
                ('estado', models.CharField(choices=[('POR_PAGAR', 'Por pagar'), ('PAGADO', 'Pagado')], default='POR_PAGAR', max_length=50, verbose_name='Estado')),
            ],
        ),
        migrations.CreateModel(
            name='DetalleOrden',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('nombre_equipo', models.CharField(max_length=250, verbose_name='Nombre de equipo')),
                ('observacion', models.CharField(max_length=250, verbose_name='Observación')),
                ('estado', models.CharField(choices=[('NUEVO', 'Nuevo'), ('EN_REVISION', 'En revisión'), ('REVISADO', 'Revisado'), ('CONFIRMADO', 'Confirmado'), ('Arregado', 'Arreglado'), ('FINALIZADO', 'Finalizado')], default='NUEVO', max_length=50, verbose_name='Estado')),
                ('precio_servicio', models.DecimalField(decimal_places=2, default=0.0, max_digits=12, verbose_name='Precio de servicio')),
            ],
        ),
        migrations.CreateModel(
            name='Empresa',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('nombre', models.CharField(max_length=250, verbose_name='Nombre')),
                ('contacto', models.CharField(max_length=250, verbose_name='Contacto')),
                ('email', models.EmailField(max_length=254, unique=True, verbose_name='Correo electrónico')),
                ('telefono', models.CharField(blank=True, max_length=250, null=True, verbose_name='Teléfono')),
                ('celular', models.CharField(blank=True, max_length=250, null=True, verbose_name='Celular')),
                ('direccion', models.CharField(blank=True, max_length=250, null=True, verbose_name='Dirección')),
            ],
        ),
        migrations.CreateModel(
            name='Proveedor',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('nombre', models.CharField(max_length=250, verbose_name='Nombre')),
                ('contacto', models.CharField(max_length=250, verbose_name='Contacto')),
                ('email', models.EmailField(max_length=254, unique=True, verbose_name='Correo electrónico')),
                ('telefono', models.CharField(blank=True, max_length=250, null=True, verbose_name='Teléfono')),
                ('celular', models.CharField(blank=True, max_length=250, null=True, verbose_name='Celular')),
                ('direccion', models.CharField(blank=True, max_length=250, null=True, verbose_name='Dirección')),
            ],
        ),
        migrations.CreateModel(
            name='Tecnico',
            fields=[
                ('persona_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='people.Persona')),
                ('fecha_ingreso', models.DateField(verbose_name='Fecha de ingreso')),
            ],
            bases=('people.persona',),
        ),
        migrations.CreateModel(
            name='RevisionTecnica',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('fecha_revision', models.DateField(verbose_name='Fecha de revisión')),
                ('descripcion', models.CharField(max_length=250, verbose_name='Descripción')),
                ('detalle_orden', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='revisiones', to='store.DetalleOrden')),
                ('tecnico', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='revisiones', to='store.Tecnico')),
            ],
        ),
        migrations.CreateModel(
            name='Producto',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('nombre', models.CharField(max_length=255, verbose_name='Nombre')),
                ('cantidad', models.PositiveIntegerField(verbose_name='Cantidad')),
                ('descripcion', models.TextField(verbose_name='Descripción')),
                ('categoria', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='productos', to='store.Categoria')),
            ],
        ),
        migrations.CreateModel(
            name='Precio',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('valor', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Precio')),
                ('nombre', models.CharField(blank=True, max_length=255, null=True)),
                ('producto', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='precios', to='store.Producto')),
            ],
        ),
        migrations.CreateModel(
            name='OrdenMantenimiento',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('fecha_registro', models.DateField(auto_now_add=True, verbose_name='Fecha de registro')),
                ('descripcion', models.CharField(blank=True, max_length=250, null=True, verbose_name='Descripción')),
                ('estado', models.CharField(choices=[('NUEVO', 'Nuevo'), ('EN_REVISION', 'EN revisión'), ('REVISADO', 'Revisado'), ('FINALIZADO', 'Finalizado')], default='NUEVO', max_length=50, verbose_name='Estado')),
                ('monto_servicio', models.DecimalField(decimal_places=2, default=0.0, max_digits=12, verbose_name='Monto')),
                ('cliente', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ordenes', to='store.Cliente')),
                ('empresa', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ordenes', to='store.Empresa')),
            ],
        ),
        migrations.CreateModel(
            name='Impuesto',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('porcentaje', models.DecimalField(decimal_places=2, max_digits=2, verbose_name='Porcentaje')),
                ('nombre', models.CharField(blank=True, max_length=255, null=True)),
                ('producto', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='impuestos', to='store.Producto')),
            ],
        ),
        migrations.CreateModel(
            name='Factura',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('fechaVenta', models.DateField(verbose_name='Fecha de compra')),
                ('subtotal', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='SubTotal')),
                ('impuesto', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Impuesto')),
                ('total', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Total')),
                ('estado', models.CharField(choices=[('POR_PAGAR', 'Por pagar'), ('PAGADO', 'Pagado')], default='POR_PAGAR', max_length=50, verbose_name='Estado')),
                ('cliente', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='facturas', to='store.Cliente')),
                ('empresa', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='facturas', to='store.Empresa')),
            ],
        ),
        migrations.AddField(
            model_name='detalleorden',
            name='orden_mantenimiento',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='detalles', to='store.OrdenMantenimiento'),
        ),
        migrations.CreateModel(
            name='DetalleFactura',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('detalle', models.CharField(blank=True, max_length=255, null=True)),
                ('cantidad', models.PositiveIntegerField(verbose_name='Cantidad')),
                ('precioUnitario', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Precio Unitario')),
                ('impuesto', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Impuesto')),
                ('total', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Total')),
                ('factura', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='detalles', to='store.Factura')),
                ('producto', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ventas', to='store.Producto')),
            ],
        ),
        migrations.CreateModel(
            name='DetalleCompra',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('detalle', models.CharField(blank=True, max_length=255, null=True)),
                ('cantidad', models.PositiveIntegerField(verbose_name='Cantidad')),
                ('precioUnitario', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Precio Unitario')),
                ('impuesto', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Impuesto')),
                ('total', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Total')),
                ('compra', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='detalles', to='store.Compra')),
                ('producto', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='compras', to='store.Producto')),
            ],
        ),
        migrations.AddField(
            model_name='compra',
            name='proveedor',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='compras', to='store.Proveedor'),
        ),
        migrations.AddField(
            model_name='categoria',
            name='empresa',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='categorias', to='store.Empresa'),
        ),
    ]